Posts

Showing posts from 2017

.NET Core or .NET Framework together with Docker and Containers on Azure

In this post, I will try to provide answers and guidance for technical people who need to decide whether to use .NET Core or the .NET Framework inside Docker containers on Azure. The same answer applies on-premises, because Docker runs the same way on Azure and on-premises.

Microsoft's direction on this topic is clear: the default option should be .NET Core. .NET Core is designed in a way that aligns with container concepts; for example, its footprint was reduced drastically in comparison with the .NET Framework.

One interesting fact that many people do not know relates to the type of Windows image you need when you use .NET Core or the .NET Framework with containers. When you use the .NET Framework, you need a Windows Server Core image. This image is heavier than Windows Nano Server, which can have a direct impact on infrastructure and resource requirements. There are many reasons why Windows Nano Server is better, a few th…
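To make the image difference concrete, here is a hypothetical pair of Dockerfiles; the image tags are assumptions from around 2017 and change over time, so check Docker Hub for the current ones.

# Dockerfile for a .NET Framework app - tied to the heavier Windows Server Core base
FROM microsoft/dotnet-framework:4.7
COPY ./app/ .
ENTRYPOINT ["MyApp.exe"]

# Dockerfile for a .NET Core app - can run on the much smaller Nano Server base
FROM microsoft/dotnet:1.1-runtime-nanoserver
COPY ./app/ .
ENTRYPOINT ["dotnet", "MyApp.dll"]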

Microsoft Azure MVP 2017-2018

Another year has passed as a Microsoft Azure MVP, and I realized that this is my 6th year as a Microsoft MVP. It is a pleasure to be part of the MVP community, which has extraordinary people who are ready to help and offer support to communities all around the world.

I’m honored and excited to be part of this great community for one more year!

IoT offer comparison: AWS vs Azure

Nowadays IoT is appealing to everyone. These opportunities pushed the two biggest cloud providers on the market (Amazon and Microsoft) to come up with IoT platforms and solutions. The main purpose of this article is to compare the current solutions from a features and capabilities perspective.
The interesting thing that has happened in the last few years is the way IoT solutions evolved. At the beginning, the solutions were oriented around transport and communication, but now IoT platforms have evolved and are integrated with systems that run on the edge and in the cloud, supporting the business needs.

X-Rays
Let’s take a look at the available solutions that Amazon and Microsoft offer. Both providers offer a central hub that is used to establish and facilitate communication between devices and backend systems.
At the device level, each provider offers a set of client libraries that allows customers to integrate their devices with the communication platform faster. At ba…
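As a taste of those device-side libraries on the Azure side, here is a minimal sketch of sending one telemetry message with the Microsoft.Azure.Devices.Client SDK; the connection string and payload are placeholders.

using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class DeviceToCloudSketch
{
    static async Task SendTelemetryAsync()
    {
        // Placeholder connection string, taken from the IoT Hub device registry.
        DeviceClient device = DeviceClient.CreateFromConnectionString(
            "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>",
            TransportType.Mqtt);

        // The payload would normally come from real sensors.
        byte[] payload = Encoding.UTF8.GetBytes("{\"temperature\": 21.5}");
        await device.SendEventAsync(new Message(payload));
    }
}

AWS offers the equivalent through its AWS IoT Device SDKs.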

Dynamic update of Azure Web Job time schedule

The topic of this post is simple:
How can I specify the schedule of an Azure Web Job using the application configuration or another location?
Does the web job restart automatically the moment I change the time interval?

I decided to write about this because, even if you can find the solution on the internet, you need to invest some time searching until you find the right documentation for it (INameResolver).

Short version
A custom INameResolver is defined that, based on a key name, can read the configuration from any location. In the example below, the value is read from the configuration file. Don’t forget to register the name resolver on JobHostConfiguration.
namespace WebJob.Schedule
{
    class Program
    {
        static void Main(string[] args)
        {
            JobHostConfiguration config = new JobHostConfiguration();
            config.NameResolver = new MagicResolver();
            config.UseTimers();

            JobHost host = new JobHost(config);
            host.RunAndBlock();
        }

        private class …
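The excerpt stops at the resolver class; here is a minimal sketch of how it could continue, assuming the value is read with ConfigurationManager and that a %ScheduleInterval% key is used in the TimerTrigger (both names are illustrative):

using System.Configuration;
using Microsoft.Azure.WebJobs;

class MagicResolver : INameResolver
{
    public string Resolve(string name)
    {
        // Read the schedule (or any other token) from app.config;
        // a database, Key Vault or service call would work just as well.
        return ConfigurationManager.AppSettings[name];
    }
}

public class Functions
{
    // The %...% token is passed through the registered INameResolver,
    // so the schedule lives in configuration instead of in code.
    public static void ProcessTimer([TimerTrigger("%ScheduleInterval%")] TimerInfo timer)
    {
        // work to execute on each tick
    }
}

Note that the resolver runs at startup; on Azure App Service, changing an application setting restarts the process, so the new schedule is picked up automatically.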

Azure Data Lake and Replication mechanisms across Azure Regions

Context
Let’s imagine that we are working in an automotive company that collects telemetry data from its cars. Part of this data needs to be processed, stored and managed.
To be able to store data now and use it later in time, you decided to go with Azure Data Lake, which is not limited in how much data you can store and allows you to plug in any kind of processing system.
Requirements
After the architecture audit, because of legal constraints you are required to have a resiliency policy for disaster recovery. Even though Azure Data Lake keeps 3 copies of the data within the same Azure Region, the legal constraints still require such a policy.
Problem
Azure Data Lake makes 3 copies of the data in the same Azure Region, but there is no support to replicate or back up content to a different Azure Region. You will need to define your own mechanism for this.

Available Solutions
We can find many ways of doing this. There are 2-3 mechanisms to do replication of Azure …
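One hand-rolled option is to copy files between two accounts through the WebHDFS-compatible REST endpoints of Data Lake Store. A minimal sketch follows; the account names, path, token acquisition and even the exact query parameters are assumptions to verify against the ADLS REST documentation.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class AdlsReplicationSketch
{
    static async Task CopyFileAsync(string token, string path)
    {
        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", token);

            // Read the file from the account in the primary region.
            byte[] content = await http.GetByteArrayAsync(
                $"https://primaryadls.azuredatalakestore.net/webhdfs/v1{path}?op=OPEN&read=true");

            // Write the same bytes to the account in the secondary region.
            await http.PutAsync(
                $"https://secondaryadls.azuredatalakestore.net/webhdfs/v1{path}?op=CREATE&write=true&overwrite=true",
                new ByteArrayContent(content));
        }
    }
}

For production-size data you would rather drive a tool like Azure Data Factory or AdlCopy than copy byte arrays yourself.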

[Post Event] Event Sourcing and CQRS | ITCamp Community Summer Lunch Event in Cluj

Today we had the first ITCamp Community event during the lunch break. We decided to hold the event at this time of day because it was the only available slot for our special guest, Andrea Saltarello.
The talk was about CQRS and Event Sourcing, and even if it was only one hour long, the session contained a lot of takeaways, not only from a technical perspective but also from a costs and architecture point of view. A great comparison between different NoSQL and ESB systems was presented from an Event Sourcing point of view.

There were almost 30 people who decided to transform their lunch into a geek lunch together with the ITCamp Community. This event was made possible by the support of our local sponsors.


Below you can find pictures from the event. See you next time!




Azure Audit Logs and Retention Policies

Scope
In today's post we will talk about Azure Audit Logs and retention policies. Because retention policies differ from one industry to another, different approaches are required.
Audit Logs
From my past experience, I know that each company and department might understand a different thing when you say Audit Logs. I was involved in projects where, when you tag a log as audit, you would be required by law to keep it for 20-25 years. In this context, I think the first step for us is to define what an Audit Log is in Azure. In Azure, most audit logs are either an activity log or a deployment operation. The first one is closely related to any write operation that happens on your Azure resource (POST, PUT, DELETE). Read operations are not considered activity logs – but don't be disappointed, many Azure services also provide monitoring mechanisms for read operations (for example Azure Storage). The second type of audit is the one generated during a dep…
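To make the activity-log part concrete, here is a minimal sketch that pulls management events from the Azure REST API; the subscription id, token and filter value are placeholders.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ActivityLogSketch
{
    static async Task<string> GetActivityLogsAsync(string subscriptionId, string token)
    {
        // Activity Log (management events) endpoint of the Azure REST API.
        string filter = Uri.EscapeDataString("eventTimestamp ge '2017-01-01T00:00:00Z'");
        string url = $"https://management.azure.com/subscriptions/{subscriptionId}" +
                     "/providers/microsoft.insights/eventtypes/management/values" +
                     $"?api-version=2015-04-01&$filter={filter}";

        using (var http = new HttpClient())
        {
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", token);
            // JSON payload with the write operations (PUT/POST/DELETE) on your resources.
            return await http.GetStringAsync(url);
        }
    }
}

Because activity logs are kept by the platform only for a limited time (around 90 days), a 20-25 year retention policy means exporting them to your own storage.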

[Community Event] Event Sourcing and CQRS | ITCamp Community Summer Lunch Event in Cluj

At the end of this month (July 24) we will have a special guest in Cluj-Napoca: Andrea Saltarello. The format of the event will be different from the previous ones. The event will take place during the lunch break at The Office and is free.
If you want to find out more about the event, you can check the following registration links. See you at the event.

Meetup: https://www.meetup.com/ITCamp-Community/events/241394189/
Eventbrite: https://www.eventbrite.com/e/event-sourcing-and-cqrs-itcamp-community-summer-lunch-event-in-cluj-tickets-35994003032
ITCamp Community blog: https://community.itcamp.ro/2017/07/itcamp-community-summer-lunch-event-cluj-event-sourcing-cqrs/

Official announcement:
Let's try a different kind of event this summer. I propose to all of you to meet during the lunch break and have a talk about Event Sourcing and CQRS. There will be a special guest (Andrea Saltarello - Solution Architect at Managed Design) who will talk about his own experience on how we should manage …

Near real-time analytics for IoT technicians in the field - Azure Time Series Insights

Take a look around you and tell me if you see at least one smart device capable of sending data. There is a big chance that you'll have more than one around you. At this moment I have around me a laptop, a Surface, my Fitbit, a SmartTV and a Raspberry Pi fully equipped with weather sensors.
You might say, who cares about the data that is collected from them? Maybe nobody, or just ad companies. If you were on a production line, things would be different: you would like to be able to visualize this data from different perspectives, analyze it and find out why production fluctuated on a specific day.

Timebound
Data that is collected from different sensors and devices can contain a lot of parameters, like temperature, humidity, light and noise level. But in the end, when we want to visualize this data, the time information will be the one used on a chart to look at the data.
Try to imagine a chart where you put only temperature and humidity information, excluding the ti…
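As a small illustration, a telemetry event is essentially a timestamp plus a bag of measurements; the shape below is hypothetical, not the Time Series Insights schema:

using System;
using System.Collections.Generic;

class TelemetryEvent
{
    // The time dimension - the axis every chart is anchored on.
    public DateTime Timestamp { get; set; }

    // Sensor readings keyed by parameter name (temperature, humidity, noise, ...).
    public Dictionary<string, double> Measurements { get; set; }
}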

TOGAF® 9 Certification - Architecture Resources for exam preparation

In the last 3 weeks I wasn't active on my blog anymore. This happened because I decided to get certified in TOGAF 9.

What is TOGAF?
TOGAF (The Open Group Architecture Framework) is an architecture framework for enterprise architecture. The framework comes with support for designing, planning, implementing, governing and supporting an enterprise information technology architecture.
The core of this framework is the TOGAF ADM (Architecture Development Method), which describes the method for developing and managing the full lifecycle of an enterprise architecture.

Why TOGAF?
On the market we can find a lot of certificates and standards related to this subject. I decided to go with TOGAF because it is one of the frameworks that sits at the foundation of any company when you talk about enterprise architecture.
In addition to this, it is widely used in the banking, healthcare and life science industries. In comparison with other certificates, you cannot take this exam from your own laptop. You are requested to go…

Part 2 - Overengineering of a cloud application

In the last post we looked over a cloud solution designed to ingest small CSV files uploaded by users. These files were crunched by the system, which would generate static reports based on their content. Nothing fancy or complex.
The NFRs are light, because the real business value stays in the generated reports:

Under 200 users worldwide
Concurrency level is 10% (20 users online simultaneously)
Less than 15 CSVs uploaded in total per day
Basic reporting functionality
Current DB size 150MB (2M reporting entries)
DB forecast for the next 3 years is 1GB (20-25M reporting entries)
CSVs have up to 1,000 entries (maximum 10 columns)

The system that was designed for this application was a state-of-the-art system - scalable, robust, containing all the current technology trends. But of course it was over-engineered, too powerful and too expensive. Now, the biggest concern was how we could reduce the running cost of the system with a minimal impact (development cost). One of the drivers was that we had to come up…

Part 1 - Overengineering of a cloud application

N-tier architecture is seen nowadays as old-fashioned. People tend to migrate to event-based or microservice architectures. It's very common to see people who decide on an architecture based on the market trends, ignoring the requirements, business needs and budget.

When you combine this with Azure (or the cloud in general), you can easily end up with a microservice architecture that combines messaging systems and event-driven architectures. Of course, an N-tier application has a lot of disadvantages, but when you have a simple web application, there is no sense in creating a complex unicorn that will survive for 100 years.

I was shocked to review a solution, deployed in Azure 2 months ago, that from an architecture point of view was beautiful, but whose running and development costs became a nightmare.
I will not go into the details of the business requirements, but imagine a system that needs to display some static information and allow users to upload small CSV files that are consolidated in a reportin…

[Post Event] ITCamp Conference 2017 - Cluj-Napoca

What a week! The ITCamp Conference took place in Cluj-Napoca. Speakers from all around the globe joined forces with the ITCamp Community team and delivered high quality sessions. As in previous years, the topics covered by the ITCamp Conference spanned all technologies - JavaScript to containers, Azure to Raspberry Pi, OOP to Machine Learning.
The speaker list is pretty long and I invite you to check it out. The ITCamp Conference had Google employees, Principal Program Managers from Microsoft and, of course, a lot of architects and deeply technical people from the field.
Being part of such an event is a delight. Having live high quality sessions in Cluj-Napoca is a unique opportunity, offered by the ITCamp Community each year.
In figures, the ITCamp Conference looks very interesting - more than 40 speakers delivering 40+ sessions during the two days of the conference to more than 500 attendees.

What a great conference! What a week! Great sessions, great speakers, wonderful people - all of them in one…

[Past Event] DevTalks Cluj-Napoca 2017

This week I was invited to DevTalks to talk about cloud infrastructure and how we can isolate a cloud network from the public internet.
DevTalks, as a conference, is at its 3rd edition. This year there were 6 tracks in parallel covering the megatrends of 2017. It was a good conference, with great speakers and interesting sessions.
Below you can find content related to my session.

Title:
Network isolated inside a cloud environment
Abstract: 
Is it possible to create a private network inside a cloud environment that is fully isolated from the external world? If you want to find out the answer to this question, then you should join the session.
In addition to this, we will talk about how we can migrate existing infrastructure to the cloud (partially or fully) while preserving the same security level as before.
Slides:

Network isolated inside a cloud environment - Radu Vunvulea, DevTalks 2017, Cluj, Romania
Pictures:



Azure Cosmos DB | The perfect place for device topology for worldwide solutions

In the world of IoT, devices are distributed all around the world. The systems that are now on the market offer scalable and distributed communication between devices and our backends.

Context
It is not out of the ordinary to have an IoT solution distributed to 3 or 5 locations around the globe. But behind these performant systems we need a storage solution flexible enough for different schemas and, at the same time, powerful enough to scale to whatever size we want.

Relational databases are often used to store data where we have a stable schema. Having different devices across the globe requires different schemas and data formats, and storing such data in non-relational databases is more natural and simple. A key-value, graph or document database is many times more suitable for our needs in IoT scenarios than a relational database.
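For instance, here is a minimal sketch of storing one device document with the DocumentDB .NET client; the endpoint, key, database and collection names are placeholders.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents.Client;

class DeviceTopologySketch
{
    static async Task SaveDeviceAsync()
    {
        var client = new DocumentClient(
            new Uri("https://<account>.documents.azure.com:443/"), "<key>");

        // Schema-free document: each device type can carry its own attributes.
        await client.CreateDocumentAsync(
            UriFactory.CreateDocumentCollectionUri("iotdb", "devices"),
            new { id = "device-001", region = "EU", firmware = "1.0.3" });
    }
}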

Current solutions
There are plenty of solutions on the market that are fast, powerful and easy to use. I expect that you have heard…

Azure Key Vault | How Secrets and Keys are stored

I'm pretty sure that most of you have heard about Azure Key Vault. If not, I recommend taking a look over this page, which describes in detail how Azure Key Vault helps us as a safeguard for our application secrets and cryptographic keys (like certificates).

Scope
The main scope of this post is to take a look at how our secrets are stored. This is important because there are keys that cannot be recovered once generated or stored, and we might end up without keys in case we lose them.

What is HSM?
HSM is an acronym for Hardware Security Module. It is a physical device that can manage digital keys and provide cryptographic capabilities. The HSM plays the role of a safeguard by offering cryptographic capabilities directly in hardware.

Is the tuple <keys, secrets> stored inside the HSM?

No, there is no need to store this information in the HSM. Secrets are stored outside the HSM, but they are encrypted using a key chain that terminates inside the HSM.
An analogy related to key chains an…
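To close the loop, here is a minimal sketch of reading a secret back with the Key Vault .NET client; the vault URL, secret name and token acquisition are placeholders.

using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;

class KeyVaultSketch
{
    static async Task<string> ReadSecretAsync()
    {
        // The callback obtains an Azure AD token for the vault (details omitted).
        var kv = new KeyVaultClient(
            new KeyVaultClient.AuthenticationCallback(GetTokenAsync));

        // The caller never touches the HSM-protected key chain; it only
        // receives the decrypted secret value over TLS.
        var secret = await kv.GetSecretAsync("https://<vault>.vault.azure.net/", "my-secret");
        return secret.Value;
    }

    static Task<string> GetTokenAsync(string authority, string resource, string scope)
    {
        // Placeholder: acquire the token with ADAL here.
        throw new System.NotImplementedException();
    }
}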