

Showing posts from January, 2017

Azure Regions products and services availability

The cloud is evolving. Nowadays we can say that cloud infrastructure is mature enough to support most business scenarios on the market.

This evolution has also changed how we see the cloud, where data centers (regions) are available, and who owns them. Microsoft is the best example of this. At this moment Microsoft has three types of Azure regions:

- Hosted and managed by Microsoft - publicly available to all of us
- Azure Government Regions - available exclusively to government (solution providers and customers that work with government)
- Azure "Private" Regions - special Azure regions where the Azure infrastructure runs, but is controlled and owned by a data trustee, as in China or Germany
New times, new problems. At this moment Microsoft has 32 Azure regions, with 6 more already announced (coming soon). Each Azure region has its own update and delivery timeline. This means that when a new service is launched, it will not be available in all 32 Azure regions at once.
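To make the per-region rollout concrete, here is a minimal sketch of checking where a service is already available. The region and service names below are illustrative sample data, not an authoritative Azure catalog:

```python
# Hypothetical availability data -- region and service names are
# illustrative sample data, not the real Azure service catalog.
AVAILABILITY = {
    "West Europe": {"IoT Hub", "Stream Analytics", "Blob Storage"},
    "North Europe": {"IoT Hub", "Blob Storage"},
    "Germany Central": {"Blob Storage"},
}

def regions_with_service(service, availability=AVAILABILITY):
    """Return the regions where a given service is already rolled out."""
    return sorted(r for r, services in availability.items() if service in services)

# A newly launched service typically shows up in only a few regions at first.
print(regions_with_service("Stream Analytics"))  # -> ['West Europe']
print(regions_with_service("IoT Hub"))           # -> ['North Europe', 'West Europe']
```

In practice you would check the official Azure "products available by region" page rather than a hardcoded table; the sketch only shows why region choice matters when you depend on a new service.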

In-memory data protection (encryption)

In this post we'll talk about memory encryption and why we need to care about it.

With EU regulations, IT solutions need to secure citizens' data end to end (E2E). This means that we need to offer not only secure storage, secure connections between different nodes, and encryption at the payload level, but also a secure mechanism for storing sensitive information in memory.

You might say that this is not important if the servers where data is processed are secured. Yes, in an ideal world. But there is no bug-free software.
This means that the M&S (Maintenance and Support) team that has direct access to that server might need to take a memory dump. The memory dump should be encrypted, shared only with a limited number of people, and so on. They could even share the memory dump with external third parties that offer support for other systems or software used on that machine.
For this reason we need to ensure that data privacy is respected.
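The pattern can be sketched as follows: keep the sensitive value encrypted while it sits in memory and decrypt it only at the moment of use, so a memory dump exposes ciphertext rather than plaintext. The SHA-256 counter keystream below is a toy for illustration only; a real system would use a vetted library (e.g. AES-GCM from the `cryptography` package) and keep the key outside the dumped process (HSM, enclave):

```python
import hashlib
import secrets

class InMemorySecret:
    """Toy sketch: store a secret encrypted in memory, decrypt on demand.
    Illustration only -- use a vetted crypto library in production, and
    keep the key outside the process that might be dumped."""

    def __init__(self, plaintext: bytes):
        self._key = secrets.token_bytes(32)           # per-object random key
        self._blob = self._xor_keystream(plaintext)   # only ciphertext is kept

    def _xor_keystream(self, data: bytes) -> bytes:
        # XOR the data with a SHA-256 based counter keystream (symmetric op).
        out = bytearray()
        for block in range((len(data) + 31) // 32):
            ks = hashlib.sha256(self._key + block.to_bytes(8, "big")).digest()
            chunk = data[block * 32:(block + 1) * 32]
            out.extend(b ^ k for b, k in zip(chunk, ks))
        return bytes(out)

    def reveal(self) -> bytes:
        return self._xor_keystream(self._blob)

secret = InMemorySecret(b"patient-id-4711")
print(secret.reveal())  # plaintext only exists transiently, at the call site
```

A dump taken between uses would contain `_blob`, not the citizen data itself; the remaining weakness (the key being in the same process) is exactly why hardware-backed key storage matters here.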

If the server is …

Microservices and Team Setup - It's not only about code

Solutions based on a microservices architecture have become a commodity. Teams are no longer afraid when they find out that the next project is based on a microservices architecture.
The transition from messaging-based architectures to microservices architectures was very useful for teams to understand the main concepts and how a solution should look.
Trigger
Working with the different frameworks and tools specific to microservices makes a team curious and happy at the beginning. Most people are happy when they have time allocated to learn new stuff. But after this phase, I usually see a collapse. The moment we are ready to start writing 'production' code, people start to have hiccups, and each of them runs into problems when they need to deploy, run their code in containers, and so on.
Cause
It is easy to forget or ignore that microservices are not only at the service/code level. They are also at the infrastructure level. There is a lot of scrip…

[IoT Home Project] Part 7 - Read/Write data to device twin

In this post we will discover how to:

- Build a tool that can read reported properties from the device twin and set desired properties
- Read how often sensor data is reported to the backend, from the desired properties
- Write how often sensor data is reported by the device, using the reported properties

Previous post: [IoT Home Project] Part 6 - Stream Analytics and Power BI
GitHub source code: 

What is a device twin
It's a collection of properties that are specific to each individual device. A device twin contains 3 types of structures (where the last two are similar, but have different roles):

- Tags: Device-specific information that resides on Azure IoT Hub and never reaches the device
- Desired Properties: Properties set on Azure IoT Hub (by the backend) and delivered to the device
- Reported Properties: Properties set by the device and delivered to Azure IoT Hub (to the backend)

You might find it odd that there are 2 types of properties/structures in the device twin – tags and desired/…
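The three structures above can be sketched with a plain JSON document shaped like a device twin (the tags/desired/reported split follows the description above; property names such as `sensorPollingRate` are illustrative, not from the post):

```python
import json

# Sketch of a device twin document. The tags / properties.desired /
# properties.reported layout mirrors the three structures described
# above; the concrete property names are hypothetical.
twin = json.loads("""
{
  "tags": { "location": "living-room" },
  "properties": {
    "desired":  { "sensorPollingRate": 60 },
    "reported": { "sensorPollingRate": 30 }
  }
}
""")

# The device reads how often it should report from the desired properties...
interval = twin["properties"]["desired"].get("sensorPollingRate", 30)

# ...applies the value, then acknowledges it back via reported properties.
twin["properties"]["reported"]["sensorPollingRate"] = interval

print(interval)  # -> 60
```

Note how the backend never writes reported properties and the device never writes tags; each side owns its half of the twin, which is what makes the desired/reported pair work as a request/acknowledge loop.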

Azure IoT Hub and message priority

Azure IoT is evolving. As I said last year, with each week that passes new features are added to Azure IoT Hub that consolidate the service and help us have a better connection with our devices.

I have often seen requirements specifying that some alerts need to be sent to/from a device with higher priority than the rest of the messages. There is no perfect workaround, and each use case might require a different approach. We will also discover that a new functionality offered by Azure IoT Hub covers one of the flows. There are two main flows where priority messages may be required:

- Device to Backend communication (Device2Cloud - D2C)
- Backend to Device communication (Cloud2Device - C2D)

We will discuss each flow separately. At the end of the post we will map the workarounds and the current solutions onto the flows.
Message priority on Device to Backend communication
The use case is not complex. We need to be able to send messages from the device to our backend with different priorities. For…
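One common device-side workaround (a sketch of the general pattern, not an IoT Hub feature) is to queue outgoing messages locally and always drain alerts before regular telemetry. The priority levels and payloads below are illustrative:

```python
import heapq
import itertools

ALERT, TELEMETRY = 0, 1          # lower number = higher priority
_counter = itertools.count()     # tie-breaker keeps FIFO order per priority

outbox = []

def enqueue(priority, payload):
    """Queue a message locally before it is sent to the backend."""
    heapq.heappush(outbox, (priority, next(_counter), payload))

def drain():
    """Yield payloads in send order: all alerts first, then telemetry."""
    while outbox:
        yield heapq.heappop(outbox)[2]

enqueue(TELEMETRY, "temp=21.5")
enqueue(ALERT, "smoke detected")
enqueue(TELEMETRY, "humidity=40")

print(list(drain()))  # -> ['smoke detected', 'temp=21.5', 'humidity=40']
```

The trade-off is that priority is only enforced on the device's own outbox; once messages reach IoT Hub, the default D2C pipeline processes them in arrival order, which is why some use cases push high-priority alerts through a separate channel instead.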

[IoT Home Project] Part 6 - Stream Analytics and Power BI

In this post we will discover how to:

- Store all sensor data into blob storage
- Calculate the average temperature and humidity for every 1-minute time span
- Store the average temperature and humidity information in Azure Tables
- Send the average temperature and humidity information to Power BI
- Create reports in Power BI that display information for the current day
- Create a web application that displays Power BI reports

Previous post: [IoT Home Project] Part 5 - Send data to Azure IoT Hub, control time interval and refac the configuration information
GitHub source code:
Store all sensor data into blob storage
Once we have sensor data in Azure IoT Hub, we can do anything with it. Let's create a Stream Analytics job that takes sensor data from Azure IoT Hub and pushes it to Azure Blob Storage. Blob Storage is a very cheap place to store data, and it is perfect for bulk data. The first step when you create a Stream Analytics job is to …
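The 1-minute average the job computes can be sketched in plain Python as a tumbling window over timestamped readings. This mirrors the logic of a Stream Analytics tumbling window aggregation; the sample readings and field layout are illustrative:

```python
from collections import defaultdict
from statistics import mean

# (unix_timestamp_seconds, temperature, humidity) -- sample readings
readings = [
    (0,  21.0, 40.0),
    (30, 23.0, 42.0),
    (65, 25.0, 50.0),
    (90, 27.0, 52.0),
]

def tumbling_average(rows, window_seconds=60):
    """Group readings into fixed, non-overlapping windows and average
    each window -- the same idea as a 1-minute tumbling window in a
    Stream Analytics query."""
    buckets = defaultdict(list)
    for ts, temp, hum in rows:
        buckets[ts // window_seconds].append((temp, hum))
    return {
        w * window_seconds: (mean(t for t, _ in vals), mean(h for _, h in vals))
        for w, vals in sorted(buckets.items())
    }

print(tumbling_average(readings))
# -> {0: (22.0, 41.0), 60: (26.0, 51.0)}
```

Tumbling windows never overlap, so each reading contributes to exactly one average; that is what makes them the natural choice for "one value per minute" outputs feeding Azure Tables and Power BI.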