

Showing posts from 2017

Control Azure Users Access using Role-Based Access Control

Problem
As a customer, I want to be able to restrict user access and rights to Azure Resources that are under the same Azure Subscription.
The requirement can be extended a little more: a user needs to be able to view and access only the Azure Resources that he is allowed to see, and he should be able to create or modify only specific resources. If possible, all resources that the user accesses or creates should be under a predefined subnet.
Options
There are multiple approaches to a request like this. Even though Role-Based Access Control is a powerful mechanism to control access, the ways in which we can restrict access are limited and don't allow us to restrict the user as fully as we need. A classical approach is to allow users to run ARM (Azure Resource Manager) scripts only through a custom component that is hosted by us. Having full control over this component, we can implement any kind of logic and business restriction. The downside of a solution like this is complexity and cost.…

[Post Event] Global Azure Bootcamp event, Cluj-Napoca, April 22nd, 2017

For the 5th year in a row, the ITCamp community from Cluj-Napoca joined the Global Azure Bootcamp event. Compared with other community events organized in Cluj-Napoca, Global Azure Bootcamp is different because it is purely hands-on labs.
There were 3 different workshops of 90 minutes each, where attendees had the opportunity to play with and test Azure Functions, Machine Learning and Azure Resource Manager (ARM). This year we also kept the tradition of organizing the event at the Endava (ISDC) building, a location perfect for hands-on labs. A special thank you to our local sponsor - Endava.

09:00-09:30 - Arrival of the participants
09:30-11:00 - Azure Functions (Radu Vunvulea)
11:00-12:30 - Machine learning for mere mortals with Azure ML (Silviu Niculita)
12:30-14:00 - ARM Templates, how to create them, and use them in your CD pipeline (Florin Loghiade)

Workshops descriptions
Azure Functions (Radu Vunvulea) 

What are Azure Functions? The AWS Lambda of Azure. This is the fastest w…

[Post Event] Microsoft Data Amp—where data gets to work meet-up

Together with the ITCamp community from Cluj-Napoca we joined forces for 2 hours. We wanted to find out more about how we can transform our business with our data - Microsoft Data Amp.
A lot of new things were announced during the live streaming, which can be viewed on
If you ask me, the game changers are SQL Server 2017, which runs on Linux, and the full integration between SQL Server and other stacks like R. Nowadays you don't need to move your data outside the database engine to execute ML. You have full ML support directly in the database engine.

Running Azure Stream Analytics on Edge Devices

Azure Stream Analytics enables us to analyze data in real time. Connecting multiple streams of data and running queries on top of them without having to deploy complex infrastructure becomes possible using Azure Stream Analytics.
This Azure service is extremely powerful when you have data in the cloud. But there was no way to analyze the data stream at device or gateway level.
For example, in a medical laboratory you might have 8 or 12 analyzers. To analyze the counters of all these devices and detect a malfunction, you would need to push the counter values to Azure, even if you don't need them there for other use cases.

Wait! There is a new service in town that enables us to run the same Azure Stream Analytics queries at gateway or device level - Azure Stream Analytics on Edge Devices. Even though the name is long and contains Azure, it is a stand-alone service that runs on-premises.
This service allows us to run the same queries in real time over data streams that w…
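The laboratory scenario above can be simulated locally with a small sketch: a sliding-window average over each analyzer's error counters, which is the kind of check a Stream Analytics query could express at the edge. The window size and threshold below are illustrative assumptions, not values from the post.

```python
from collections import deque

WINDOW_SIZE = 5       # assumed number of readings per sliding window
MAX_AVG_ERRORS = 10.0 # assumed alert threshold for the moving average

def detect_malfunction(error_counts, window_size=WINDOW_SIZE, limit=MAX_AVG_ERRORS):
    """Return the indexes where the moving average of error counters exceeds the limit."""
    window = deque(maxlen=window_size)
    alerts = []
    for i, count in enumerate(error_counts):
        window.append(count)
        if len(window) == window_size and sum(window) / window_size > limit:
            alerts.append(i)
    return alerts

# Counters from a single analyzer; the spike near the end raises alerts.
readings = [1, 2, 1, 3, 2, 2, 30, 40, 35, 2]
print(detect_malfunction(readings))  # → [7, 8, 9]
```

Running this logic at the gateway means only the alert, not every raw counter value, has to travel to the cloud.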

Global Azure Bootcamp in Cluj-Napoca | April 22, 2017

This is the fifth year in which the ITCamp community organizes Global Azure Boot Camp. This is a global event that takes place in more than 159 locations. Like last year, Cluj does not fall behind and appears on the Azure map. On April 22nd we invite you all to this event in Cluj-Napoca, which will include 3 workshops. Registration link: Participation in the event is FREE, as it has been so far for every event organized by the ITCamp community. On April 22nd we plan to have 3 workshops of 90 minutes each, where we can learn together how to use different Azure services. Each workshop includes a theoretical part and a practical one. Because of this, you will need a laptop. What you need: Laptop + Visual Studio + Microsoft Azure SDK + An Azure account. If you want, you can team up in groups of 2-3 people per laptop. Registration li…

[Post-Event] Cluj ITCodecamp 2017: Azure Data Lake for super-developers

The best way to end a weekend is with a conference. So, this is what I did. At the end of this weekend I joined the Codecamp conference in Cluj-Napoca. As a free event, Codecamp gathers IT speakers and attendees from all technologies and companies, being an opportunity to meet other people.

In figures, there were around 900 people, 60 speakers and around 70 sessions - all of this in one day. At this conference I delivered a high-level session about Azure Data Lake. Being a new service, I decided that a high-level session about it was the best approach. Below you can find more about my session.

Azure Data Lake for super-developers
Nowadays digital information is produced by every device that we touch. What we do after we receive this data can change the way we do business, but to be able to store and process this data we need a powerful system like Azure Data Lake. In this session we will discover the secrets behind Azure Data Lake and why this service should be…

SQL Server v.Next - Data, analytics and artificial intelligence live event - April 19th, 2017

The IT landscape is changing every day. Nowadays it is more usual than ever to talk about data collecting, processing and visualization. If a few years ago we used to talk in TB of data, now PB is a normal data unit.
I remember that 10 years ago having a database bigger than 1GB was acceptable, but not very common for a small or medium-sized company. If we look around us now, a database of 20GB is a commodity. Nobody will think the database is too big or that you need special hardware for it.
Security has now become the biggest concern; where you persist data, what data you persist, and who accesses what data are the questions that you need to face.
5 years ago, if you had said that you wanted to run SQL Server on Linux... it would have been a joke, but look where we are now. We have SQL Server on Linux, we have containers and Docker, and we see how Microsoft embraces the open source world.
We are excited to find out more about the new features that SQL v.Next is preparing for us. It's not only abou…

[Post-Event] April BucharestJS Meetup #21 | Building Node.JS Together (with Microsoft)

This week I had the opportunity to be invited by the BucharestJS meetup to speak about Node.JS from a Microsoft perspective.
It was a challenge for me - speaking about Microsoft in front of Node.JS people is not an easy task. I was happy to survive an afternoon with around 80 people with no mercy for Microsoft and Azure. I'm sure that some of them will give Visual Studio Code and/or Azure services a try.

For me it was a pleasure to find people who were open to discovering 'exotic' IDEs and products. Thank you to all the attendees who agreed for one hour to hear about Microsoft technologies.
More information related to the session, slides and demos can be found below.

Title: Building Node.js Together
Abstract: JavaScript's simplicity, ubiquity, and event-driven paradigm, and its high performance runtimes like Microsoft's Chakra and Google's V8, have led it to spread beyond web pages and browsers to servers and clouds, IoT devices, mobile apps, and more via the Node.js …

Monitor cost consumption across Azure Subscriptions at Resource Group level

In this post we'll look at different mechanisms that can be used to monitor Azure consumption across subscriptions. We will take a look at the out-of-the-box solutions that are currently available, what their limitations are, and a custom solution that could be used for this purpose.

Define the problem
We need a mechanism that can offer information related to Resource Group consumption. We need to be able to provide the total cost of all Resource Groups from all subscriptions with the same characteristic (let's say, with a specific tag).
The Azure Subscriptions that need to be monitored are under multiple EAs (Azure Enterprise Agreements). Some of the Azure Subscriptions are not under EA accounts.
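The aggregation we need can be sketched in a few lines. The records below are invented for illustration; in practice they would come from a billing export or API, but the grouping logic is the same: sum the cost of every Resource Group, from any subscription, that carries a given tag.

```python
# Hypothetical records; real data would come from the EA billing feed or a usage export.
def total_cost_by_tag(resource_groups, tag_key, tag_value):
    """Sum the cost of all Resource Groups (any subscription) with a matching tag."""
    return sum(
        rg["cost"]
        for rg in resource_groups
        if rg.get("tags", {}).get(tag_key) == tag_value
    )

resource_groups = [
    {"subscription": "sub-a", "name": "rg-web",  "tags": {"project": "shop"}, "cost": 120.5},
    {"subscription": "sub-b", "name": "rg-db",   "tags": {"project": "shop"}, "cost": 310.0},
    {"subscription": "sub-b", "name": "rg-test", "tags": {"project": "labs"}, "cost": 40.0},
]
print(total_cost_by_tag(resource_groups, "project", "shop"))  # → 430.5
```

The hard part of the problem is not this aggregation but getting consistent cost records out of multiple EA and non-EA subscriptions in the first place.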
Out of the box solution
Microsoft Azure Enterprise content pack for Power BI
At the moment this post was written, Microsoft offered, for all Azure Subscriptions under the same EA, full integration with Power BI - the Microsoft Azure Enterprise content pack for Power BI.
This p…

VM Creation: Custom Scripts vs Custom Images

When we need custom applications or configuration to be done on the VM we can do this on Azure in two ways:
Custom ISO
Custom script extensions (also known as Formula in the DevLabs context)
I noticed that a recurrent question appears in discussions with different people: when should I use a custom ISO vs custom script extensions?
Before jumping into a discussion where we compare these two options and the advantages/disadvantages of each, let's see what steps are involved in creating a script or an ISO.

Custom ISO
We can create a custom ISO on our local machine, with all our applications installed on it. Once we have the ISO created, we just need to take our VHD and prepare it for Azure. More about these steps can be found in the Microsoft documentation (Capture a managed image of a generalized VM in Azure and Create custom VM images).
Custom scripts extensions
Custom scripts are executed after the VHD is deployed on the VM. Using these scripts we can push or install any k…

ARM Scripts - Extending T-shirt size concept

Working with Azure Resource Manager (ARM) deployment scripts, as with any other scripts, can be a challenge, especially at the moment when you want to run a deployment script.

How often have you discovered that to be able to run a script you need to specify a lot of parameters? The happy case is when a default value is already specified; even if you don't know what happens behind the scenes, you don't care and only 'click the next button'.
I observed that the number of parameters is directly connected to the size and complexity of the deployment. Beyond a specific threshold, the number of parameters without a default value is so high that it becomes almost impossible to run the scripts.
Because of this, complex pre-deployment steps require a lot of time, especially when it is the first time you make that deployment or when something changes.
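The T-shirt size idea from the title can be sketched as a lookup: instead of asking the user for every parameter, expose one 'size' value and expand it into a full, consistent parameter set. The sizes, VM SKUs and values below are assumptions for illustration, not the post's actual template.

```python
# Hypothetical size catalog; each entry expands into a complete parameter set.
TSHIRT_SIZES = {
    "S": {"vmSize": "Standard_A1", "instanceCount": 1, "diskSizeGb": 64},
    "M": {"vmSize": "Standard_D2", "instanceCount": 2, "diskSizeGb": 128},
    "L": {"vmSize": "Standard_D4", "instanceCount": 4, "diskSizeGb": 512},
}

def expand_size(size, overrides=None):
    """Resolve a T-shirt size into concrete deployment parameters."""
    params = dict(TSHIRT_SIZES[size])
    params.update(overrides or {})  # still allow explicit per-parameter control
    return params

print(expand_size("M"))
print(expand_size("M", {"instanceCount": 3}))
```

The override dictionary keeps the door open for power users while everyone else only ever chooses S, M or L.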

I remember one time I saw a deployment script written in ARM and PowerShell that was state of the art in the way it was designe…

[IoT Home Project] Part 9 - Extending Azure Function to support Thieves Alarm

In this post we will discover how to:
Crunch distance (sonar) information produced by a GrovePi sensor and send it, using a Raspberry Pi and Azure IoT Hub, to the backend
Add filters on top of Service Bus Topic Subscriptions to receive only information related to distance and temperature on each subscription
Add new functionality to the current portal to be able to display alarms and notify our user
Story
At this moment we have a system that is able to collect metrics from the Raspberry Pi and process some of the data. We already have, from GrovePi, a sensor that calculates the distance from the sensor to an object. Why not use it to detect if something is moving in front of it and create a simple alarm system?
Yes, this is not the best sensor that we could use, but this scenario is a great way to learn something new.
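The alarm idea reduces to one check: if the measured distance suddenly deviates from the calibrated empty-room baseline, something is in front of the sensor. The baseline and tolerance values below are assumptions for illustration, not the project's real settings.

```python
def is_intrusion(distance_cm, baseline_cm, tolerance_cm=10.0):
    """True when the sonar reading deviates from the empty-room baseline."""
    return abs(baseline_cm - distance_cm) > tolerance_cm

baseline = 150.0  # assumed distance to the opposite wall when the room is empty

print(is_intrusion(148.0, baseline))  # small jitter → False
print(is_intrusion(60.0, baseline))   # object in front of the sensor → True
```

In the real pipeline this comparison would run in the Azure Function over the distance readings arriving from the Service Bus subscription.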

Previous post: [IoT Home Project] Part 8 - Connecting to Azure Function and to a virtual heat pump
GitHub source code:

What we have until now

[IoT Home Project] Part 8 - Connecting to Azure Function and to a virtual heat pump

In this post we will discover how to:
Push content from Azure Stream Analytics to Azure Service Bus
Write an Azure Function in Node.JS that fetches data from an Azure Service Bus Topic and pushes it to Azure Table
Develop an ASP.NET Core application that plays the role of a heating system that starts/stops the heating in a house
Story:
Use temperature data collected from sensors connected to the Raspberry Pi to start/stop a heating system in a house
Previous post: [IoT Home Project] Part 7 - Read/Write data to device twin
GitHub source code: 

Push content from Azure Stream Analytics to Azure Service Bus
This step is the simplest one. We just need to add a new output to Stream Analytics and specify the Service Bus Topic where we want to push data. After this, we'll need to update the Stream Analytics query by adding: SELECT * INTO outputSensorDataTopic FROM avgdata, where 'outputSensorDataTopic' is the name of the output that we crea…

ARM and Resources Policies

Managing an Azure Subscription where 10 or 20 people have access can become a nightmare after a few weeks. You will end up with people who create resources with the wrong name or in the wrong location.
You might want to control what kind of resources can be created in a specific location, what names should be used, or what tags need to be set for each resource. I remember once we even had a resource created in another subscription, which was discovered only after almost one year.

What about Role-Based Access Control (RBAC)? Aren't they the same...
RBAC and Resource Policies focus on different things. They are complementary, and together they give you the ability to have control at all levels. Role-Based Access Control focuses on managing the rights and actions that a user can perform. In contrast, using resource policies you can enforce naming conventions at resource level, what resources can be created, in what Azure Regions, and so on.
A good example of using RBAC and resource policies is when you gi…

Blue and Red Network Topology in Azure (Virtual Network without internet access)

When you create the infrastructure of a system, it is common to have two separate networks. One has access to the public internet and hosts the front-end systems (web pages, web services). The other network has no internet access and is better protected from intruders. In this kind of secure network, backend services, databases and other private services are hosted.

In this post we will use the following references:

Blue Network - the network that has direct access to the internet
Red Network - the network that doesn't have access to the internet and can be accessed only from internal systems

As we can see above, from the Public Internet a request goes to the front-end layer (Blue Network). Systems that run in the Blue Network can make direct calls to the Red Network systems. But a request cannot go from the Public Internet directly to the Red Network. The Red Network configuration (e.g. routers, firewalls) doesn't allow requests from the Public Internet.
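The traffic rules described above can be reduced to a small lookup table. This is only a model of the policy; real enforcement happens with routing, firewalls and network security groups in Azure, and the flow names below are assumptions taken from the Blue/Red convention of this post.

```python
# Allowed (source, destination) pairs under the Blue/Red topology.
ALLOWED_FLOWS = {
    ("internet", "blue"),  # public requests may reach the front-end layer
    ("blue", "red"),       # front-end systems may call backend services
    ("red", "red"),        # backend services may talk to each other
}

def is_allowed(source, destination):
    """Check a traffic flow against the topology's policy."""
    return (source, destination) in ALLOWED_FLOWS

print(is_allowed("internet", "blue"))  # → True
print(is_allowed("internet", "red"))   # → False
```

Writing the policy down this explicitly also makes it easy to review: any pair not in the table is denied by default.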

Can we have such a network topolog…

Azure Load Balancers - HTTP/External/Internal/Global load balancers

In this post we'll take a look at the different types of load balancers that are available in Azure. The main scope is to understand the role of each load balancer and when we should use each type.

What is a Load Balancer?
It is a system that distributes the workload across multiple instances.
For more information on how it works and what the base principles are, I invite you to take a look at Wikipedia -
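The definition above can be sketched in a few lines. Round-robin, shown here, is only one possible distribution strategy, and the instance names are invented for illustration.

```python
import itertools

# A load balancer reduced to its essence: spread incoming requests
# across a pool of instances, here in simple round-robin order.
class RoundRobinBalancer:
    def __init__(self, instances):
        self._cycle = itertools.cycle(instances)

    def next_instance(self):
        """Pick the instance that should handle the next request."""
        return next(self._cycle)

lb = RoundRobinBalancer(["vm-1", "vm-2", "vm-3"])
print([lb.next_instance() for _ in range(5)])  # → ['vm-1', 'vm-2', 'vm-3', 'vm-1', 'vm-2']
```

The Azure load balancers below differ mainly in where this decision is made (HTTP layer, inside a virtual network, at the public edge, or across regions), not in the core idea.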

What types of Load Balancer are available on Azure?
Personally, I would say that there are 4 types of load balancers available on Azure:

HTTP based Load Balancer
Internal Load Balancer
External Load Balancer
Global Load Balancer
Each of them can be used in different situations and they never exclude each other. This means that you might need to use multiple types of Load Balancer, based on what you want to achieve.
HTTP based Load Balancer (Application Gateway)
The most common one in our d…

Why back-off mechanisms are critical in cloud solutions

What is a back-off mechanism?
It's one of the most basic mechanisms used for communication between two systems. The core idea of this mechanism is to decrease the frequency of requests sent from system A to system B when there is no data or when communication issues are detected.
There are multiple implementations of this mechanism, but I'm 100% sure that you use it directly or indirectly. A retry mechanism that increases the wait period is a back-off mechanism.
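A minimal sketch of such a mechanism: exponential back-off with an optional jitter, where every failed (or empty) poll doubles the wait up to a cap, so an idle or unhealthy dependency is not hammered with billable requests. The base, factor and cap values are illustrative assumptions.

```python
import random

def backoff_delays(base=1.0, factor=2.0, max_delay=60.0, attempts=6, jitter=0.0):
    """Return the sequence of wait times (seconds) before each retry."""
    delays = []
    delay = base
    for _ in range(attempts):
        # Cap the delay, then add optional random jitter to avoid synchronized retries.
        delays.append(min(delay, max_delay) + random.uniform(0, jitter))
        delay *= factor
    return delays

print(backoff_delays())  # → [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]
```

With per-request billing, the difference between polling every second and backing off to once a minute shows up directly on the monthly bill.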

Why it is important in cloud solutions?
In contrast with a classical system, a cloud subscription will come at the end of the month with a detailed bill that contains all the costs.
People often discover too late that there are many services where you also pay for each request (transaction) made to that service. For example, each request made to Azure Storage or Azure Service Bus is billable. Of course, the price is extremely low - Azure Service Bus costs around 4 cents for 1,000,000 requests - but when you ha…