
Azure Functions - Features Review

In the last two posts about Azure Functions we talked about how we can write functions that process images from OneDrive, and about the current integration with Visual Studio and Continuous Integration.
In this post we will take a look at the main features and the pricing model.

Pay-per-use
Personally, this is one of the strongest points of Azure Functions. You pay only for what you use; when your code is not running and is not being called by an external trigger, you don't pay anything. You cannot ask for more from a hosting perspective.
When you have zero clients, you pay zero. When you have n clients, you pay for each of them. The pricing model is described in more detail later on.

Multiple-languages support
Support is not limited to one or two languages. At this moment C#, F#, JavaScript (via Node.js), PHP, Python, Batch, and Bash are supported.
Another interesting feature is that you can bring any executable and run it as an Azure Function. An interesting concept, which I promise to follow up on in the future.

Source Control and CI
There is full integration with TFS and Git. You can use triggers to deploy automatically from different branches. You can even deploy the functions from OneDrive or Dropbox.

External Dependencies
I was surprised when I discovered that I can use NuGet (or NPM) packages inside an Azure Function. In this way you can use predefined libraries.
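For C# functions, NuGet packages can be referenced from a project.json file placed next to the function code; a sketch (the package name and version are just an example):

```json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Newtonsoft.Json": "10.0.3"
      }
    }
  }
}
```

Once the file is saved, the runtime restores the packages and the library can be referenced directly from the function code.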

SaaS Integration fully supported
Even if the naming is not clear, translated it means: integration and hooks with external services that can be part of the Azure platform or not.
For example, we have integration with Azure Event Hubs, DocumentDB, Storage (Object :) ) and so on, but also with external service providers like Google Drive.

Triggers
The list of triggers is pretty long, ranging from timer triggers to message and GitHub triggers. The one that I like most is the webhook trigger, which enables us to integrate an Azure Function with our own services.
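The trigger (and the other bindings) of a function are declared in a function.json file next to the code; a sketch of an HTTP/webhook-style trigger, with illustrative values:

```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "authLevel": "function",
      "methods": [ "post" ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ],
  "disabled": false
}
```

Swapping the trigger type (for example to a timer or queue trigger) is a matter of changing this declaration, not the hosting setup.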

Security
There is full support for OAuth, with out-of-the-box integration with Facebook, Google, Twitter, Azure AD, and Microsoft accounts.

Service Plans
There are two types of service plans available for Azure Functions.

App Service Plan
The classic one - the App Service Plan - is the one that we already know and are used to from Web Apps. When you select this plan, your Azure Functions will run in your own App Service Plan and will use the resources that are available in it.
Azure Functions will scale automatically within your App Service Plan as long as there are enough resources. It is the right place to run your functions when the load on them is constant or when you want resource reservation.
The downside of this service plan is that you pay for it even if you don't run the functions. This happens because with an App Service Plan you pay for the VMs that you have in the cluster.
This plan is very appealing if, besides Azure Functions, you have Web Apps and other applications running. You could put all of them inside the same App Service Plan; it can be a great solution for systems where the load is constant and a good load forecast can be made.

Consumption Plan
The mindset is 180 degrees different in comparison with the App Service Plan. With the Consumption Plan you don't need an App Service Plan, VMs, or anything else. You just deploy your code and run it. You pay for each time your function runs.
There are two types of units for which you'll have to pay:

  • Resource Consumption - a measure of how many resources your function consumed, calculated based on how much memory you used during the time your functions were running (measured in GB-s)
  • Execution - the number of times a function is triggered. The price is low: for 1 million executions you'll pay €0.1687
Similar to WebJobs, the service is free until you reach a specific limit. Azure Functions are free for the first 1 million executions and for the first 400,000 GB-s consumed (Resource Consumption).
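As a back-of-the-envelope sketch of how the two meters combine, using the rates and free grants quoted above (which may change); the workload numbers are invented:

```javascript
// Rough Consumption Plan bill for one month.
const executions = 3000000;   // total invocations this month (invented)
const avgSeconds = 0.5;       // average duration per invocation (invented)
const memoryGb = 0.125;       // 128 MB of memory, expressed in GB

// Resource Consumption is measured in GB-s: memory * execution time
const gbSeconds = executions * avgSeconds * memoryGb;          // 187,500 GB-s

// The first 400,000 GB-s and the first 1,000,000 executions are free
const billableGbSeconds = Math.max(0, gbSeconds - 400000);     // 0 - under the free grant
const billableExecutions = Math.max(0, executions - 1000000);  // 2,000,000

// Execution meter: EUR 0.1687 per million executions
const executionCost = (billableExecutions / 1000000) * 0.1687; // EUR 0.3374

console.log(gbSeconds, billableGbSeconds, executionCost);
```

In this scenario the resource consumption stays inside the free grant, so only the execution meter is billed.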
The Consumption Plan will scale automatically by monitoring the triggers. Based on the load of a queue or other triggers, the system can decide whether it is necessary to scale up or down. For triggers that don't offer a count (size) - like a generic webhook - this might create some problems with scaling.
For situations when we know that we need to scale fast, we should try to use trigger sources that also offer a counter that can be monitored by Azure Functions.

Monitoring - Function Pulse
A basic monitoring system is supported out of the box. Information like the invocation history, with the last status of each run, is available. All the information that we log in a function using TraceWriter can be accessed, and the same goes for exceptions.
Besides this, a live stream is available, where we can see in real time which functions run and what their output is. This output can be redirected to a PowerShell or console app, or to any other location that we want.

Testing
Azure Functions can be tested easily. Based on a well-known URL you can trigger your function (as long as you provide all the inputs that are required to run it) - this is applicable for functions that have a webhook or an HTTP request as trigger. Depending on your trigger, you'll need to adapt your tests.

Remote Debugging
As I presented in a previous post, remote debugging is fully supported from Visual Studio. You can easily attach to your functions, catch errors, and add breakpoints. Life is simpler now.

CORS Support 
By default, this feature is disabled. From the configuration panel, we are allowed to activate it and specify the list of domains that are allowed to make calls.

Threading
Each function runs single-threaded. If multiple calls happen in parallel, the hosting plan can decide to run functions in parallel. This is much better in comparison with AWS Lambda, which executes only one function at a time. The parallelization level can vary based on the consumption level and the trigger type.

Conclusion
From the features perspective, Azure Functions is a powerful service that enables us to develop and run code without thinking about infrastructure. Going forward, this service is a game changer that you need to take into consideration when you design a solution on top of Azure.
