
Azure Functions - Features Review

In the last two posts related to Azure Functions we talked about how we can write functions that process images from OneDrive, and about the current integration with Visual Studio and Continuous Integration.
In this post we will take a look at the main features and the pricing model.

Pay only for what you use
Personally, this is one of the strongest points of Azure Functions. You pay only for what you use; when your code is not running and is not called by an external trigger, you don't pay anything. You cannot ask for more from a hosting perspective.
When you have zero clients, you pay zero. When you have n clients, you pay for each of them. The pricing model is described later on.

Multiple-languages support
The list of supported languages is not limited to one or two. At this moment there is support for C#, F#, JavaScript (via Node.js), PHP, Python, Batch and Bash.
Another interesting feature is that you can bring any executable and run it as an Azure Function. An interesting concept, which I promise to follow up on in the future.

Source Control and CI
There is full integration with TFS and Git. You can use triggers to deploy automatically from different branches. You can even deploy the functions from OneDrive or Dropbox.

External Dependencies
I was surprised when I discovered that I can use NuGet (or npm) packages inside an Azure Function. In this way you can use predefined libraries.

SaaS Integration fully supported
Even if the naming is not clear, translated it means: integration and hooks with external services that may or may not be part of the Azure platform.
For example, we have integration with Azure Event Hub, DocumentDB, Storage (object storage :) ) and so on, but also with external service providers like Google Drive.

The list of triggers is pretty long, from timer triggers to message or GitHub triggers. The one that I like most is the webhook trigger, which enables us to integrate Azure Functions with our own services.

There is full support for OAuth, with out-of-the-box integration with Facebook, Google, Twitter, Azure AD and Microsoft Accounts.

Service Plans
There are two types of service plans available for Azure Functions.

App Service Plan
The classical one - App Service Plan - is the one that we already know and are used to from Web Apps. When you select this plan, your Azure Functions will run in your own App Service Plan and will use the resources that are available in it.
Azure Functions will scale automatically within your App Service Plan as long as there are enough resources. It is the right place to run your functions when the load on them is constant or when you want resource reservation.
The downside with this service plan is that you will pay for it even if you don't run the functions. This happens because with an App Service Plan you pay for the VMs that you have in the cluster.
This plan is very appealing if, besides Azure Functions, you have Web Apps and other applications running. You could put all of them inside the same App Service Plan; it can be a great solution for systems where the load is constant and a good load forecast can be made.

Consumption Plan
The mindset is 180 degrees different in comparison with the App Service Plan. For the Consumption Plan you don't need an App Service Plan, VMs or anything else. You just deploy your code and run it. You pay each time your function runs.
There are two types of units for which you'll have to pay:

  • Resource Consumption - a measure of how many resources your function consumed, calculated based on how much memory you used during the time your functions were running (measured in GB-s)
  • Execution - the number of times a function is triggered. The price is low: for 1 million executions you'll pay €0.1687
Similar to Web Jobs, the service is free until you reach a specific limit. For Azure Functions, the service is free for the first 1 million executions and for the first 400,000 GB-s consumed (Resource Consumption).
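The two billing units can be combined into a rough monthly cost estimate. The sketch below takes the free grants and the execution price from the numbers above; the per-GB-s rate is left as a parameter, because the post does not quote it and it varies by region and currency:

```javascript
// Rough sketch of a Consumption Plan bill for one month.
// Free grants and the execution price come from the text above; the
// per-GB-s rate is a parameter because it is not quoted here.
function monthlyCost(executions, gbSeconds, pricePerGbs) {
    const FREE_EXECUTIONS = 1000000;             // first 1 million executions are free
    const FREE_GB_SECONDS = 400000;              // first 400,000 GB-s are free
    const PRICE_PER_MILLION_EXECUTIONS = 0.1687; // EUR, from the text

    const billedExecutions = Math.max(0, executions - FREE_EXECUTIONS);
    const billedGbs = Math.max(0, gbSeconds - FREE_GB_SECONDS);

    return (billedExecutions / 1000000) * PRICE_PER_MILLION_EXECUTIONS
         + billedGbs * pricePerGbs;
}

module.exports = monthlyCost;
```

With 500,000 executions and 100,000 GB-s in a month, everything stays inside the free grants and the bill is zero; the meter only starts once a grant is exceeded.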
The Consumption Plan will scale automatically by monitoring the triggers. Based on the load of a queue or other triggers, the system can decide whether it is necessary to scale up or down. For triggers that don't offer a count (size) - like a generic webhook - this might create some problems with scaling.
For situations when we know that we need to scale fast, we should try to use trigger sources that also offer a counter that can be monitored by Azure Functions.

Monitoring - Function Pulse
A basic monitoring system is supported out of the box. Information like the invocation history, with the last status of each run, is available. All the information that we log in a function using TraceWriter can be accessed, and the same goes for exceptions.
Besides this, a live stream is available, where we can see in real time which functions are running and what their output is. This output can be redirected to a PowerShell or console app, or to any other location we want.

Testing
Azure Functions can be tested easily. Based on a well-known URL you can trigger your function (as long as you provide all the inputs that are required to run it) - this is applicable for functions that have a webhook or an HTTP request as trigger. Based on your trigger, you'll need to adapt your tests.
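For HTTP and webhook triggers, the well-known URL typically has the shape `https://<app>.azurewebsites.net/api/<function>?code=<key>`. The helper below builds such a test URL; the app name, function name and key used in the test values are placeholders, not real endpoints:

```javascript
// Builds the invocation URL of an HTTP-triggered function so that a test
// can call it. The app name, function name and key are supplied by the
// caller - the values used below are placeholders for illustration.
function functionUrl(appName, functionName, key, params) {
    let url = 'https://' + appName + '.azurewebsites.net/api/' + functionName
            + '?code=' + encodeURIComponent(key);

    // Append any extra query parameters the function expects as input.
    for (const name in (params || {})) {
        url += '&' + encodeURIComponent(name) + '=' + encodeURIComponent(params[name]);
    }
    return url;
}

module.exports = functionUrl;
```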

Remote Debugging
As I presented in a previous post, remote debugging is fully supported from Visual Studio. You can easily attach to your functions, catch errors and add breakpoints. Life is simpler now.

CORS Support 
By default, this feature is disabled. From the configuration panel, we can activate it and specify the list of domains that are allowed to make calls.

Parallel Execution
Each function runs single-threaded. If multiple calls happen in parallel, the hosting plan can decide to run functions in parallel. This is much better in comparison with AWS Lambda, which executes only one function at a time. The parallelization level can vary based on the consumption level and trigger type.

From the features perspective, Azure Functions is a powerful service that enables us to develop and run code without thinking about infrastructure. Going forward, this service is a game changer that you need to take into consideration when you design a solution on top of Azure.

