
How to run Azure Functions on AWS and on-premises

Nowadays, cloud and hybrid cloud solutions are in high demand. In this post, we will talk about a technical solution for business logic that is supported by multiple cloud providers and can run on-premises without any issues.
The requirements for the given scenario are the following:
We need to expose an API that can execute tasks that run for 2-5 minutes each, in parallel. The native platform shall be Microsoft Azure, but the solution should be able to run on dedicated hardware in specific countries like China and Russia. With minimal effort, it should also be able to run on the AWS platform.

The team we had available was a .NET Core team with good ASP.NET skills. There are numerous Azure services that can also run in other environments; the most attractive are the ones built on top of Kubernetes and microservices.
Even so, we decided to do things a little differently. We had to take into consideration that autoscaling would be important. In addition, the team needed to deliver the first version on top of Azure, its Kubernetes capabilities were limited, and the delivery timeline was strict.

Solution
Taking this into consideration, we decided to go with a two-step approach. The first version of the solution would be built on top of Azure Functions 2.x, fully hosted on Microsoft Azure, using .NET Core 2.x.
To deliver what is required, the following bindings need to be used (a minimal sketch of how they fit together follows the list):

  • HTTP, which plays the role of the trigger. External systems use this binding to kick off the execution.
  • Blob Storage, which plays the role of data input. The data is delivered in JSON format by the external system and loaded into blob storage before execution starts.
  • Webhooks, which play the role of data output. An external system is notified and the result (scoring information) is provided to it.
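The sketch below shows one possible way to wire these bindings together in a Functions 2.x C# project. The route, the blob container name, the WebhookUrl application setting, and the RunScoringAsync helper are illustrative assumptions, not part of the final design; the webhook notification is done with a plain HttpClient call rather than a dedicated output binding.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ScoringFunction
{
    // Reused HttpClient instance for the webhook call-back.
    private static readonly HttpClient WebhookClient = new HttpClient();

    [FunctionName("ExecuteScoring")]
    public static async Task<IActionResult> Run(
        // HTTP binding: the trigger that external systems call to kick off the execution.
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "scoring/{taskId}")] HttpRequest req,
        // Blob Storage binding: the JSON payload uploaded by the external system before the call.
        [Blob("tasks/{taskId}.json", FileAccess.Read)] string inputJson,
        string taskId,
        ILogger log)
    {
        log.LogInformation("Starting scoring task {TaskId}", taskId);

        // Hypothetical placeholder for the 2-5 minute business logic.
        string scoringResult = await RunScoringAsync(inputJson);

        // Webhook as data output: notify the external system with the scoring information.
        string webhookUrl = Environment.GetEnvironmentVariable("WebhookUrl");
        await WebhookClient.PostAsync(webhookUrl,
            new StringContent(scoringResult, Encoding.UTF8, "application/json"));

        return new AcceptedResult();
    }

    private static Task<string> RunScoringAsync(string inputJson) =>
        Task.FromResult("{\"score\": 0.87}"); // stub for illustration
}
```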

The power of Azure Functions is revealed at this moment. The team can focus on implementation, with minimal effort spent on the infrastructure part. Initially, the solution will be hosted using the Consumption Plan, which enables us to scale automatically and pay per use.
Even if it might be a little more expensive than the App Service Plan, where dedicated resources are allocated, the Consumption Plan is a good starting point. Later, based on the consumption level, the solution may or may not be migrated to the App Service Plan.
Execution Time
When you are using the Consumption Plan, a function can run for a maximum of 30 minutes. The initial estimate of a task's duration is 2-5 minutes, so there is a medium risk that the 30-minute limit will be reached. To mitigate this risk, during the implementation phase the team will execute stress tests with real data to estimate the task duration in the Azure Functions context.
In addition, every week a custom report will be generated with different KPIs related to execution times. A scatter chart, combined with a clustering chart, should be more than enough. The reports are generated inside Power BI.
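As a sketch of how that raw duration data could be collected, the function below wraps the task in a Stopwatch and writes one structured log entry per execution. ExecuteScoringTaskAsync is a hypothetical placeholder, and the assumption is that the log entries end up in a telemetry store (for example Application Insights) from which Power BI can read them.

```csharp
using System.Diagnostics;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ScoringTaskTiming
{
    [FunctionName("RunScoringTaskTimed")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "timed/{taskId}")] HttpRequest req,
        string taskId,
        ILogger log)
    {
        var stopwatch = Stopwatch.StartNew();

        await ExecuteScoringTaskAsync(taskId); // hypothetical business logic (2-5 minutes)

        stopwatch.Stop();
        // One structured entry per execution; these values feed the scatter chart in the weekly report.
        log.LogInformation("Task {TaskId} finished in {DurationSeconds} seconds",
            taskId, stopwatch.Elapsed.TotalSeconds);

        return new OkObjectResult(new { taskId, durationSeconds = stopwatch.Elapsed.TotalSeconds });
    }

    private static Task ExecuteScoringTaskAsync(string taskId) => Task.Delay(100); // stub for illustration
}
```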
The mitigation plan for this is to migrate to the App Service Plan, where you can control what kind of resources are available and allocate dedicated resources only for this workload. On top of this, in the App Service Plan you don't have a timeout limitation; you can run a function as long as you want.
There are also other mechanisms to mitigate it, like code optimization or splitting the execution into multiple functions (sketched below), but this will be decided later on, only if there are problems with the timeout.
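If splitting the execution turns out to be necessary, one possible shape of it is the queue-chained sketch below: each stage does part of the work and hands the task over to the next stage through a Storage queue, so no single invocation runs long enough to hit the timeout. The queue names and the RunFirstHalfAsync/RunSecondHalfAsync helpers are illustrative assumptions.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class SplitExecution
{
    // First stage: does part of the work, then hands over to the next stage
    // through a queue message, so no single function execution approaches the timeout.
    [FunctionName("ScoringStage1")]
    public static async Task Stage1(
        [QueueTrigger("scoring-stage1")] string taskId,
        [Queue("scoring-stage2")] IAsyncCollector<string> nextStage,
        ILogger log)
    {
        log.LogInformation("Stage 1 for task {TaskId}", taskId);
        await RunFirstHalfAsync(taskId);   // hypothetical first half of the work
        await nextStage.AddAsync(taskId);  // continue in a separate execution
    }

    [FunctionName("ScoringStage2")]
    public static async Task Stage2(
        [QueueTrigger("scoring-stage2")] string taskId,
        ILogger log)
    {
        log.LogInformation("Stage 2 for task {TaskId}", taskId);
        await RunSecondHalfAsync(taskId);  // hypothetical second half of the work
    }

    private static Task RunFirstHalfAsync(string taskId) => Task.Delay(100);  // stub for illustration
    private static Task RunSecondHalfAsync(string taskId) => Task.Delay(100); // stub for illustration
}
```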
Remarks: the Azure Functions 1.x execution timeout for the Consumption Plan is 10 minutes, in comparison with Azure Functions 2.x, where the maximum accepted value is 30 minutes.
On-premises support
The current Azure Functions support for on-premises systems, if you want a stable system that can run high loads, is a Kubernetes cluster. The cool thing that Microsoft offers us is the ability to run Azure Functions inside a Kubernetes cluster as a Docker image.
The tool available on the market at this moment allows us to create, from an Azure Function, a Docker image that is already configured with a Horizontal Pod Autoscaler. This enables us, without any custom configuration, to have an Azure Function hosted inside Kubernetes as a container that scales automatically based on load. Besides this, the deployment and service configuration is also generated.
The tool that allows us to do this is called Azure Functions Core Tools and it is developed by the Azure Functions team. Because it is a command-line tool, it can easily be integrated with the CI/CD systems we already have in place.

AWS Environment
The same solution as for on-premises can be used to host our Azure Functions inside AWS EKS or any other service based on Kubernetes.

The official support in Core Tools allows us to create Docker images and deploy them to Kubernetes using/on top of:

  • Virtual-Kubelet
  • Knative
  • Kubectl
  • AKS (Azure Kubernetes Services)
  • ACR (Azure Container Registry) – Image hosting
  • ACS (Azure Container Services)
  • AWS EKS

Azure Functions Core Tools is available for download from:

  • GitHub – https://github.com/Azure/azure-functions-core-tools
  • npm – azure-functions-core-tools
  • choco – azure-functions-core-tools
  • Brew – azure-functions-core-tools
  • Ubuntu - https://packages.microsoft.com/config/ubuntu/XX.XX/packages-microsoft-prod.deb 


Conclusion
As we can see, at this moment Azure Functions does not lock us into running our solution only inside Azure. We have the ability to take our functions and spin them up as a console application or even inside Kubernetes. Azure Functions Core Tools enables us to create Docker images and run them inside Kubernetes in any kind of environment.
