
How to run Azure Functions on AWS and on-premises

Cloud and hybrid cloud solutions are in high demand nowadays. In this post, we will look at a technical solution for business logic that is supported by multiple cloud providers and can also run on-premises without any issues.
The requirements for the given scenario are the following:
We need to expose an API that can execute tasks that run for 2-5 minutes each, in parallel. The native platform shall be Microsoft Azure, but the solution should also be able to run on dedicated hardware in specific countries like China and Russia. With minimal effort, it should be able to run on the AWS platform.

The team we had available was a .NET Core team with good ASP.NET skills. There are numerous Azure services that can also run in different environments; the most attractive are the ones built on top of Kubernetes and microservices.
Even so, we decided to do things a little differently. We had to take into consideration that autoscaling would be important. In addition, the team needed to deliver the first version on top of Azure, the team's Kubernetes skills were limited, and the delivery timeline was strict.

Solution
Taking this into consideration, we decided on a two-step approach. The first version of the solution would be built on top of Azure Functions 2.x, fully hosted on Microsoft Azure, using .NET Core 2.x as the programming platform.
To deliver what is required, the following bindings are used:

  • HTTP, which plays the role of the trigger. External systems use this binding to kick off the execution.
  • Blob Storage, which plays the role of data input. The chunk of data is delivered in JSON format by the external system and loaded into blob storage before execution starts.
  • Webhooks, which play the role of data output. An external system is notified and the result (scoring information) is provided.
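As a rough sketch, such a function app could be scaffolded with Azure Functions Core Tools; the project and function names below are hypothetical:

```shell
# Hypothetical names; requires Azure Functions Core Tools 2.x and the .NET Core SDK
func init TaskRunner --worker-runtime dotnet          # scaffold a .NET Core function app
cd TaskRunner
func new --name StartTask --template "HTTP trigger"   # the HTTP binding acts as the trigger
# The Blob Storage input and the webhook (outbound HTTP) output are then wired up
# per function, e.g. via [Blob(...)] attributes in the C# function signature.
```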

The power of Azure Functions is revealed at this moment. The team can focus on the implementation, with minimal effort spent on the infrastructure part. Initially, the solution will be hosted using the Consumption Plan, which scales automatically and is billed per usage.
Even if it might end up a little more expensive than the App Service Plan, where dedicated resources are allocated, the Consumption Plan is a good starting point. Later, based on the consumption level, the solution might or might not be migrated to the App Service Plan.
Execution Time
When you are using the Consumption Plan, a function can run for a maximum of 10 minutes. The initial estimate of a task's duration is 2-5 minutes, so there is a medium risk that the 10-minute limit will be reached. To mitigate this risk, during the implementation phase the team will run stress tests with real data to estimate the task duration in the Azure Functions context.
In addition, a custom report will be generated every week covering different KPIs related to execution times. A scatter chart combined with a clustering chart should be more than enough. The reports are generated in Power BI.
The mitigation plan is to migrate to the App Service Plan, where you can control what kind of resources are available and allocate dedicated resources just for this workload. On top of this, in the App Service Plan the timeout defaults to 30 minutes and can be removed entirely, so a function can run as long as needed.
There are also other mechanisms to mitigate this, such as code optimization or splitting the execution into multiple functions, but that will be decided later if the timeout actually becomes a problem.
Remark: in the Consumption Plan the execution timeout defaults to 5 minutes for both Azure Functions 1.x and 2.x; the maximum accepted value is 10 minutes.
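As a sketch, the timeout can be raised to the plan's maximum through the functionTimeout setting in host.json (Functions 2.x schema; note that per the Azure documentation the Consumption Plan caps this value at 10 minutes, and longer runs require the App Service Plan):

```shell
# Write a host.json that raises the Consumption Plan timeout to its 10-minute maximum
cat > host.json <<'EOF'
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
EOF
```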
On-premises support
The currently supported way to run Azure Functions on-premises, if you want a stable system that can handle high loads, is a Kubernetes cluster. The nice thing Microsoft offers us is the ability to run Azure Functions inside a Kubernetes cluster as a Docker image.
The tooling available on the market at this moment allows us to turn an Azure Function into a Docker image that comes already configured with a Horizontal Pod Autoscaler. This enables us, without any custom configuration, to host an Azure Function inside Kubernetes as a container that scales automatically based on load. Besides this, the Deployment and Service configuration is also generated.
The tool that allows us to do this is called Azure Functions Core Tools, and it is built by the Azure Functions team. Because it is a command-line tool, it can easily be integrated with the CI/CD systems we already have in place.
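The workflow looks roughly like this (app and registry names are placeholders, and the exact subcommands vary slightly between Core Tools versions):

```shell
# Placeholders throughout; requires Core Tools 2.x, Docker, and kubectl
func init MyFunctionApp --worker-runtime dotnet --docker   # also generates a Dockerfile
cd MyFunctionApp
# Build, push, and deploy; --dry-run prints the generated Deployment/Service
# manifests (including the autoscaling configuration) instead of applying them
func kubernetes deploy --name my-function-app --registry <docker-id> --dry-run > deploy.yml
kubectl apply -f deploy.yml
```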

AWS Environment
The same solution as for on-premises can be used to host our Azure Functions on AWS EKS or on any other Kubernetes-based service.

Official Core Tools support allows us to create Docker images and deploy them to Kubernetes using/on top of:

  • Virtual-Kubelet
  • Knative
  • Kubectl
  • AKS (Azure Kubernetes Services)
  • ACR (Azure Container Registry) – Image hosting
  • ACS (Azure Container Services)
  • AWS EKS
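On AWS the flow stays the same; only the kubectl context changes. A sketch with placeholder cluster and registry names:

```shell
# Placeholders; assumes the AWS CLI is configured and the EKS cluster already exists
aws eks update-kubeconfig --name my-cluster --region eu-west-1   # point kubectl at EKS
func kubernetes deploy --name my-function-app --registry <ecr-repository-uri>
```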

Azure Functions Core Tools is available for download from:

  • Github - https://github.com/Azure/azure-functions-core-tools
  • npm- azure-functions-core-tools
  • choco – azure-functions-core-tools
  • Brew - azure-functions-core-tools
  • Ubuntu - https://packages.microsoft.com/config/ubuntu/XX.XX/packages-microsoft-prod.deb 
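For example, installation through npm (assuming Node.js is available):

```shell
npm install -g azure-functions-core-tools   # installs the `func` CLI globally
func --version                              # verify the installation
```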


Conclusion
As we can see, Azure Functions does not lock us into running our solution only inside Azure. We can take our functions and spin them up as a console application or even inside Kubernetes. Azure Functions Core Tools enables us to create Docker images and run them inside Kubernetes in any kind of environment.
