Azure Front Door custom domain quota limit and solutions

When you hit the quotas of Azure services, you need to roll up your sleeves and go back to the drawing board.

Business context

A company has around 20-50 products that are available in the EU, APAC and all US states. Each product can have around 5-15 different presentation websites (each with its own custom domain) in each country.

Technical constraint

The customer is using Azure, and one of the technical objectives is to use only Azure services, without any 3rd-party providers. Azure Front Door sits in front of their API; it maps all the custom domain redirects to the main domain and manages the security rules. The security rules change often (at least 3-4 rules per day), making the WAF component of Azure Front Door a goldmine for the operational team.

US: 20 products X 10 custom domains X 50 US states = 10,000 websites and 10,000 custom domains

EU: 20 products X 10 custom domains X 10 EU countries = 2,000 websites and 2,000 custom domains

The reality is that we do not have 12,000 custom domains in the EU and the US. For now, there are around 900, and the forecast for the next 3 years is up to 2,000 custom domains in total.

The current quota for custom domains on Azure Front Door (and the rest of the relevant Azure services) is 500 per instance. This is a hard limit and cannot be raised. To handle 900, or eventually 2,000, custom domains we need to find another way to map the domains.
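Given the hard 500-per-instance quota, the number of instances needed for a given domain count is a simple ceiling division. A minimal sketch (the quota value is the one stated above):

```python
import math

CUSTOM_DOMAIN_QUOTA = 500  # hard per-instance limit on custom domains

def instances_needed(domain_count: int, quota: int = CUSTOM_DOMAIN_QUOTA) -> int:
    """Smallest number of Front Door instances that can host domain_count custom domains."""
    return math.ceil(domain_count / quota)

print(instances_needed(900))   # current estate -> 2
print(instances_needed(2000))  # 3-year forecast -> 4
```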

Possible solutions

The top 3 options that are easy to implement are:

  1. Spin up Azure VMs or a containerization solution and install a reverse proxy (e.g. Nginx). Use Azure Front Door for the main domain and for all security rules (WAF)
    • Pros: easy to configure
    • Cons: scalability, availability, and the risk of still reaching the 500-custom-domain limit
  2. Use a 3rd-party provider in front for the custom domain redirects and keep Azure Front Door for the main domain and for all security rules (WAF)
    • Pros: low operational cost
    • Cons: a 3rd-party service that is not part of the Azure portfolio
  3. Use multiple Azure Front Door instances: one main instance for the main domain and all security rules (WAF), plus additional instances in front of it used only for custom domain redirects
    • Pros: easy to manage, and all services are part of the Azure portfolio
    • Cons: multiple Azure Front Door instances add an extra layer of complexity
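To make option 3 concrete, here is a hypothetical sketch of how the custom domains could be partitioned across the redirect-only Front Door instances in batches of at most 500. The instance names and domain list are illustrative placeholders, not from the source:

```python
from typing import Dict, List

QUOTA = 500  # custom domains per Azure Front Door instance (hard limit)

def assign_domains(domains: List[str], quota: int = QUOTA) -> Dict[str, List[str]]:
    """Partition custom domains across redirect-only Front Door instances.

    Instance names like 'afd-redirect-1' are hypothetical placeholders.
    """
    plan: Dict[str, List[str]] = {}
    for i in range(0, len(domains), quota):
        plan[f"afd-redirect-{i // quota + 1}"] = domains[i:i + quota]
    return plan

# Illustrative list: 900 synthetic domain names, matching the current estate size.
domains = [f"product{n % 20}-site{n}.example.com" for n in range(900)]
plan = assign_domains(domains)
print({name: len(batch) for name, batch in plan.items()})
# {'afd-redirect-1': 500, 'afd-redirect-2': 400}
```

Each resulting batch stays under the per-instance quota, so adding capacity is just appending another redirect-only instance to the plan.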

Option no. 3, with multiple instances of Azure Front Door, looks to be the winner: easy to configure, scalable, and with a cost that is easy to predict. For example, for 900 custom domains, we need:
  • 1 X Premium instance for the main domain and WAF / €139
  • 2 X Standard instances for the custom domain mapping / €29.4 (2 X €14.7)
For 2,000 custom domains, we only need to increase the number of Standard instances:
  • 1 X Premium instance for the main domain and WAF / €139
  • 4 X Standard instances for the custom domain mapping / €58.8 (4 X €14.7)
Another advantage of this solution is the segregation of roles: the role that manages the WAF and security rules is different from the one that manages the custom domains. You can even extend the concept and have Azure Front Door instances per region for the custom domains, fully managed by the local teams.
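The monthly base cost of this layout can be sketched as a small model. The per-instance prices are the ones quoted above; this is a rough estimate only, as it excludes traffic and request charges, which vary with usage:

```python
import math

PREMIUM_EUR = 139.0   # 1 Premium instance: main domain + WAF
STANDARD_EUR = 14.7   # per Standard instance: custom domain mapping
QUOTA = 500           # custom domains per instance (hard limit)

def monthly_base_cost(custom_domains: int) -> float:
    """Base monthly cost: one Premium instance plus enough Standard instances."""
    standard_instances = math.ceil(custom_domains / QUOTA)
    return PREMIUM_EUR + standard_instances * STANDARD_EUR

print(round(monthly_base_cost(900), 1))   # 139 + 2 x 14.7 = 168.4
print(round(monthly_base_cost(2000), 1))  # 139 + 4 x 14.7 = 197.8
```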
