
Monitor cost consumption across Azure Subscriptions at Resource Group level

In this post we'll look at different mechanisms that can be used to monitor Azure consumption across subscriptions. We will review the out-of-the-box solutions that are currently available, their limitations, and a custom solution that could be used for this purpose.

Define the problem
We need a mechanism that can report consumption at Resource Group level. More precisely, we need to be able to compute the total cost of all Resource Groups, from all subscriptions, that share the same characteristic (let's say a specific tag).
The Azure Subscriptions that need to be monitored are spread across multiple EAs (Azure Enterprise Agreements), and some of them are not under an EA account at all.
Out-of-the-box solutions
Microsoft Azure Enterprise content pack for Power BI
At the moment this post was written, Microsoft offers full Power BI integration for all Azure Subscriptions that are under the same EA - the Microsoft Azure Enterprise content pack for Power BI.
This pack allows users to bring cost and consumption data into Power BI.
A dashboard is automatically added that gives us an overview of consumption, with the ability to drill down into each piece. The nice thing is that we can see consumption across subscriptions that are under the same EA, which gives us a report covering all Resource Groups under that EA.
What is missing is an overview across subscriptions from different EAs, as well as subscriptions that are not under an EA at all. This could be solved by extending the Azure Usage and Billing Portal sample. More information about this solution is presented later in this post.

Azure Usage and Billing Portal
This is a great sample that gives us the ability to consolidate information related to Azure consumption under the same Power BI source. From that point we can look at our data from any perspective, including grouping cost per Resource Groups with the same characteristics across subscriptions.
The solution provided by this sample is a good starting point, but some enhancements need to be made to it.

3rd party offers
After reviewing what is available on the market, I was not able to find a vendor that offers this capability without additional customization.

Proposed solution
The solution that I would recommend builds on the core ideas of the 'Azure Usage and Billing Portal' sample and extends it using tags. The tags give Power BI the hints it needs to aggregate the consumption of Resource Groups that carry a specific tag.

At tags level, we have two options:
  1. Add tags at Resource Group level
  2. Add tags on each resource under the Resource Groups that need to be monitored
The downside of the first option is the complexity of the Power BI report query: it would need to find all Resource Groups with that specific tag and then, in a second step, find all resources under those Resource Groups.
At this moment I see the second option as simpler and easier to implement, even if it requires adding tags to each resource. Tags are available in the resource consumption output and can be used directly by Power BI queries to aggregate data across subscriptions.
I would add to each resource under the relevant Resource Groups a special tag that identifies what type of Resource Group it is part of (Processing, Virtualization, Data Analytics, ML, Data Crunch and so on).
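A minimal sketch of that tagging step, assuming a hypothetical `resourceGroupType` tag name and plain dictionaries standing in for the resource objects an Azure SDK would return (in a real deployment this would go through the Azure SDK or an ARM template):

```python
# Sketch: merge a 'resourceGroupType' tag into each resource's tag set.
# The tag name and group-type values are assumptions for illustration.

def apply_group_type_tag(resources, group_type, tag_name="resourceGroupType"):
    """Return a new list of resources with the group-type tag merged in.

    Existing tags are preserved; an existing value for tag_name is
    overwritten so that all resources in the group end up consistent.
    """
    tagged = []
    for resource in resources:
        tags = dict(resource.get("tags") or {})
        tags[tag_name] = group_type
        tagged.append({**resource, "tags": tags})
    return tagged


if __name__ == "__main__":
    resources = [
        {"name": "vm-proc-01", "tags": {"env": "prod"}},
        {"name": "storage-proc", "tags": None},
    ]
    for r in apply_group_type_tag(resources, "Processing"):
        print(r["name"], r["tags"])
```

The point of merging rather than replacing tags is that existing operational tags (environment, owner, cost center) stay untouched.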


Design of the Azure Usage monitoring solution
The Azure Usage and Billing sample available on GitHub is pretty nice. It's a good starting point and can be used successfully as-is.
Taking a look over it, there are some small changes I would make to better fit our problem. Below are the things I would change:
1. An Azure Function that is triggered once per day and sends consumption requests to the queue
2. An Azure Function that listens to the queue, extracts consumption data from the subscriptions, and pushes it to Azure SQL
3. Replace the dashboard used to add subscriptions with an ARM template that is triggered automatically each time a deployment is done (in our context this is possible, since we have full control over which ARM scripts run)
4. Expose an Azure API App that allows the caller to register a subscription that needs to be monitored. This API is protected by an authentication mechanism (Azure AD integration)
5. Challenge whether Azure SQL is really needed or Azure Tables would be enough (running cost vs. implementation cost -> Azure SQL wins)
6. Remove the Web App dashboard used to display consumption information and allow only Power BI users to see reports. In our case this is enough, especially because the number of users is under 5 and they may want the possibility to generate different reports or views.
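The fan-out in steps 1 and 2 can be sketched as a pure function that turns the list of registered subscriptions into queue messages. The message shape (subscription id plus a billing day) is an assumption; in a real timer-triggered Azure Function the list would come from Azure SQL and the messages would go to an Azure Storage Queue:

```python
import json
from datetime import date

# Sketch of the daily fan-out: one queue message per registered subscription.
# The message fields are illustrative assumptions, not a fixed contract.

def build_consumption_requests(subscriptions, billing_day=None):
    """Return one JSON-encoded queue message per subscription."""
    billing_day = billing_day or date.today().isoformat()
    return [
        json.dumps({"subscriptionId": sub["id"], "billingDay": billing_day})
        for sub in subscriptions
    ]


if __name__ == "__main__":
    subs = [{"id": "sub-001"}, {"id": "sub-002"}]
    for msg in build_consumption_requests(subs, billing_day="2017-05-01"):
        print(msg)
```

Keeping one message per subscription means a failure while fetching one subscription's data does not block the others; the queue trigger simply retries that one message.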

The number of Azure Subscriptions that will be monitored is around 60-65, with 10-15 types of Resource Groups that need to be monitored, providing cost information for each type of Resource Group at a world-wide level.

The above diagram presents an overview of the components that are involved, including the basic interaction between them.

  • ARM Script: Contains a step that registers the Azure Subscription by calling the REST API
  • Azure API App: Exposes a REST API used to register an Azure Subscription that needs to be monitored. Subscription information is persisted in Azure SQL.
  • Add Consumption Request: An Azure Function that runs once per day and fetches data from the table that contains the list of subscriptions. For each subscription, a separate message is pushed to a Storage Queue.
  • Get Consumption Information: An Azure Function that reads consumption information from a subscription and persists it in Azure SQL. This function's trigger is connected directly to the Storage Queue.
  • Reports Power BI: The Power BI dashboard with the queries that allow users to see information across subscriptions.
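The 'Get Consumption Information' step essentially boils down to flattening usage records into rows for Azure SQL. Here is a sketch under the assumption that each usage record already carries the resource tags; the field names (`resourceGroup`, `cost`, `tags`) are illustrative, not the real Azure billing payload schema:

```python
# Sketch: shape raw usage records into flat rows for Azure SQL.
# Field names are illustrative assumptions; the real Azure usage/billing
# payload has a different, richer schema.

def to_sql_rows(subscription_id, usage_records, tag_name="resourceGroupType"):
    """Flatten usage records into (subscription, group, type, cost) rows."""
    rows = []
    for record in usage_records:
        tags = record.get("tags") or {}
        rows.append({
            "subscriptionId": subscription_id,
            "resourceGroup": record["resourceGroup"],
            "groupType": tags.get(tag_name, "untagged"),
            "cost": float(record["cost"]),
        })
    return rows
```

Resources without the special tag are bucketed as "untagged" rather than dropped, so the totals in Power BI still reconcile with the invoice.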

Final conclusion
Yes, a certain amount of effort needs to be invested to create such a system, but once you have a system that pushes Azure consumption information to Azure SQL, any kind of report or visualization can be created.
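As a hint of what such a report boils down to, here is a minimal aggregation over the flattened rows, grouping cost by the (assumed) group-type tag across subscriptions; in the proposed solution this lives as a Power BI query over Azure SQL:

```python
from collections import defaultdict

# Sketch: total cost per Resource Group type across all subscriptions.
# The row shape is an illustrative assumption matching the earlier tag idea.

def cost_per_group_type(rows):
    totals = defaultdict(float)
    for row in rows:
        totals[row["groupType"]] += row["cost"]
    return dict(totals)


if __name__ == "__main__":
    rows = [
        {"subscriptionId": "sub-001", "groupType": "Processing", "cost": 12.5},
        {"subscriptionId": "sub-002", "groupType": "Processing", "cost": 7.5},
        {"subscriptionId": "sub-002", "groupType": "ML", "cost": 3.0},
    ]
    print(cost_per_group_type(rows))  # {'Processing': 20.0, 'ML': 3.0}
```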
