
Monitor cost consumption across Azure Subscriptions at Resource Group level

In this post we'll look at different mechanisms that can be used to monitor Azure consumption across subscriptions. We will take a look at the out-of-the-box solutions that are currently available, their limitations, and a custom solution that could be used for this purpose.

Define the problem
We need a mechanism that can offer information about Resource Group consumption. We need to be able to provide the total cost of all Resource Groups from all subscriptions that share the same characteristic (let's say a specific tag).
The Azure Subscriptions that need to be monitored are under multiple EAs (Azure Enterprise Agreements), and some of them are not under an EA at all.
Out-of-the-box solutions
Microsoft Azure Enterprise content pack for Power BI
At the moment this post was written, Microsoft offers, for all Azure Subscriptions under the same EA, full integration with Power BI - the Microsoft Azure Enterprise content pack for Power BI.
This pack allows users to bring cost and consumption data into Power BI.
A dashboard is automatically added that gives us an overview of consumption and lets us drill down into each piece. The nice thing is that we can see consumption across the subscriptions that are under the same EA, allowing us to get a report for all Resource Groups under that EA.
What is missing is an overview across subscriptions from different EAs and subscriptions that are not under an EA. This could be solved easily by extending the Azure Usage and Billing Portal sample. More information about this solution is presented later in this post.

Azure Usage and Billing Portal
This is a great sample that gives us the ability to consolidate Azure consumption information under the same Power BI source. From that point we can look at our data from any perspective, including grouping the cost of Resource Groups with the same characteristics across subscriptions.
The solution provided by this sample is a good starting point, but some enhancements are necessary.

3rd-party offers
After reviewing what is available on the market, I was not able to find a vendor that offers this capability without additional customization.

Proposed solution
The solution that I would recommend takes the core ideas of the 'Azure Usage and Billing Portal' sample and extends it using tags. The tags give Power BI the hints it needs to aggregate the consumption of Resource Groups with specific tags.

At the tag level, we have two options:
  1. Add tags at Resource Group level
  2. Add tags on each resource under the Resource Groups that need to be monitored
The downside of the first option is the complexity of the Power BI report query: it would be necessary to search for all Resource Groups with that specific tag and then, in a second step, search for all resources under those Resource Groups.
At this moment I see the second option as simpler and easier to implement, even if it requires adding tags to each resource. Tags are available in the resource consumption output and can be used directly by Power BI queries to aggregate data across subscriptions.
I would add to each resource under that specific Resource Group a special tag that indicates what type of Resource Group it is part of (Processing, Virtualization, Data Analytics, ML, Data Crunch and so on).
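As an illustration, here is a minimal Python sketch of how such a tag could be applied to each resource through the ARM 'tags at scope' REST API. The tag name (RGType), its values and the api-version are my own illustrative choices; in a real setup the tag would more likely be applied directly by the ARM template at deployment time.

```python
import requests

MANAGEMENT_URL = "https://management.azure.com"
API_VERSION = "2021-04-01"  # Microsoft.Resources tags API version; verify for your environment


def merge_tag(resource_id: str, token: str, tag_name: str, tag_value: str) -> None:
    """Merge a single tag onto a resource using the ARM 'tags at scope' API."""
    url = f"{MANAGEMENT_URL}{resource_id}/providers/Microsoft.Resources/tags/default"
    body = {"operation": "Merge", "properties": {"tags": {tag_name: tag_value}}}
    response = requests.patch(
        url,
        params={"api-version": API_VERSION},
        headers={"Authorization": f"Bearer {token}"},
        json=body,
    )
    response.raise_for_status()


# Example: mark every resource in a 'Processing' Resource Group.
# resource_ids would normally come from a 'Resources - List By Resource Group' call.
# for rid in resource_ids:
#     merge_tag(rid, token, "RGType", "Processing")
```

Because the merge operation only adds or updates the given tag, the step can be re-run safely on every deployment.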


Design of the Azure Usage monitoring solution
The Azure Usage and Billing solution available on GitHub is pretty nice. It's a good starting point and can be used successfully.
Taking a look over it, I would make some small changes to better solve our problem. Below you can find the things I would like to change:
1. An Azure Function that is triggered once per day and sends consumption requests to a queue (a sketch of this function and the next one follows this list)
2. An Azure Function that listens to the queue, extracts consumption data from the subscriptions and pushes it to Azure SQL
3. Replace the dashboard used to register a subscription with an ARM template step that is triggered automatically each time a deployment is done (in our context this is possible; we have full control over what ARM scripts run)
4. Expose an Azure API App that allows the caller to register a subscription that needs to be monitored. This API is protected by an authentication mechanism (Azure AD integration)
5. Challenge whether Azure SQL is needed or Azure Tables are enough (running cost vs. implementation cost -> Azure SQL wins)
6. Remove the Web App dashboard used to display consumption information and allow users to see reports only from Power BI. In our case this is enough, especially because the number of users is under 5 and they may want the possibility to generate different reports or views.
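A minimal sketch of the two functions from points 1 and 2 is shown below, assuming the Azure Functions Python programming model. The queue name, CRON schedule, SQL schema, api-version and usage-detail field names are illustrative assumptions, not taken from the original sample.

```python
import json
import os

import azure.functions as func
import pyodbc
import requests
from azure.storage.queue import QueueClient, TextBase64EncodePolicy

app = func.FunctionApp()
QUEUE_NAME = "consumption-requests"


@app.schedule(schedule="0 0 2 * * *", arg_name="timer")  # once per day at 02:00
def add_consumption_requests(timer: func.TimerRequest) -> None:
    """Read the registered subscriptions from Azure SQL and enqueue one message per subscription."""
    queue = QueueClient.from_connection_string(
        os.environ["AzureWebJobsStorage"],
        QUEUE_NAME,
        # The Functions queue trigger expects base64-encoded messages by default.
        message_encode_policy=TextBase64EncodePolicy(),
    )
    with pyodbc.connect(os.environ["SqlConnectionString"]) as conn:
        rows = conn.execute("SELECT SubscriptionId FROM MonitoredSubscriptions").fetchall()
    for (subscription_id,) in rows:
        queue.send_message(json.dumps({"subscriptionId": subscription_id}))


@app.queue_trigger(arg_name="msg", queue_name=QUEUE_NAME, connection="AzureWebJobsStorage")
def get_consumption_information(msg: func.QueueMessage) -> None:
    """Pull usage details for one subscription and persist them in Azure SQL."""
    subscription_id = json.loads(msg.get_body())["subscriptionId"]
    token = os.environ["ArmBearerToken"]  # in practice, acquired via managed identity
    url = (f"https://management.azure.com/subscriptions/{subscription_id}"
           "/providers/Microsoft.Consumption/usageDetails")
    response = requests.get(
        url,
        params={"api-version": "2021-10-01"},  # pagination via nextLink omitted for brevity
        headers={"Authorization": f"Bearer {token}"},
    )
    response.raise_for_status()
    with pyodbc.connect(os.environ["SqlConnectionString"]) as conn:
        for item in response.json().get("value", []):
            props = item.get("properties", {})
            conn.execute(
                "INSERT INTO UsageDetails (SubscriptionId, ResourceGroup, Tags, Cost) "
                "VALUES (?, ?, ?, ?)",
                subscription_id,
                props.get("resourceGroup"),         # field names vary by offer type
                json.dumps(item.get("tags") or {}),
                props.get("cost"),
            )
        conn.commit()
```

With this split, adding a new subscription only means adding a row to the table, and the daily fan-out to the queue keeps each consumption pull small and independently retryable.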

The number of Azure Subscriptions that will be monitored is around 60-65, with 10-15 types of Resource Groups that need to be monitored, providing information about the cost of each type of Resource Group at a worldwide level.

The diagram above presents an overview of the components involved, including the basic interactions between them.

  • ARM Script: Contains a step that registers the Azure Subscription by calling the REST API (see the sketch after this list)
  • Azure API App: Exposes a REST API used to register an Azure Subscription that needs to be monitored. Subscription information is persisted in Azure SQL.
  • Add Consumption Request: An Azure Function that runs once per day and fetches data from the table that contains the list of subscriptions. For each subscription, a separate message is pushed to a Storage Queue.
  • Get Consumption Information: An Azure Function that reads consumption information from a subscription and persists it in Azure SQL. Its trigger is connected directly to the Storage Queue.
  • Power BI Reports: The Power BI dashboard with the queries that allow users to see information across subscriptions.
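As an illustration, the registration step from the ARM script could call the API App along the lines of the sketch below. The API App URL, route, payload shape and Azure AD scope are all hypothetical, since the registration API is custom to this solution.

```python
import requests
from azure.identity import DefaultAzureCredential

API_APP_URL = "https://billing-registration.azurewebsites.net"  # placeholder API App
SCOPE = "api://billing-registration/.default"                   # placeholder Azure AD app scope


def register_subscription(subscription_id: str, display_name: str) -> None:
    """Register a subscription so the daily consumption pull picks it up."""
    token = DefaultAzureCredential().get_token(SCOPE).token
    response = requests.post(
        f"{API_APP_URL}/api/subscriptions",
        headers={"Authorization": f"Bearer {token}"},
        json={"subscriptionId": subscription_id, "displayName": display_name},
    )
    response.raise_for_status()


# Called from the deployment pipeline right after the ARM template finishes:
# register_subscription("00000000-0000-0000-0000-000000000000", "DataAnalytics-WestEurope")
```

The API can be made idempotent on the subscription ID, so the call can safely run on every deployment.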

Final conclusion
Yes, a small amount of resources needs to be invested to create such a system, but once you have a system that pushes Azure consumption information to Azure SQL, any kind of report or visualization can be created.
