
Monitor cost consumption across Azure Subscriptions at Resource Group level

In this post we'll look at different mechanisms that can be used to monitor Azure consumption across subscriptions. We will take a look at the out-of-the-box solutions that are currently available, their limitations, and a custom solution that could be used for this purpose.

Define the problem
We need a mechanism that can provide information about Resource Group consumption. We need to be able to compute the total cost of all Resource Groups across all subscriptions that share the same characteristic (for example, a specific tag).
The Azure Subscriptions that need to be monitored are under multiple EAs (Azure Enterprise Agreements). Some of the Azure Subscriptions are not under an EA account.
Out of the box solution
Microsoft Azure Enterprise content pack for Power BI
At the moment this post was written, Microsoft offers full Power BI integration for all Azure Subscriptions that are under the same EA - the Microsoft Azure Enterprise content pack for Power BI.
This pack allows users to bring cost and consumption data into Power BI.
A dashboard is added automatically that gives us an overview of consumption, with the ability to drill down into each piece. The nice thing is that we can see consumption across subscriptions that are under the same EA, allowing us to get a report for all Resource Groups under that EA.
What is missing is an overview across subscriptions from different EAs and across subscriptions that are not under an EA. This could be solved easily by extending the Azure Usage and Billing Portal sample. More information about this solution is presented later in this post.

Azure Usage and Billing Portal
This is a great sample that gives us the ability to consolidate information related to Azure consumption under the same Power BI source. From that point on we can look at our data from any perspective, including grouping cost per Resource Groups with the same characteristics across subscriptions.
The solution provided by this sample is a good starting point, but some enhancements need to be made to it.

3rd party offers
After reviewing what is available on the market, I was not able to find a vendor that offers this capability without additional customization.

Proposed solution
The solution I would recommend builds on the core ideas of the 'Azure Usage and Billing Portal' sample, extending it with tags. The tags give Power BI the hints it needs to aggregate consumption of Resource Groups with specific tags.

At tags level, we have two options:
  1. Add tags at Resource Group level
  2. Add tags on each resource under the Resource Groups that need to be monitored
The downside of the first option is the complexity of the Power BI report query: it would be necessary to search for all Resource Groups with that specific tag and then, in a second step, find all resources under those Resource Groups.
At this moment I see the second option as simpler and easier to implement, even if it requires adding tags to each resource. Tags are available in the resource consumption output and can be used directly by Power BI queries to aggregate data across subscriptions.
I would add to each resource under that specific Resource Group a special tag that tells me what type of Resource Group it is part of (Processing, Virtualization, Data Analytics, ML, Data Crunch and so on).
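The aggregation step that Power BI would run can be sketched in a few lines. This is only an illustration of the grouping logic, assuming consumption records already carry their resource tags; the tag name `GroupType` and the record shape are my own assumptions, not part of the sample.

```python
from collections import defaultdict

def cost_per_group_type(consumption_records, tag_name="GroupType"):
    """Aggregate cost across subscriptions by the value of a resource tag.

    Each record is a dict with 'subscriptionId', 'cost' and 'tags',
    mirroring the shape of a row loaded from the usage data.
    """
    totals = defaultdict(float)
    for record in consumption_records:
        group_type = record.get("tags", {}).get(tag_name)
        if group_type is not None:
            totals[group_type] += record["cost"]
    return dict(totals)

# Records coming from two different subscriptions, same tag value:
records = [
    {"subscriptionId": "sub-1", "cost": 10.0, "tags": {"GroupType": "ML"}},
    {"subscriptionId": "sub-2", "cost": 5.0,  "tags": {"GroupType": "ML"}},
    {"subscriptionId": "sub-2", "cost": 7.5,  "tags": {"GroupType": "Processing"}},
    {"subscriptionId": "sub-1", "cost": 3.0,  "tags": {}},  # untagged, ignored
]
print(cost_per_group_type(records))  # {'ML': 15.0, 'Processing': 7.5}
```

Notice that the subscription boundary disappears entirely once the tag is on the resource itself, which is exactly why the second option keeps the report query simple.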

Design of the Azure Usage monitoring solution
The Azure Usage and Billing solution available on GitHub is pretty nice. It's a good starting point and can be used with success.
Taking a look over it, I would make some small changes to help us better solve our problem. Below you can find the things I would like to change:
1. An Azure Function that is triggered once per day and sends consumption requests to the queue
2. An Azure Function that listens to the queue and extracts consumption data from the subscriptions, pushing it to Azure SQL
3. Replace the dashboard that registers the subscription with an ARM template step that is triggered automatically each time a deployment is done (in our context this is possible, since we have full control over which ARM scripts run)
4. Expose an Azure API App that allows the caller to register a subscription that needs to be monitored. This API is protected by an authentication mechanism (Azure AD integration)
5. Evaluate whether Azure SQL is needed or Azure Tables would be enough (running cost vs. implementation cost -> Azure SQL wins)
6. Remove the Web App dashboard used to display consumption information and allow only Power BI users to see reports. In our case this is enough, especially because the number of users is under 5 and they might want the ability to generate different reports or views.

The number of Azure Subscriptions that will be monitored is around 60-65, and there might be 10-15 types of Resource Groups that need to be monitored, providing information about the cost of each type of Resource Group at a world-wide level.

The above diagram presents an overview of the components that are involved, including the base interaction between components.

  • ARM Script: Contains a step that registers the Azure Subscription by calling the REST API
  • Azure API App: Exposes a REST API used to register an Azure Subscription that needs to be monitored. Subscription information is persisted in Azure SQL.
  • Add Consumption Request: An Azure Function that runs once per day and fetches data from the table that contains the list of subscriptions. For each subscription, a separate message is pushed to the Storage Queue.
  • Get Consumption Information: An Azure Function that reads consumption information from a subscription and persists it in Azure SQL. This function's trigger is connected directly to the Storage Queue.
  • Power BI Reports: The Power BI dashboard with the queries that allow users to see information across subscriptions.
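The queue-triggered side of the pipeline can be sketched in the same spirit. Here `fetch_usage` is a hypothetical stand-in for a call to the Azure usage/consumption REST endpoints (a real implementation would authenticate and page through results), and a plain list stands in for the Azure SQL table.

```python
import json

def fetch_usage(subscription_id):
    """Stand-in for a call to the Azure usage API (hypothetical data;
    a real implementation would call the usage REST endpoints
    with proper credentials and handle paging)."""
    sample = {
        "sub-1": [{"resourceGroup": "rg-ml", "cost": 12.0,
                   "tags": {"GroupType": "ML"}}],
    }
    return sample.get(subscription_id, [])

def get_consumption_information(message, sql_rows):
    """Queue-triggered step (sketch): read one subscription's usage
    and persist the rows; sql_rows stands in for the Azure SQL table
    that Power BI later queries."""
    subscription_id = json.loads(message)["subscriptionId"]
    for record in fetch_usage(subscription_id):
        sql_rows.append({"subscriptionId": subscription_id, **record})

rows = []
get_consumption_information(json.dumps({"subscriptionId": "sub-1"}), rows)
print(rows[0]["cost"])  # 12.0
```

Because each message carries exactly one subscription, the function stays small and retries (via the queue's built-in retry mechanism) only re-process the failed subscription.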

Final conclusion
Yes, a certain amount of resources needs to be invested to create such a system, but once you have a system that pushes Azure consumption information to Azure SQL, any kind of report and visualization can be created.

