
Azure Storage authentication using Azure AD

The cloud, and Azure in particular, has become a complex environment, with a large number of services and concerns you need to take into account. Even so, the base services, like Azure VMs, Storage, or networking, are still the ones you use in day-to-day work.
Azure Storage is one of the most used services inside Microsoft Azure, directly or indirectly. Most other services running in the cloud rely on it to store or work with data.
Until now, access control to Azure Storage was possible using account keys or Shared Access Signatures (SAS). The combination of the two is powerful and can cover most scenarios for small and mid-size companies.
Issues appear for enterprise customers, where (Azure) Active Directory is part of their core services. For them, it is crucial that access to Azure Storage be controlled through AD. Especially in an organization with 10,000 or more employees, sharing access to resources using SAS tokens is possible, but it adds extra complexity to the system. The problems appear the moment you need to revoke access to specific storage, where things are not so straightforward with Shared Access Signatures or stored access policies.

In the second quarter of this year, the Azure team announced support for Azure AD authentication and authorization for Azure Storage. This gives us the perfect mechanism to control access to data using one consolidated solution across the whole organization. It enables the IT team to grant or revoke access to Azure Storage content on the fly, based on user roles.
This new feature, in combination with Managed Service Identity (MSI), allows teams to assign specific AD roles to applications and services running on top of Azure, not only to people. In this way, we can have granular access control to Azure Storage content even at the application layer.
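For an application running on an Azure VM with MSI enabled, the token comes from the local Instance Metadata Service (IMDS) endpoint. The sketch below only builds the request URL and headers and makes no HTTP call; the endpoint and API version are the documented IMDS values at the time of writing, but treat the exact values as assumptions:

```python
from urllib.parse import urlencode

# Azure Instance Metadata Service (IMDS) endpoint; reachable only from inside an Azure VM.
IMDS_TOKEN_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_msi_token_request(resource, api_version="2018-02-01"):
    """Build the URL and headers of an MSI token request (no network call here)."""
    query = urlencode({"api-version": api_version, "resource": resource})
    url = f"{IMDS_TOKEN_ENDPOINT}?{query}"
    # The Metadata header is mandatory; it blocks requests proxied from outside the VM.
    headers = {"Metadata": "true"}
    return url, headers

# Request a token whose audience is Azure Storage.
url, headers = build_msi_token_request("https://storage.azure.com/")
```

A GET to that URL from inside the VM returns a JSON document containing the access token, so the application never has to store a secret of its own.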
Behind the scenes, a request is made to Azure AD using the OAuth 2.0 protocol. Azure AD provides an access token to the application, which can then be used to access Azure Storage. The first thing that needs to be done is to configure Azure Storage RBAC to grant access to a security principal, specifying what you want to expose and with which permission rights.
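For a service principal (an Azure AD application), the token is obtained with an OAuth 2.0 client-credentials request against the Azure AD token endpoint. The sketch below only assembles the endpoint URL and the form body that would be POSTed; the tenant, client id, and secret are hypothetical placeholders:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id, client_id, client_secret):
    """Build an OAuth 2.0 client-credentials token request for Azure Storage.

    Returns the Azure AD token endpoint and the URL-encoded form body that
    would be POSTed to it (the actual HTTP call is left out of this sketch).
    """
    endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://storage.azure.com/",  # Azure Storage resource identifier
    })
    return endpoint, body
```

The JSON response to that POST carries the `access_token` that the application then presents to Azure Storage.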
The full configuration flow can be a little complex, because it involves creating and registering an Azure AD application and granting it access to Azure Storage, but the process is well documented and straightforward.
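With the Azure CLI, the grant itself reduces to a role assignment scoped to the storage account. The account name, resource group, and application id below are hypothetical placeholders, and while the feature is in preview the data-plane role names may carry a "(Preview)" suffix:

```shell
# Hypothetical names; replace with your own storage account, resource group, and app id.
STORAGE_ID=$(az storage account show \
    --name mystorageaccount \
    --resource-group my-rg \
    --query id --output tsv)

# Grant the Azure AD application data access to blobs in this account.
az role assignment create \
    --assignee "<appId-of-your-AD-application>" \
    --role "Storage Blob Data Contributor" \
    --scope "$STORAGE_ID"
```

Revoking access later is the mirror operation (`az role assignment delete`), which is exactly the centralized on/off switch that SAS tokens lack.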

Things you should consider:
  1. At this moment the service is available only for Blob storage and Queues
  2. The storage account needs to be created using the Resource Manager deployment model
  3. Custom RBAC roles are supported
  4. Access can be controlled at container and queue level (controlling access at the individual blob level is not recommended)
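Once a token has been issued, the REST calls to Blob storage change only in their headers: the account-key signature is replaced by a Bearer token and, to the best of my knowledge, the `x-ms-version` header must be `2017-11-09` or later for bearer authorization to be accepted. A minimal sketch:

```python
def bearer_headers(access_token, api_version="2017-11-09"):
    """Headers for a Blob storage REST call authorized with an Azure AD token."""
    return {
        "Authorization": f"Bearer {access_token}",
        # Bearer-token authorization requires service version 2017-11-09 or later.
        "x-ms-version": api_version,
    }

# A blob read would then be a plain GET with these headers (no network call here):
#   GET https://<account>.blob.core.windows.net/<container>/<blob>
headers = bearer_headers("<access-token-from-azure-ad>")
```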

Being in preview, the feature has the following limitations:
  1. Page blob access for premium storage is not yet available
  2. Logging information through Azure Storage Analytics is not yet supported

This great new feature allows us to integrate and manage storage access in a centralized location, together with all the other systems that use Azure AD.

