
Encrypt binary content stored on Azure Storage out of the box - Azure Storage Service Encryption for Data at Rest

Small things make the real difference between a good product and a great one. In this post we will talk about why Azure Storage Service Encryption for Data at Rest is so important.

Context
When you decide to store confidential data in the cloud or in any external data source, it means there is a trust relationship between you and the storage provider (in our case Microsoft Azure). Many times this is not enough. There are also laws that force you to keep the content encrypted from the moment the binary content leaves your network until it reaches its destination or is accessed again.
In this context, even if you trust Microsoft or any other cloud provider, that trust will not be enough, and no paper or certification will stand up in front of the law. For industries like banking or healthcare, this kind of scenario is common, and migration to the cloud is hard or even impossible in some situations.

Encryption Layers
When we discuss encryption and data security, there are multiple layers where we need to provide encryption. In general, when we build an application that moves data or accesses it from the client's secure environment, we need to provide security at:

  • Transport Layer
  • Storage Layer

If we have a system that transfers data from location A to location B, then we might have:

  • Transport Layer
  • Transit Layer

But in the end, the Storage and Transit Layers are the same thing. In one case we persist content for a long time, while in the other we store data only for a specific time interval.

Transport Layer
At the Transport Layer, Azure Storage supports HTTPS, which automatically secures the transport for us. Unfortunately, we are not allowed to use our own certificates; only Microsoft certificates are allowed. In most cases this is enough and is not a blocker.
Also, for most consumers, the cloud provider's certificates are safer than custom client certificates.
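
As an illustration, here is a minimal sketch of connecting to Blob storage over HTTPS using the legacy azure-storage Python SDK (the account name and key are placeholders; 'https' is also the default protocol):

from azure.storage.blob import BlockBlobService

# Placeholder credentials - replace with your own storage account name and key.
blob_service = BlockBlobService(
    account_name='mystorageaccount',
    account_key='<account-key>',
    protocol='https')  # force TLS, so the Transport Layer is covered

# Every request issued through this client goes over HTTPS,
# using the certificates managed by Microsoft.
blob_service.create_container('secure-data')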

Storage Layer
What we had until now
Until now, we didn't have any mechanism to encrypt content at the storage layer. The only option available was to encrypt the content before sending it on the wire. This solution works great when you have enough CPU power or you don't need to encrypt too much traffic. Otherwise, client-side encryption is expensive and requires you to manage the encryption keys yourself. This is an out-of-the-box feature offered by the Azure SDK: the client libraries allow us to encrypt content with our own keys, and to secure and control who has access to these keys we can use Azure Key Vault. This mechanism is great, but we are the ones who need to manage everything, and the encryption is done on the client side.
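
As a simplified illustration of this client-side approach (not the Azure SDK's built-in encryption policy, and with the key kept in memory instead of Azure Key Vault), the sketch below encrypts a payload with an AES-256 key before uploading it; the container, blob and account names are placeholders:

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.backends import default_backend
from azure.storage.blob import BlockBlobService

# In a real system this key would be stored and retrieved through Azure Key Vault.
key = os.urandom(32)   # 256-bit AES key, managed by us, not by Azure
iv = os.urandom(16)    # initialization vector, stored next to the blob

plaintext = b'confidential binary content'

# Encrypt locally (AES-256 in CTR mode, for simplicity) before anything leaves our network.
cipher = Cipher(algorithms.AES(key), modes.CTR(iv), backend=default_backend())
encryptor = cipher.encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# Only encrypted bytes travel over the wire and are persisted by Azure.
blob_service = BlockBlobService(account_name='mystorageaccount', account_key='<account-key>')
blob_service.create_blob_from_bytes('secure-data', 'document.bin', iv + ciphertext)

This is exactly the cost described above: the CPU work and the key management stay on our side.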

What we have starting from now
Starting from now, Microsoft Azure allows us to do this encryption directly on the service side, using Azure Storage Service Encryption for Data at Rest.
Long name, but the idea is very simple. All the content stored in Azure Storage will be encrypted automatically by Azure before being written to disk. The moment we want to access the information, the content will be decrypted before being sent back to us.

All these activities are transparent to the user. The client doesn't need to do anything special. Once encryption is activated per Azure Storage Account from the Azure Portal, all content written from that moment on will be encrypted.
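
To show how transparent it is, here is a small sketch (same placeholder account as above): the read/write code is identical before and after the feature is turned on.

from azure.storage.blob import BlockBlobService

blob_service = BlockBlobService(account_name='mystorageaccount', account_key='<account-key>')

# Nothing encryption-specific here: once Storage Service Encryption is enabled on the
# account, Azure encrypts the content before persisting it and decrypts it on read.
blob_service.create_blob_from_text('secure-data', 'report.txt', 'content from the client')
downloaded = blob_service.get_blob_to_text('secure-data', 'report.txt')
print(downloaded.content)   # the same clear text we uploaded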

Encryption Algorithm
The encryption algorithm used by Azure Storage at this moment in time is AES-256 (Advanced Encryption Standard with a key length of 256 bits).
This is a well-known standard, accepted and used by governments and companies around the world. It is covered by the ISO/IEC 18033-3 standard and is safe enough to be used in most industries.

Encryption Key Management
Key management is done fully by Microsoft. Clients are not allowed to bring their own keys.

Content Replication
If you have activated the geo-replication feature, all the content written to the main region and to the geo-replicas will be encrypted.

What we can encrypt
At this moment in time we can encrypt any kind of content stored in Blobs (Block Blobs, Append Blobs and Page Blobs), including VHDs and OS disks.
There is no way to encrypt content stored in Azure Tables, Azure Files or Azure Queues.

What happens to content that already exists in Azure Storage
You are allowed to activate this feature at any time after you create an Azure Storage account. Once you activate it, all content written after that moment will be encrypted. Existing content will remain in 'clear text'.
If you need to encrypt content that was already written to your storage, you need to read it and write it again. Tools like AzCopy can be used successfully in these scenarios, or you can do it from code, as sketched below.
A similar thing happens when you disable encryption: from that moment on, all content will be written in 'plain text', while existing content will remain encrypted until the next write.
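
If you prefer code over AzCopy, a hedged sketch of the same idea: download each existing blob and write it back, so that the new write goes through the encryption path (it assumes the legacy azure-storage Python SDK and that overwriting blobs in place is acceptable for your data):

from azure.storage.blob import BlockBlobService

blob_service = BlockBlobService(account_name='mystorageaccount', account_key='<account-key>')
container = 'secure-data'

# Content written after encryption was enabled is stored encrypted,
# so re-writing old blobs forces them through the encryption path.
for blob in blob_service.list_blobs(container):
    existing = blob_service.get_blob_to_bytes(container, blob.name)
    blob_service.create_blob_from_bytes(container, blob.name, existing.content)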

Azure Storage Account Types
Only the new (ARM) storage accounts support encryption. Azure Storage Accounts that were created in the classic format (Classic Storage Accounts) don't support encryption.

Price
There is no additional fee for this service.

How to activate this feature
This feature can be activated from the Azure Portal or using PowerShell.
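
The same setting can also be flipped through the ARM REST API. The sketch below is only an assumption of how that call looks (subscription, resource group, account name, bearer token and API version are placeholders, and the exact property names should be verified against the Storage Resource Provider reference):

import requests

subscription_id = '<subscription-id>'
resource_group = '<resource-group>'
account_name = 'mystorageaccount'
token = '<bearer-token-from-azure-ad>'   # placeholder, obtained from Azure AD

url = ('https://management.azure.com/subscriptions/{}/resourceGroups/{}/'
       'providers/Microsoft.Storage/storageAccounts/{}?api-version=2016-12-01'
       .format(subscription_id, resource_group, account_name))

# Assumed ARM payload: enable Storage Service Encryption for the Blob service,
# with keys managed by Microsoft ('Microsoft.Storage' key source).
payload = {
    'properties': {
        'encryption': {
            'services': {'blob': {'enabled': True}},
            'keySource': 'Microsoft.Storage'
        }
    }
}

response = requests.patch(url, json=payload, headers={'Authorization': 'Bearer ' + token})
response.raise_for_status()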


Conclusion
This is a great feature that can be used successfully to offer end-to-end encryption of data, from the moment it leaves your premises until you get it back. You get end-to-end encryption without the extra cost of a custom implementation.
Just by activating this feature and using HTTPS, you have all this out of the box. Cool!
