(Part 1) Azure Storage - Cold Tier | Things that we should know

In this post we will talk about the things we need to consider if we want to use the Cold storage tier of Azure Storage. It will be followed by a second post that presents use cases where the Cold storage tier is useful and how we should integrate it into our system.
(Part 2) Azure Storage - Cold Tier | Business use cases where we could use it

Overview
The Cold storage tier of Azure Storage is a new tier for data that needs to be persisted for a long time at minimal cost. It should be used when data is not accessed frequently. Compared with the Hot tier, Cold storage latency can be slightly higher, but it is still measured in milliseconds.
Azure Storage offers multiple storage types: Azure Queues, Azure Tables and Azure Blobs (block blobs, append blobs and page blobs). Unfortunately, the tiering feature is available only for block blobs and append blobs. This means that we cannot store an Azure Table or a page blob in the Cold tier.
Access to this storage tier is available only when we create a new storage account and set the account kind to "Blob Storage" rather than "General Purpose". Only this account kind exposes the Cold storage tier.

Things that we should take into account
There are some other things you should consider as well. Once you create an Azure Storage account that supports the Hot/Cold tiers, you can switch between them, but switching is not free. The cost of migrating from one tier to the other is equivalent to the cost of reading all the data in that storage account.
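To get a feeling for this, the tier-switch cost can be estimated as one full read of the account. The sketch below is a back-of-the-envelope calculation only; the per-transaction and per-GB prices are hypothetical placeholders, so check the current Azure Storage pricing page for your region before relying on any numbers.

```python
# Rough estimate of the one-time cost of switching an account between
# Hot and Cold tiers, billed as if every blob were read once.
# NOTE: both prices below are HYPOTHETICAL placeholders, not real rates.

READ_COST_PER_10K_OPS = 0.01   # assumed read-transaction price per 10,000 ops (USD)
RETRIEVAL_COST_PER_GB = 0.01   # assumed data-retrieval price per GB (USD)

def tier_switch_cost(total_gb, blob_count):
    """Switching tiers costs roughly one read transaction per blob
    plus one retrieval of every stored GB."""
    transaction_cost = (blob_count / 10_000) * READ_COST_PER_10K_OPS
    retrieval_cost = total_gb * RETRIEVAL_COST_PER_GB
    return transaction_cost + retrieval_cost

# Example: 5 TB of data spread across 2 million blobs
print(round(tier_switch_cost(5 * 1024, 2_000_000), 2))
```

The key takeaway is that the cost scales with both the stored volume and the number of blobs, so accounts with many small blobs pay proportionally more in transactions.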
Existing General Purpose storage accounts and older accounts (the so-called classic ones) cannot be changed to the Blob Storage kind. This means that these accounts cannot use the Hot and Cold tier functionality.

For these scenarios it is necessary to migrate all the content to a Blob Storage account. This can be done easily with the AzCopy tool, which handles the migration for you. AzCopy can be used directly from the command prompt, or you can write a small app on top of the Azure Data Movement library, which AzCopy uses behind the scenes.
The API for block and append blobs is the same as for normal storage accounts, so such a migration requires no changes to your application code. Only the connection string that points to the Azure Storage account needs to be updated.
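In a typical .NET application this amounts to a one-line change in the configuration file. The account names below are made up for illustration, and the account keys are elided:

```xml
<connectionStrings>
  <!-- Before: General Purpose account (hypothetical name) -->
  <!--
  <add name="StorageConnection"
       connectionString="DefaultEndpointsProtocol=https;AccountName=mygeneralaccount;AccountKey=..." />
  -->
  <!-- After: Blob Storage account with tier support (hypothetical name) -->
  <add name="StorageConnection"
       connectionString="DefaultEndpointsProtocol=https;AccountName=myblobaccount;AccountKey=..." />
</connectionStrings>
```

Everything else in the application, including the blob client code, stays exactly the same.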

HOT vs COLD Tier
Comparing the two tiers, there is a small difference in the availability SLA: 99.99% (RA-GRS) for the Hot tier versus 99.9% for Cold storage.
From a cost perspective, the per-GB storage price of the Hot tier is higher than the Cold one (more than double), but at the same time the transaction cost is almost double for Cold storage. This is an important thing to take into account when we make the cost estimation. On top of this, the Cold tier charges for reading and writing data, whereas in the Hot tier you pay only for outbound data that leaves the Azure region.
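Because the two tiers trade cheaper storage against pricier transactions and retrieval, it is worth modelling your own workload before deciding. The sketch below uses hypothetical placeholder prices (not real Azure rates) to show how the comparison works for a large archive that is read rarely:

```python
# Back-of-the-envelope monthly cost comparison between Hot and Cold tiers.
# NOTE: all prices are HYPOTHETICAL placeholders chosen only to illustrate
# the trade-off; use the Azure pricing page for real numbers.

HOT  = {"storage_gb": 0.024, "per_10k_tx": 0.004, "retrieval_gb": 0.0}
COLD = {"storage_gb": 0.010, "per_10k_tx": 0.008, "retrieval_gb": 0.01}

def monthly_cost(tier, stored_gb, transactions, read_gb):
    """Storage + transaction + data-retrieval cost for one month."""
    return (stored_gb * tier["storage_gb"]
            + (transactions / 10_000) * tier["per_10k_tx"]
            + read_gb * tier["retrieval_gb"])

# Example: a 10 TB archive with 50,000 transactions and 100 GB read per month.
stored, tx, read = 10 * 1024, 50_000, 100
print(round(monthly_cost(HOT, stored, tx, read), 2))
print(round(monthly_cost(COLD, stored, tx, read), 2))
```

For rarely-accessed data the lower storage price dominates and Cold wins by a wide margin; as the read volume grows, the retrieval and transaction charges eat into that advantage, which is exactly why the access pattern matters in the estimation.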
The latency is the same for both of them – expressed in milliseconds.


When you use the geo-replication feature with the Cold tier, replication to the secondary region is billed as if you were reading the data from the primary location.

In the next post we will take a look at use cases where we should use the Cold tier and how the migration should be carried out.
(Part 2) Azure Storage - Cold Tier | Business use cases where we could use it
