(Part 1) Azure Storage - Cold Tier | Things that we should know

In this post we will talk about the things we need to consider if we want to use the cold storage tier of Azure Storage. This post will be followed by another one that presents use cases where the cold storage tier is useful and how we should integrate it into our system.
(Part 2) Azure Storage - Cold Tier | Business use cases where we could use it

The cold storage tier of Azure Storage is a new tier intended for data that needs to be persisted for a long time with minimum cost. This tier should be used when data is not accessed frequently. Compared with the Hot tier, cold storage latency can be a little higher, but it is still measured in milliseconds.
Azure Storage contains multiple types of storage, like Azure Queues, Azure Tables and Azure Blobs (block blobs, append blobs and page blobs). Unfortunately, this tier is available only for block blobs and append blobs. This means that we cannot store an Azure Table or a page blob in the cold tier.
The storage account that gives us access to this storage tier is also only available when we create a new storage account and specify "Blob Storage" as the account kind, not "General Purpose". Only in this way will we have access to the cold storage tier.

Things that we should take into account
There are some other things that you should consider. Once you create an Azure Storage account that supports the Cold/Hot tiers, you can switch between them. Switching between these two tiers is not free of cost. The cost of migrating from one tier to the other is equivalent to the cost of reading all the data in that storage account.
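This makes the switch cost easy to estimate as a back-of-the-envelope calculation: one full read of the account. A minimal sketch in Python, where the per-GB read price is a hypothetical placeholder, not the actual Azure rate (always check the current Azure pricing page):

```python
# Rough estimate of the cost of switching tiers: equivalent to reading
# all data stored in the account. The price below is an assumed value,
# not a real Azure rate.
PRICE_PER_GB_READ = 0.01  # USD per GB read (hypothetical)

def tier_switch_cost(stored_gb: float,
                     price_per_gb_read: float = PRICE_PER_GB_READ) -> float:
    """Cost of a tier switch = cost of one full read of the account."""
    return stored_gb * price_per_gb_read

# e.g. switching an account holding 500 GB:
print(round(tier_switch_cost(500), 2))  # -> 5.0
```

The point of the sketch is only that the switch cost scales linearly with the amount of data stored, so it is cheap for small accounts but can be significant for large archives.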
Existing storage accounts that are General Purpose, as well as old storage accounts (the so-called classic ones), cannot be changed to the Blob Storage kind. This means that for these accounts we cannot have the cold and hot tier functionality.

For these scenarios it is necessary to migrate all the content to a dedicated Blob Storage account. This can be done easily using the AzCopy tool, which performs the migration for you. This tool can be used directly from the command prompt, or you can write a small app on top of the Azure Data Movement library, which uses the AzCopy engine behind the scenes.
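The copy itself can be a single AzCopy command. A sketch using the classic AzCopy syntax, where the account names, container name and keys are placeholders you would replace with your own:

```shell
:: Copy all blobs recursively (/S) from a General Purpose account to a
:: Blob Storage account. Account names, container and keys are placeholders.
AzCopy /Source:https://mygpaccount.blob.core.windows.net/mycontainer ^
       /Dest:https://myblobaccount.blob.core.windows.net/mycontainer ^
       /SourceKey:<source-account-key> /DestKey:<dest-account-key> /S
```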
The API that can be used for block and append blobs is the same as for normal storage accounts. If you do such a migration, there are no changes inside your app code; only the connection string that points to the Azure Storage account needs to be changed.
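To make this concrete, here is a small Python sketch (the account names and keys are hypothetical placeholders): the blob-handling code stays identical, and the only thing that differs between the two accounts is the connection string value that your configuration supplies.

```python
# Two connection strings: one for the old General Purpose account, one for
# the new Blob Storage account. Names and keys are placeholders.
OLD_CONN = "DefaultEndpointsProtocol=https;AccountName=mygpaccount;AccountKey=<old-key>"
NEW_CONN = "DefaultEndpointsProtocol=https;AccountName=myblobaccount;AccountKey=<new-key>"

def parse_connection_string(conn: str) -> dict:
    """Split an Azure Storage connection string into its key/value pairs."""
    return dict(part.split("=", 1) for part in conn.split(";") if part)

old = parse_connection_string(OLD_CONN)
new = parse_connection_string(NEW_CONN)

# Same protocol, same API surface; only the account identity changes.
print(old["AccountName"], "->", new["AccountName"])
```

In practice this means the migration is a configuration change, not a code change: swap the connection string in your app settings and redeploy.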

HOT vs COLD Tier
If we compare these two tiers, at the SLA level there is a small difference in the availability SLA: from 99.99% (RA-GRS) availability for the Hot tier, we drop to 99.9% availability for cold storage.
From a cost perspective, the Hot tier's storage cost is higher than the Cold one's (more than double), but at the same time the transaction cost is almost double for cold storage. This is an important thing to take into account when we make the cost estimation. On top of this, there are costs for reading and writing data in the cold tier, whereas in the hot tier you pay only for outbound data that goes outside the Azure region.
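These trade-offs can be folded into a simple estimator. The per-unit prices below are hypothetical placeholders chosen only to mirror the relationships described above (hot storage more than double the cold storage price, cold transactions roughly double the hot ones); plug in the current Azure pricing before using such numbers for a real estimation.

```python
# Hypothetical per-unit prices (USD) - placeholders, not real Azure rates.
HOT = {"storage_gb": 0.024, "per_10k_tx": 0.004}
COLD = {"storage_gb": 0.010, "per_10k_tx": 0.008}

def monthly_cost(tier: dict, stored_gb: float, transactions: int) -> float:
    """Monthly estimate: per-GB storage price plus a price per 10,000 operations."""
    return stored_gb * tier["storage_gb"] + (transactions / 10_000) * tier["per_10k_tx"]

# 1 TB of rarely-touched data with 50,000 operations per month:
hot = monthly_cost(HOT, 1024, 50_000)
cold = monthly_cost(COLD, 1024, 50_000)
print(f"hot: {hot:.2f}, cold: {cold:.2f}")
```

With an access pattern like this (lots of stored data, few transactions), the cold tier comes out clearly cheaper; if the transaction count dominates, the comparison can flip, which is exactly why the estimation matters.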
The latency is the same for both of them – expressed in milliseconds.

When you are using the geo-replication feature with the Cold tier, replication to the secondary region is charged as if you were reading the data from the main location.

In the next post we will take a look at use cases where we should use the Cold tier and how the migration should be done.
(Part 2) Azure Storage - Cold Tier | Business use cases where we could use it

