
Azure Queue Storage (Day 20 of 31)


Short Description 
Azure Queue Storage is a service, part of Azure Storage, that offers queueing support. It is a simple queue service that can be accessed from anywhere over the HTTP and HTTPS protocols.

Main Features 
The first thing that comes to my mind is the size of the queue. Because this service is built over Azure Storage, the maximum total size of the queues can be hundreds of terabytes (under the same storage account). This can be useful when we need to store a large amount of messages in a queue.
Like all Azure services, Queue Storage can be accessed and managed through a REST API. On top of this there are libraries for different programming languages like C#, NodeJS, and so on.
Batch Support
When reading messages from the queue, we have the ability to consume them in batches of up to 32 messages.
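A minimal sketch of batch consumption, using the same `WindowsAzure.Storage` client library as the Code Sample section below (the queue name and connection string are placeholders):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class BatchReceiveSketch
{
    static void Main()
    {
        // Placeholder connection string (local storage emulator).
        CloudStorageAccount account =
            CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference("myqueue");

        // Fetch up to 32 messages in one round trip; each one becomes
        // invisible to other consumers for the given visibility timeout.
        foreach (CloudQueueMessage msg in
                 queue.GetMessages(32, TimeSpan.FromMinutes(5)))
        {
            Console.WriteLine(msg.AsString);
            queue.DeleteMessage(msg); // remove each message once processed
        }
    }
}
```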
Transactions support (kind of)
When consuming messages from queues we have a ‘kind of’ transaction support. This means that for each message taken from the queue, we need to send a ‘delete’ command to the queue to remove it. A message that is delivered to a client for consumption cannot be received (is not visible) by other clients for a specific time, called the visibility timeout. If the message is not deleted within that time, it becomes visible again and is available to other clients.
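The dequeue-process-delete cycle can be sketched like this (same `WindowsAzure.Storage` library as in the Code Sample section; the connection string, queue name and processing step are placeholders):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class VisibilityTimeoutSketch
{
    static void Main()
    {
        CloudStorageAccount account =
            CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference("myqueue");

        // Dequeue one message; it stays hidden from other consumers
        // for the visibility timeout we request (2 minutes here).
        CloudQueueMessage msg = queue.GetMessage(TimeSpan.FromMinutes(2));
        if (msg != null)
        {
            Console.WriteLine("Processing: " + msg.AsString);

            // Only this explicit delete removes the message. If the
            // process crashes before reaching this line, the message
            // becomes visible again after the timeout and another
            // consumer can retry it.
            queue.DeleteMessage(msg);
        }
    }
}
```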
Queue Length
You have the ability to get an estimate of the number of messages that are in the queue.
Time To Live
For each message we can set the Time To Live property. The maximum value that is accepted is 7 days. It can be very useful when you need old messages to be removed automatically.
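Both the length estimate and the TTL are exposed by the `WindowsAzure.Storage` client library; a short sketch with placeholder names:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class LengthAndTtlSketch
{
    static void Main()
    {
        CloudStorageAccount account =
            CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference("myqueue");

        // The count is only an estimate; refresh the cached
        // attributes before reading it.
        queue.FetchAttributes();
        Console.WriteLine("~{0} messages in queue",
                          queue.ApproximateMessageCount);

        // Enqueue a message that Azure removes automatically after
        // one day if no consumer deletes it earlier.
        queue.AddMessage(new CloudQueueMessage("expires in one day"),
                         timeToLive: TimeSpan.FromDays(1));
    }
}
```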
XML or JSON format
The communication with the API can be made in both formats. Clients can specify in each request which format they want to use.
Polling Support
The current libraries and APIs allow us to create a polling mechanism and check whether new messages are available.
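A basic polling loop could look like the sketch below (placeholder names; the 5-second back-off interval is an arbitrary choice):

```csharp
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

class PollingSketch
{
    static void Main()
    {
        CloudStorageAccount account =
            CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference("myqueue");

        while (true)
        {
            CloudQueueMessage msg = queue.GetMessage();
            if (msg != null)
            {
                Console.WriteLine(msg.AsString);
                queue.DeleteMessage(msg);
            }
            else
            {
                // Queue is empty; back off before the next poll so we
                // don't pay for a transaction on every empty check.
                Thread.Sleep(TimeSpan.FromSeconds(5));
            }
        }
    }
}
```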
Redundancy
Queue Storage gives us multiple options when we talk about redundancy. By default we have redundancy at the data center level – this means that at all times there are 3 copies of the same content (and you pay for only one). Below are the available redundancy options, including the default one we already talked about:

  • LRS (Locally Redundant Storage) – Content is replicated 3 times within the same data center (facility unit of a region). This is the default.
  • GRS (Geo-Redundant Storage) – Content is replicated 6 times across 2 regions (3 times in the primary region and 3 times in a secondary region).
  • RA-GRS (Read-Access Geo-Redundant Storage) – Content is replicated in the same way as for GRS, but you also get read-only access to the secondary region. With plain GRS, even though the data exists in the secondary region, you cannot access it directly.

Pay only what you use 
At the end of the month you pay only for the space that you used. Because of this, clients don’t pay in advance for space that will be used, or for space that is no longer used.
Tracing capabilities 
Queue Storage has the same tracing capabilities as blobs and containers. Information like access time, client IP and how the request ended can automatically be stored and accessed. In this way we can have a full audit over the storage content.
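Turning this logging on can be sketched through the Storage Analytics service properties. The exact property names below are my assumption, based on the classic `WindowsAzure.Storage` library; the connection string is a placeholder:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class EnableQueueLoggingSketch
{
    static void Main()
    {
        CloudStorageAccount account =
            CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudQueueClient client = account.CreateCloudQueueClient();

        // Log read, write and delete operations and keep the logs
        // for 7 days; logs end up in the $logs blob container.
        ServiceProperties props = client.GetServiceProperties();
        props.Logging.LoggingOperations = LoggingOperations.All;
        props.Logging.RetentionDays = 7;
        props.Logging.Version = "1.0";
        client.SetServiceProperties(props);
    }
}
```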
Unlimited queues and messages
Because the only hard limit is the maximum size of the storage account (200 TB), we can say that we have a virtually unlimited number of queues and messages. It is a very scalable solution.


Limitations

  • Maximum size of a message is 64 KB.
  • Maximum number of messages per second in a single queue is 2,000.
  • Maximum Time To Live for each message is 7 days.
  • No ordering guarantee.
  • Only Peek & Lease modes are supported for the receiver.
  • No batch support for the sender.

Applicable Use Cases 
Below you can find three use cases where I would use Azure Queue.
Storing large amounts of messages
If I need a system that can handle and store large amounts of messages, and I can afford to lose data (because of the TTL), then Azure Queue can be a very good solution.
State Machine
We could use Azure Queue if we need to create a state machine – for example an orders management system where we need to be able to track different orders, change their state and so on.
Distribute work between instances
We could use Azure Queue to distribute messages between different machines. We could use this messaging system to distribute the load across all our available resources.

Code Sample 
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the queue client.
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

// Retrieve a reference to a queue.
CloudQueue queue = queueClient.GetQueueReference("myqueue");

// Create the queue if it doesn't already exist.
queue.CreateIfNotExists();

// Create a message and add it to the queue.
CloudQueueMessage message = new CloudQueueMessage("Hello, World");
queue.AddMessage(message);

// Peek at the next message (without changing its visibility).
CloudQueueMessage peekedMessage = queue.PeekMessage();

// Display message.
Console.WriteLine(peekedMessage.AsString);


Pros and Cons 

Pros:
  • Extremely scalable
  • Support for TTL
  • Batch support on the receiver side

Cons:
  • No real transaction support
  • Only one mechanism for consuming messages from the queue
  • Size of messages is limited to 64 KB

When you start to calculate the cost of Azure Queue Storage, you should take into account the following things:

  • Capacity (size) 
  • Number of Transactions 
  • Outbound traffic 
  • Traffic between facilities (data centers)

Azure Queue is a scalable and solid queueing and messaging system. It is very simple, but it can be perfect for different scenarios. If you need more features from a queueing system, you should take a look at Service Bus Queues, Topics or Event Hubs.

