
I need a cache system. Which Azure service should I use?

There are a lot of caching solutions on the market, and the same is true on Microsoft Azure. At this moment we have four different caching solutions there: Azure Redis Cache, Managed Cache Service, In-Role Cache and Shared Cache.
A question that I often hear from different people is: which cache solution should I use?
It is a fair question, because with so many options it is hard to know what to choose. I will try to describe the benefits of each cache solution that is available on Azure and see which one we should use.

Azure Shared Cache
Azure Shared Cache is the oldest caching solution that was offered on Azure. It was retired this year and is still available only for old clients, through the Silverlight portal.
This solution offered cache as a service, and I can say that I used it with success on three or four projects. The thing that I really enjoyed was the API, which was very simple to use and understand.

In-Role Cache
This caching solution enables us to host the cache inside our own Azure Roles. A great thing was that you could dedicate a share of the RAM that was available on your Azure Roles to the cache; in this way you didn't have to pay extra for these resources (the co-located topology).
If needed, you could even use a dedicated role only for caching (the dedicated topology). The biggest advantage of this solution was that the cache could live in the same role where your application runs, which means minimum latency.

Azure Managed Cache
When this cache service was launched, I saw it as Azure Shared Cache on steroids. It is based on the same principle, cache as a service, but adds a lot of new features on top of the original solution, especially from the performance, scalability and security perspectives.
When Azure Shared Cache was retired, we were able to migrate to this cache service very easily: under two hours of work, including testing (smile).
TIP: If you design your application as you should, separating external services and resources from the core, you will be able to migrate very easily from one service to another.
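The TIP above can be sketched in code. In this minimal, hypothetical example the application core depends only on a small cache interface, so swapping one Azure cache service for another means changing a single concrete class (the names here are illustrative, not a real Azure API):

```python
# Sketch: keep the cache behind a small interface so the backing service
# (In-Role Cache, Managed Cache, Redis, ...) can be swapped without
# touching the application core. All names are illustrative.
from abc import ABC, abstractmethod
from typing import Any, Optional


class Cache(ABC):
    @abstractmethod
    def get(self, key: str) -> Optional[Any]: ...

    @abstractmethod
    def set(self, key: str, value: Any) -> None: ...


class InMemoryCache(Cache):
    """Stand-in backend; a RedisCache class would implement the same interface."""

    def __init__(self) -> None:
        self._data = {}

    def get(self, key: str) -> Optional[Any]:
        return self._data.get(key)

    def set(self, key: str, value: Any) -> None:
        self._data[key] = value


# Application code depends only on the Cache abstraction, so migrating
# from one cache service to another happens in one place.
def get_user_name(cache: Cache, user_id: str) -> Optional[Any]:
    return cache.get(f"user:{user_id}:name")
```

This is exactly what made our own migration from Azure Shared Cache so quick: only the concrete cache class had to change.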

Azure Redis Cache
Yes, Redis Cache: the open-source one that is heavily used across the IT industry at this moment. You can have one or more Redis nodes, with a maximum cache size of 53 GB. What I like about this key/value store is that it is very fast, and its master/slave architecture offers you a persistent cache solution. Features like transactions, TTL (Time To Live) and sorted sets are supported.
Imagine a service based on Redis Cache that is managed by Microsoft; you only need to use it. How great is that? It is important to know that this is the same Redis Cache that is available as open source, not a custom implementation of it.
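To make the TTL feature concrete, here is a rough in-process sketch of how key expiry behaves: each key is stored with an expiry timestamp and disappears once that time passes. This is illustrative only; on Azure Redis Cache the server handles expiry itself (via the EXPIRE/SETEX commands).

```python
import time


class TTLCache:
    """Minimal sketch of TTL semantics, similar in spirit to Redis EXPIRE.

    Illustrative only: a real Redis server expires keys itself.
    """

    def __init__(self) -> None:
        # key -> (value, absolute expiry time on the monotonic clock)
        self._data = {}

    def set(self, key, value, ttl_seconds):
        self._data[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Key has expired; drop it lazily on access.
            del self._data[key]
            return None
        return value
```

A cached value set with a 60-second TTL is returned until the deadline passes, after which lookups behave as if the key was never stored.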

And now, the real question:
Which cache solution should I use?

Well… even if the question sounds tricky, the answer is pretty simple. All the features that are available across the four solutions can be found in Azure Redis Cache.
Azure Shared Cache has already been retired, so you should not use it. Azure Managed Cache is very similar to Azure Redis Cache, but lacks some features like atomic operations, list support, set operations (intersection, union, difference) and many more. From this perspective, Redis is much better than Azure Managed Cache.
In-Role Cache can be cheaper than Redis, because you use resources that are already in your roles, but from a performance perspective there is a huge difference between them. The last reason why I haven't used In-Role Cache until now is the SLA: you don't have an SLA for In-Role Cache, because it is running in your own roles.
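For illustration, the set operations mentioned above behave just like Python's built-in sets; on Azure Redis Cache they map to the SINTER, SUNION and SDIFF commands (the sample data below is made up):

```python
# Two hypothetical Redis sets, shown here as plain Python sets.
online_users = {"ana", "bob", "carol"}
premium_users = {"bob", "carol", "dan"}

# SINTER online_users premium_users -> members present in both sets
both = online_users & premium_users          # {"bob", "carol"}

# SUNION online_users premium_users -> members present in either set
either = online_users | premium_users        # {"ana", "bob", "carol", "dan"}

# SDIFF online_users premium_users -> members only in the first set
only_online = online_users - premium_users   # {"ana"}
```

These are exactly the operations that Azure Managed Cache lacks, which is part of why Redis is the richer choice.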

Conclusion
If you are starting a new project at this moment, you should go for Azure Redis Cache. Also keep in mind that the cloud is very dynamic, so separate the cache service from your application as much as possible. In this way you will be open to change with minimal cost.
