
Some scenarios where we can use Shared Access Signatures in Windows Azure


In this post we will look at some possible scenarios where a Shared Access Signature can be used. We will discuss a different scenario for each of blobs, tables, and queues.

Blobs

Imagine we are a graphic designer who creates templates for web applications. We decide to offer this content on a subscription basis, paid per template package. A template package can contain, for example, 100 web application templates.
How can we share this content easily with our clients? A simple solution is to use blobs and Shared Access Signatures. For each client we create an access token that allows access only to the templates they paid for. On top of this structure we build a web application that lets a user download the template packages after entering the access token that was sent to them by email.
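The core idea behind such a token can be sketched as follows. Note that this is not the real Azure SAS format: the key, the field names, and the string-to-sign layout are all made up for illustration. The only point is that the server signs a description of the granted access (permission, expiry, resource) with its secret key, and hands out the signature instead of the key itself:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone

# Assumption: a demo key, not a real Azure account key.
ACCOUNT_KEY = b"demo-secret-account-key"

def make_read_token(blob_path: str, valid_days: int) -> str:
    """Build an illustrative SAS-style token: read-only, time-limited, per-resource."""
    expiry = (datetime.now(timezone.utc) + timedelta(days=valid_days)).strftime("%Y-%m-%dT%H:%M:%SZ")
    # Sign permission + expiry + resource path with the account key.
    string_to_sign = "\n".join(["r", expiry, blob_path])  # "r" = read-only
    signature = base64.b64encode(
        hmac.new(ACCOUNT_KEY, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return f"sp=r&se={expiry}&sig={signature}"

# One token per client, covering only the package they paid for.
token = make_read_token("templates/premium-package", valid_days=30)
print(token)
```

Because the signature covers the expiry and the resource path, a client cannot extend the validity or point the token at a package they did not pay for without invalidating the signature.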

Table

Let's imagine an application that stores weekly stock report information in Azure tables. Generating this information requires a lot of computation power, so the company decides to sell this valuable content on a weekly subscription. A client can access only the information for the weeks they paid for.
One solution could be to write a service that retrieves this content for a given password. To do this we would need to register our clients, authenticate them on every request, and so on.
A simpler solution is to store this content in Azure tables and use Shared Access Signatures to give our clients read-only access. Based on a unique token, a client can access and query this information in any way they want. From our perspective, we can define for each table which partition keys and row keys a client can access. Each week, all we need to do is update the Shared Access Signature for the clients who paid the subscription for the next week. This can be done automatically without any problems.
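The partition-key restriction can be sketched like this. This is a simplified mock of the server-side check, not the Azure table service itself; it assumes, purely for illustration, that the PartitionKey is an ISO week label such as "2013-W07" and that the token carries a start/end range:

```python
# Assumption for the sketch: PartitionKey = ISO week label, e.g. "2013-W07".

def grant_weeks(start_week: str, end_week: str) -> dict:
    """Build the illustrative PartitionKey range claim baked into a client's token."""
    return {"start_pk": start_week, "end_pk": end_week}

def can_read(token: dict, partition_key: str) -> bool:
    """Server-side check: is the queried week inside the token's paid range?"""
    return token["start_pk"] <= partition_key <= token["end_pk"]

token = grant_weeks("2013-W05", "2013-W08")  # four paid weeks
print(can_read(token, "2013-W06"))  # True: inside the paid range
print(can_read(token, "2013-W09"))  # False: a week the client did not pay for
```

Renewing a subscription then simply means issuing a new token with a wider week range; no client registration or per-request authentication is needed on our side.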

Queues

We can imagine that we are the biggest ice producer in our region, making ice for other companies that sell it to the general public (we produce it at 5 locations). We build a system that processes each order in the order of arrival. Our partners submit a request on our web site, and based on their location and our current load we allocate them to a specific ice factory. Each ice factory has its own queue from which it processes the orders.
In the summer, demand for ice is so high that we cannot deliver it all. Because of this we decide to work with small ice factories that can help us. A solution for sharing all the ice orders, without sharing client and price information, is to use Shared Access Signatures over queues.
Each of our partners gets limited access to the orders queue and produces ice for us for a specific period of time. When we no longer need their help, we can revoke their access to the order queues.
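The revocation idea can be sketched as follows. This is an illustrative mock, not the Azure API: instead of baking permissions into each token, the token references a named server-side access policy, so deleting that one policy immediately invalidates every token that points at it:

```python
# Server-side registry of named access policies (hypothetical names and fields).
policies = {"summer-partners": {"queue": "ice-orders", "permissions": "process"}}

def token_is_valid(policy_id: str) -> bool:
    """A token is only honored while the policy it references still exists."""
    return policy_id in policies

print(token_is_valid("summer-partners"))  # True: partners may process orders

# Season is over: one delete revokes access for all partner tokens at once.
del policies["summer-partners"]
print(token_is_valid("summer-partners"))  # False
```

This mirrors how a stored access policy works in Azure storage: the signature stays mathematically valid, but the permissions it refers to are gone, so access is denied.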

There are many more scenarios we could imagine. Don't expect Shared Access Signatures to be the solution to every problem. But there will be times when this is the simplest solution, when we don't need to implement anything extra and can use an out-of-the-box mechanism.

Tutorials about Shared Access Signatures:
  1. Overview
  2. How to use Shared Access Signature with tables from Windows Azure
  3. How to use Shared Access Signature with blobs from Windows Azure
  4. How to use Shared Access Signature with queues from Windows Azure
  5. How to remove or edit a Shared Access Signature from Windows Azure 
  6. Some scenarios when we can use Shared Access Signature from Windows Azure
