Azure Tools - Azure SAS Generator

I decided to start a new series of articles about tools that developers can use to improve their experience with Microsoft Azure. Each week I will publish a post about one tool. The main focus is on tools that are free of charge, but from time to time I might include a paid one.
Context: Having discussions with various technical people, especially from the development area, I realized that the tools they are aware of are limited. There are so many tools that can improve our experience when we work with Microsoft Azure, yet many times we rely on only a few of them.

Highlights of Azure SAS Generator

Azure Services: Azure Storage (blobs, tables and queues)
Cost: free of charge
How it is delivered: native Windows application
Top 3 features:
  • #1 SAS key generator
  • #2 Works offline
  • #3 Updates the SAS key dynamically when you change an attribute

Pain points:
  • #1 Sometimes freezes and does not generate the key
  • #2 Cannot add accounts using your Azure account credentials
  • #3 Does not load resources dynamically from the storage account

Credits: Jeffrey M Richter (https://twitter.com/jeffrichter)

Today I will start with Azure SAS Generator, a tool that enables us to generate Shared Access Signatures (SAS) for Azure Storage. Using this tool you can generate a SAS for containers, blobs, tables and queues.

What I like about this tool is the way the SAS is generated: each time you change an attribute of the key, the generated key is updated automatically. Clean and easy to use, it is the kind of tool that I highly recommend when you want to learn how a SAS is generated, what it contains and what its attributes are. You also have the ability to specify the version of the SAS key you want to use. In general we use the latest one, but there might be situations when you need an older version.
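To make this concrete, here is the shape of a blob SAS URL the tool produces (the account, container, dates and signature below are illustrative; only the query parameter names matter):

https://myaccount.blob.core.windows.net/mycontainer/readme.txt
    ?sv=2015-04-05              (version of the SAS)
    &sr=b                       (signed resource: b = blob)
    &sp=r                       (signed permissions: read)
    &st=2015-01-01T00:00:00Z    (start time, optional)
    &se=2015-01-02T00:00:00Z    (expiry time)
    &sig=...                    (the signature, computed from the account key)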
The tool is not perfect; sometimes you might find that the key is not generated. In these situations, change the Resource Type from Blob to Table and back to Blob. This action should fix the problem. Adding new storage accounts can be done only using the account name and key. You might prefer to log in with your Azure account credentials, but at least this way you know at all times which account key is used to generate the SAS key.

TIP: SAS keys are generated using the account key. When you regenerate the account key, all the SAS keys generated with that account key are invalidated.

The application can be used to generate SAS keys in offline mode, because generating this kind of access key does not require a call to Azure Storage - only the account key is needed to compute the signature of the SAS (the "sig" parameter).
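To illustrate this (and the TIP above), here is a minimal Python sketch of the signing step. The layout of the string-to-sign below is simplified and illustrative; the real one is a newline-separated list of SAS attributes whose exact fields depend on the SAS version. Everything runs locally, no call to Azure is made, and the signature depends only on the account key.

import base64
import hashlib
import hmac

def sign_sas(account_key_b64, string_to_sign):
    # Decode the base64-encoded storage account key to raw bytes.
    key = base64.b64decode(account_key_b64)
    # "sig" is the base64-encoded HMAC-SHA256 of the UTF-8 string-to-sign.
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

# Simplified, illustrative string-to-sign (real layout depends on the SAS version).
string_to_sign = "\n".join([
    "r",                                  # signed permissions (read)
    "",                                   # signed start (optional)
    "2015-01-02T00:00:00Z",               # signed expiry
    "/myaccount/mycontainer/readme.txt",  # canonicalized resource
    "",                                   # signed identifier (stored access policy)
    "2015-04-05",                         # signed version
])

# A made-up account key; regenerate the real key and every SAS signed with it breaks.
print(sign_sas("bXktZmFrZS1hY2NvdW50LWtleQ==", string_to_sign))

Because the signature is just an HMAC over the attributes, changing any attribute means recomputing this one function, which is why the tool can update the generated key instantly as you type.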
One property in the interface has a strange name - "Signature identifier". It represents the stored access policy (SAS policy), in case you want to use one. The app is not able to load the SAS policies that exist for a specific account, but you can type the identifier manually.
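For example, a SAS that relies on a stored access policy carries only the policy name in the "si" parameter; the permissions and expiry come from the policy saved on the container ("read-only-policy" is a made-up identifier):

?sv=2015-04-05&sr=c&si=read-only-policy&sig=...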
