
Azure Tools - Azure SAS Generator

I decided to start a new series of articles about tools that Developers can use to improve their experience with Microsoft Azure. Each week I will publish a post about one tool. The main focus is on tools that are free of charge, but sometimes I might include paid ones.
Context: Having discussions with various technical people, especially from the development area, I realized that the tools they are aware of are limited. There are many tools that can improve our experience when working with Microsoft Azure, yet we often rely on only a few of them.

Highlights of Azure SAS Generator

Azure Services: Azure Storage (containers, blobs, tables and queues)
Cost: free of charge
How it is delivered: Native Windows application
Top 3 features:
  • #1 SAS key generator
  • #2 Works offline
  • #3 Updates the SAS key dynamically when you change an attribute

Pain points:
  • #1 Sometimes freezes and does not generate the keys
  • #2 Cannot add storage accounts by signing in with your Azure credentials
  • #3 Does not load the resources dynamically from the storage account

Credits: Jeffrey M Richter (https://twitter.com/jeffrichter)

Today I will start with Azure SAS Generator, which enables us to generate Shared Access Signatures (SAS) for Azure Storage. Using this tool, you can generate a SAS for Containers, Blobs, Tables and Queues.

What I like about this tool is the way the SAS is generated. Each time you change an attribute of the key, it automatically updates the generated key. Clean and easy to use, it is the kind of tool I highly recommend when you want to learn how a SAS is generated, what it contains and what its attributes are. You also have the ability to specify the SAS version you want to use. In general we use the latest one, but there might be situations when you need an older version.
The tool is not perfect; sometimes you might find that the key is not generated. In these situations, change the Resource Type from Blob to Tables and back to Blob. This action should fix the problem. Adding new storage accounts can be done only using the account name and key. You might prefer to log in with your Azure credentials, but at least this way you always know which account key is used to generate the SAS key.
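If you want to see what the tool computes under the hood, below is a minimal sketch of generating a blob SAS in C# with the classic WindowsAzure.Storage client library. The connection string, container and blob names are placeholders for illustration.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class SasSample
{
    static void Main()
    {
        // Placeholder connection string - put your own account name and key here.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");

        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("samples");
        CloudBlockBlob blob = container.GetBlockBlobReference("report.pdf");

        // The same attributes you set in the tool: permissions, start time, expiry time.
        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5),
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        };

        // The SAS is returned as a query string that you append to the blob URI.
        string sasToken = blob.GetSharedAccessSignature(policy);
        Console.WriteLine(blob.Uri + sasToken);
    }
}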

TIP: SAS keys are generated using the account key. When you regenerate the account key, all the SAS keys generated with that account key are invalidated.

The application can be used to generate SAS keys in offline mode, because generating this kind of access key does not require a call to Azure Storage. Only the account key is needed to compute the signature of the SAS, the "sig" parameter.
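The signature is just an HMAC-SHA256 over a string-to-sign built from the SAS attributes, keyed with the account key, which is why no network call is needed. Below is a simplified sketch; the exact fields and their order in the string-to-sign depend on the storage service version, so treat it as an illustration rather than the full format.

using System;
using System.Security.Cryptography;
using System.Text;

static class SasSignature
{
    // Simplified illustration: the real string-to-sign contains more fields
    // (permissions, start/expiry times, resource, protocol, ...) in an order
    // defined by the storage service version.
    public static string ComputeSig(string accountKey, string stringToSign)
    {
        byte[] key = Convert.FromBase64String(accountKey); // the account key is base64 encoded
        using (var hmac = new HMACSHA256(key))
        {
            byte[] signature = hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign));
            return Convert.ToBase64String(signature);      // this becomes the "sig" query parameter
        }
    }
}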
One property of the SAS key has a strange name in the interface, "Signature identifier". It represents the stored access policy (SAS policy), in case you want to use one. The app is not able to load the SAS policies that already exist for a specific account, but you can type the identifier manually.
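If you want to create such a policy yourself, here is a sketch with the same classic client library; the container name and the policy identifier "read-only-policy" are assumptions for illustration.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class PolicySample
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("samples");

        // Create (or update) a stored access policy on the container.
        BlobContainerPermissions permissions = container.GetPermissions();
        permissions.SharedAccessPolicies["read-only-policy"] = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddDays(7)
        };
        container.SetPermissions(permissions);

        // Generate a SAS that references the policy through its identifier instead of
        // embedding the permissions and expiry time in the token itself.
        string sasToken = container.GetSharedAccessSignature(
            new SharedAccessBlobPolicy(), "read-only-policy");
        Console.WriteLine(container.Uri + sasToken);
    }
}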
