
Deep dive into Append Blob - A new type of Azure Storage Blob

In this post we will take a look at a new type of blob - the Append Blob.
Until now, we had two types of blobs in Azure Storage:
  • Page Blob
  • Block Blob

Block Blob is useful when we need to work with large files. Each block can have a different size (max 4 MB) and we can work with each block independently. Transaction-like features are supported at block level. The maximum size of a Block Blob is 195 GB.
Page Blob is optimized for random access. A Page Blob is a collection of 512-byte pages and is useful when we need random read and write access to the content. We can address a specific location of a Page Blob, very much like using a cursor. The maximum size of a Page Blob is 1 TB. When you create a Page Blob you need to specify its maximum size (and you will pay for it from the beginning, even if you don't use all the space).
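
To make the difference between the two existing types more concrete, below is a minimal sketch (the container and blob names are hypothetical, the "FooStorageConnectionString" setting is the one used later in this post, and the Microsoft.WindowsAzure.Storage, Microsoft.WindowsAzure.Storage.Blob, System.IO and System.Text namespaces are assumed). It uploads one block to a Block Blob and writes one page to a Page Blob.

CloudStorageAccount storageAccount = CloudStorageAccount
  .Parse(CloudConfigurationManager.GetSetting("FooStorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("samplecontainer");
container.CreateIfNotExists();

// Block Blob - content is uploaded as independent blocks and committed as a block list
CloudBlockBlob blockBlob = container.GetBlockBlobReference("sampleBlockBlob.dat");
string blockId = Convert.ToBase64String(BitConverter.GetBytes(0));
using (MemoryStream blockStream = new MemoryStream(Encoding.UTF8.GetBytes("first block")))
{
    blockBlob.PutBlock(blockId, blockStream, null);
}
blockBlob.PutBlockList(new[] { blockId });

// Page Blob - the maximum size is declared upfront and must be a multiple of 512 bytes
CloudPageBlob pageBlob = container.GetPageBlobReference("samplePageBlob.vhd");
pageBlob.Create(1024);
using (MemoryStream pageStream = new MemoryStream(new byte[512]))
{
    pageBlob.WritePages(pageStream, 0);
}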

Append Blob

Concept
This new type of blob is formed of multiple blocks. Each time you append a block, it is added to the end of the blob. There is no support for modifying or deleting an existing block. Once a block has been added to the end of the Append Blob, it is immediately available for read operations.
The size of each block is not fixed; the maximum size of a block is 4 MB. The maximum size of an Append Blob is the same as for a Block Blob - 195 GB. We could even say that an Append Blob is like a Block Blob, but optimized for appending blocks at the end of the blob.



Sample Code
The default NuGet library allows us to work with an Append Blob in a way very similar to how we work with a Block Blob. The library allows us to append a stream, an array of bytes, text or even the content of a specific file.
The sample below shows how we can append text to our blob.
// Container names must be lowercase, hence "foocontainer"
CloudStorageAccount cloudStorageAccount = CloudStorageAccount
  .Parse(CloudConfigurationManager.GetSetting("FooStorageConnectionString"));
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("foocontainer");
cloudBlobContainer.CreateIfNotExists();

// Create the Append Blob (or replace an existing one) and append three pieces of text;
// each call adds a new block at the end of the blob
CloudAppendBlob cloudAppendBlob = cloudBlobContainer.GetAppendBlobReference("fooBlob.txt");
cloudAppendBlob.CreateOrReplace();
cloudAppendBlob.AppendText("Content added");
cloudAppendBlob.AppendText("More content added");
cloudAppendBlob.AppendText("Even more content added");

// Download the full content of the blob as a single string
string appendBlobContent = cloudAppendBlob.DownloadText();
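
Besides AppendText, the same blob reference exposes other append flavors. The snippet below is a small sketch (reusing the cloudAppendBlob reference from the sample above, with the System.IO and System.Text namespaces assumed) that appends an array of bytes with AppendFromByteArray and a stream with AppendBlock; keep in mind that a single appended block cannot exceed 4 MB.

byte[] rawData = Encoding.UTF8.GetBytes("Content added as a byte array");
cloudAppendBlob.AppendFromByteArray(rawData, 0, rawData.Length);

using (MemoryStream contentStream = new MemoryStream(Encoding.UTF8.GetBytes("Content added as a stream")))
{
    // AppendBlock commits the whole stream as a single block at the end of the blob
    cloudAppendBlob.AppendBlock(contentStream);
}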

When to use Append Blob?
Append Blob should be used when we need to append content to a blob and we don't care about the order. It is very useful in use cases like logging, where we can have multiple threads and processes that need to write content to the same blob. For a case like this, Append Blob is the perfect solution, allowing us to dump the logs in a fast and safe way.
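
As an illustration, here is a minimal sketch of such a logging scenario (the blob name "app.log" and the number of workers are hypothetical, the cloudBlobContainer reference from the sample above is reused, and System.Threading.Tasks is assumed for Parallel.For). AppendBlock is used because each call commits exactly one block, which fits having multiple writers on the same blob.

CloudAppendBlob logBlob = cloudBlobContainer.GetAppendBlobReference("app.log");
logBlob.CreateOrReplace();

Parallel.For(0, 10, workerId =>
{
    string logLine = string.Format("worker {0}: something happened{1}", workerId, Environment.NewLine);
    using (MemoryStream logStream = new MemoryStream(Encoding.UTF8.GetBytes(logLine)))
    {
        // Each call appends one block at the end of the blob; ordering between workers is not guaranteed
        logBlob.AppendBlock(logStream);
    }
});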

Conclusion
In conclusion, we can say that Append Blob is a type of blob that:

  • Only allows appending content at the end of the blob
  • Has a maximum size of 195 GB
  • Allows us to have multiple writers on different blocks
  • Is perfect for logging scenarios
  • Doesn't allow us to modify or delete an existing block
  • Makes a block available for read the moment it is written
