
Where can I back up my content - Azure File Storage

Nowadays taking pictures is so easy that you can end up with 20-30 shots per day. After a few years of using external HDDs as backups, I realized it was time for a change.
Until now I used OneDrive to back up the pictures from my phone, plus 3 external HDDs kept in mirror. This setup works great as long as you have enough storage on your external HDDs.
But now my OneDrive storage is 99.8% full (out of 30GB) and the external storage is almost full as well.

What do I need?
I looked for a solution that is cheap, can be accessed from anywhere, and does not require installing custom software on my machine. The ideal solution would let me attach the storage as a partition.
Buying external HDDs is not a solution for me anymore. I would prefer a cloud storage provider that can offer me reliable storage for a good price.

Azure Offer
An interesting offer comes from Microsoft Azure, more exactly from Azure File Storage. The service is part of Azure Storage and allows us to access Azure Storage as a file system, with the classical folders, files and so on.

Azure File
The concept is simple and powerful. Under your Azure subscription you can create a Storage Account that can be used to store files in a directory structure. You can upload, access and remove content from any location.
From a technical perspective, each 'drive' (if we can call it that) that you create under a Storage Account is called a Share. The maximum size of a Share is 5TB, which is more than enough. Under a Share you can have as many directories as you want. The number of files is not limited, but the maximum size of a single file is 1TB. When you create a new file share you need to specify a quota (the maximum size of the file share). This quota can be changed at any time without having to recopy the content.
For my personal needs, this is more than enough. If I ever need more than 5TB, I can create multiple Shares under the same Storage Account.

This is one of the things that attracted me. There are multiple redundancy tiers you can use: from 3 replicas in the same Azure Region you can go up to more complex ones where the content is replicated to another Azure Region, with read access available there as well (RA-GRS).
Based on how often you access your data, there are two access tiers that can be used: Cool or Hot. The Hot tier is used when the content is accessed and updated often. For backup scenarios like mine, Cool storage is the better option, since this kind of backup is not accessed often. In my case, I don't read the content more than once per month.
For this kind of backup the best solution is Cool storage that is Geo-Redundant (GRS). This means that the storage is replicated in two different Azure Regions, and in each Azure Region there are 3 different copies of your content.
When you calculate the cost you need to take into account 3 different things:

  • Storage cost (€1.69 per month for 100GB)
  • Transaction cost (€0.16 or €0.0084 per 100,000 transactions, based on transaction type)
  • Data Read (€0.0084 per GB) or Data Write (€0.0042 per GB)
For normal use I calculated that I will never pay more than €1.8 per month for 100GB. This means that the cost per year is under €18, in a context where the content is replicated in two different locations and is accessible from anywhere.
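To sanity-check that estimate, here is a rough calculation using the prices listed above. The monthly transaction and transfer volumes are my own assumptions for a once-a-month backup routine, not measured values:

```python
# Rough monthly cost estimate for ~100GB of Cool, geo-redundant file storage.
# Unit prices are the ones quoted above; the volumes are assumed values.
storage_cost = 1.69                   # €1.69 per month for 100GB

transactions = 10_000                 # assumed operations per month
transaction_cost = 0.16 * (transactions / 100_000)  # worst-case rate

data_written_gb = 10                  # assumed new pictures uploaded per month
write_cost = 0.0042 * data_written_gb

data_read_gb = 1                      # assumed occasional restore or check
read_cost = 0.0084 * data_read_gb

total = storage_cost + transaction_cost + write_cost + read_cost
print(f"Estimated monthly cost: €{total:.2f}")
```

Even with the worst-case transaction rate, the monthly total stays comfortably under the €1.8 figure.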

Content access
The content can be accessed directly from the Azure Portal, but of course you don't want to do that when you have 50,000 files.
On top of this we have Microsoft Azure Storage Explorer. This nice tool allows us to access the content directly from the desktop, with full support for Windows, Linux and Mac. The experience is nice and similar to the one you have in File Explorer.
To access your content from this tool you can use your Azure account credentials or the Azure Storage account name and access key. If this is your first time using Azure Storage, I recommend checking the nice tutorial provided by Microsoft - About Azure storage accounts.
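As a reference, every file share is exposed at a fixed endpoint derived from the Storage Account name. The account and share names below are hypothetical, used only to illustrate the `<account>.file.core.windows.net` naming pattern:

```python
# Hypothetical names, for illustration only.
account = "mybackups"
share = "pictures"

# HTTPS endpoint used by Storage Explorer and REST clients.
https_url = f"https://{account}.file.core.windows.net/{share}"

# UNC path you would map as a network drive on Windows.
unc_path = rf"\\{account}.file.core.windows.net\{share}"

print(https_url)
print(unc_path)
```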

This is not all. You can attach the file storage as a partition on your system, with only one command executed at the command line. This magic can happen because Azure File Storage supports the SMB 3.0 protocol. This means that you can mount the file share on Windows 7+ or Windows Server 2008+ operating systems, or on Linux machines.
(replace <storage-account> and <storage-account-key> with your own Storage Account name and access key; 'pictures' is the name of the share)

// Windows Machine
net use v: \\<storage-account>.file.core.windows.net\pictures /u:<storage-account> <storage-account-key>

// Linux Machine
sudo mount -t cifs //<storage-account>.file.core.windows.net/pictures [mount point] -o vers=3.0,username=<storage-account>,password=<storage-account-key>,dir_mode=0777,file_mode=0777

As simple as that, you can attach Azure File Storage as a partition.

Azure File Storage is a good option if you need to back up your content. The price per month is low, and being able to attach it as a partition to my computer makes this, for me, the best solution on the market.


  1. Sounds too good to be true - I wonder what the terms & conditions say about long-term data storage - are they committed to storing that data for ~10 years, or was this designed just as a short-term storage solution? :)

  2. As long as this service exists we will be able to back up our content in this way. I'm pretty excited about this backup solution; it is not as powerful and easy to use as OneDrive, which is aimed at consumers, but it is perfect for me.

  3. Flickr offers 1TB for free. Then again, who knows what will happen to it with all the turbulence around Yahoo...

    1. Yes, they are a good option if you only have pictures. If you also have documents or other materials, then you might want to use something else.
      There are a lot of options on the market, free or cheap. I tried to look for a solution where backup is guaranteed.

