
Where can I back up my content - Azure File Storage

Nowadays taking pictures is so easy that you can end up with 20-30 shots per day. After a few years of using external HDDs as backups, I realized it was time for a change.
Until now I used OneDrive to back up pictures from my phone, plus 3 external HDDs kept in mirror. This setup works great as long as you have enough storage on your external HDDs.
But now my OneDrive storage is 99.8% full (out of 30GB) and the external storage is almost full as well.

What do I need?
I looked for a solution that is cheap, can be accessed from anywhere, and doesn't require installing custom software on my machine. The ideal solution would let me attach the storage as a partition.
Buying external HDDs is not a solution for me anymore. I would prefer a cloud storage provider that can offer reliable storage at a good price.

Azure Offer
An interesting offer comes from Microsoft Azure, more exactly from Azure File Storage. The service is part of Azure Storage and allows us to access Azure Storage as a file system, with the classical folders, files and so on.

Azure File
The concept is simple and powerful. Under your Azure Subscription you can create a Storage Account that can be used to store files under a directory structure. You can upload, access, and remove content from any location.
From a technical perspective, each 'drive' (if we can call it that) that you create under a Storage Account is called a Share. The maximum size is 5TB, which is more than enough. Under a Share you can have as many directories as you want. The number of files is not limited, but the maximum size of a single file is 1TB. When you create a new file share you need to specify the quota (the maximum size of the file share). This quota can be changed anytime without having to recopy the content.
For my personal needs, this is more than enough. If I ever need more than 5TB, I can create multiple Shares under the same Storage Account.
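As a sketch of how this looks from the command line (assuming the Azure CLI is installed and the Storage Account already exists; the account name, key, and share name below are hypothetical placeholders):

```shell
# Create a file share with a 5 TB quota (the --quota value is in GiB).
# 'mybackupaccount', 'pictures' and the key are placeholders.
az storage share create \
    --name pictures \
    --account-name mybackupaccount \
    --account-key "<storage-account-key>" \
    --quota 5120

# The quota can be changed later without recopying any content:
az storage share update \
    --name pictures \
    --account-name mybackupaccount \
    --account-key "<storage-account-key>" \
    --quota 2048
```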

Price
This is one of the things that attracted me. There are multiple tiers and redundancy options you can use: from 3 replicas in the same Azure Region you can go up to more complex ones where content is also replicated to another Azure Region with read access available there (RA-GRS).
Based on how often you access your data, there are two access tiers: Cool and Hot. The Hot tier is used when content is accessed and updated often. For backup scenarios like mine, the Cool tier is a better option, since this kind of backup is not accessed often. In my case, I don't read the content more than once per month.
For this kind of backup the best solution is Cool storage combined with Geo-Redundant Storage (GRS). This means the storage is replicated to two different Azure Regions, and in each Azure Region there are 3 different copies of your content.
When you calculate the cost you need to take into account 3 different things:

  • Storage cost (€1.69 per month for 100GB)
  • Transaction cost (€0.16 or €0.0084 per 100,000 transactions, depending on the transaction type)
  • Data Read (€0.0084 per GB) or Data Write (€0.0042 per GB)
For normal use, I calculated that I will never pay more than €1.8 per month for 100GB. This means the cost per year is under €22, in the context where the content is replicated in two different locations and is accessible from anywhere.
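As a back-of-the-envelope calculation for my scenario (100GB stored, roughly 100,000 write transactions, and the full 100GB uploaded once), using the figures quoted above rather than official price-sheet numbers:

```shell
# Rough first-month cost for 100 GB on a Cool GRS share, prices as above.
storage=1.69                                            # EUR/month for 100 GB stored
transactions=0.16                                       # EUR per 100,000 write transactions
writes=$(awk 'BEGIN { printf "%.2f", 100 * 0.0042 }')   # one-time upload of 100 GB
total=$(awk -v s="$storage" -v t="$transactions" -v w="$writes" \
        'BEGIN { printf "%.2f", s + t + w }')
echo "$total"   # prints 2.27 - first month only; later months drop the upload cost
```

After the initial upload, the recurring cost is essentially storage plus a small transaction fee, which is how the monthly figure stays under €2.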

Content access
The content can be accessed directly from the Azure Portal, but of course you don't want to do that when you have 50,000 files.
On top of this we have Microsoft Azure Storage Explorer. This nice tool allows us to access content directly from the desktop, with full support for Windows, Linux and Mac. The experience is nice and similar to the one you have in File Explorer.
To access your content from this tool you can use your Azure account credentials or the Azure Storage account name and access key. If this is your first time using Azure Storage, I recommend checking a nice tutorial provided by Microsoft - About Azure storage accounts.
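Uploads can also be scripted instead of going through Storage Explorer. A minimal sketch, again assuming the Azure CLI and using hypothetical names:

```shell
# Upload one picture to the root of the 'pictures' share.
# Account name, key and file name are placeholders.
az storage file upload \
    --share-name pictures \
    --source ./IMG_0001.jpg \
    --account-name mybackupaccount \
    --account-key "<storage-account-key>"
```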

This is not all. You can attach the file storage as a partition on your system, with only one command executed in the command line. This magic happens because Azure File Storage supports the SMB 3.0 protocol. This means that you can mount the file share on Windows 7+ or Windows Server 2008+ operating systems, or on Linux machines.
// Windows Machine
net use v: \\zzzz.file.core.windows.net\pictures /u:zzzz azurestoragekey==

// Linux Machine
sudo mount -t cifs //zzz.file.core.windows.net/pictures [mount point] \
    -o vers=3.0,username=zzzzz,password=azurestoragekey==,dir_mode=0777,file_mode=0777

Simple as that, you can attach Azure File Storage as a partition.
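To make the mount survive reboots (a sketch only, reusing the placeholder names from above): on Windows the credentials can be stored with cmdkey, and on Linux the share can be added to /etc/fstab:

```shell
# Windows: persist the credentials so the drive reconnects after reboot
cmdkey /add:zzzz.file.core.windows.net /user:zzzz /pass:azurestoragekey==

# Linux: /etc/fstab entry (a single line) so the share is mounted at boot.
# Real setups usually move the password into a credentials file instead.
//zzz.file.core.windows.net/pictures /mnt/pictures cifs vers=3.0,username=zzzzz,password=azurestoragekey==,dir_mode=0777,file_mode=0777 0 0
```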

Conclusion
Azure File Storage is a good option if you need to back up your content. The price per month is low, and being able to attach it as a partition on my computer makes this solution the best one on the market for me.

Comments

  1. Sounds too good to be true - I wonder what the terms & conditions say about long-term data storage - are they committed to storing that data for ~10 years, or was this designed just as a short-term storage solution? :)

  2. As long as this service exists, we will be able to back up our content this way. I'm pretty excited about this backup solution; it is not as powerful and easy to use as OneDrive, which is for consumers, but it is perfect for me.

  3. Flickr offers 1TB for free. Then again, who knows what will happen to it with all the turbulence around Yahoo...

    1. Yes, they are a good option if you only have pictures. If you also have documents or other materials, then you might want to use something else.
      There are a lot of options on the market, free or cheap. I tried to look for a solution where the backup is guaranteed.


