
Cloud (re)patriation and Hybrid Cloud

The first cloud migrations date from the early 2000s, when the first cloud providers emerged. Since then, cloud adoption has become part of doing business, helping companies grow and adopt emerging technologies. Cloud computing has changed how businesses operate, providing agility, efficiency, and scalability.

Cloud repatriation has emerged in the last few years. It involves moving data or workloads from the cloud back to on-premises infrastructure. According to the Enterprise Strategy Group (ESG), over 60% of companies have repatriated at least one workload back to on-premises.

A well-known example of cloud repatriation is Dropbox. In 2016, the company shifted workloads from AWS to its own data centres, driven by economics and by the need for better operational flexibility and control. Seen from the cloud adoption point of view, repatriation was simply part of their cloud journey: today, Dropbox combines multiple cloud vendors with on-premises partners to cover business-specific needs.

The leading causes of cloud repatriation

The top concerns that push companies towards this strategic shift are cost, efficiency, compliance, security and control.

The cloud is usually adopted for its cost-saving potential, yet 'bill shock' and a lack of visibility and control over expenses often lead to cloud repatriation.
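
Regaining cost visibility is usually the first step before any repatriation decision. As a minimal sketch in Python, assuming an AWS estate with boto3 installed, credentials and a default region configured, and Cost Explorer enabled (the date range is illustrative), monthly spend per service can be pulled like this:

# Minimal sketch: last month's AWS spend grouped by service.
# Assumes boto3 is installed, credentials and a default region are
# configured, and Cost Explorer is enabled on the account.
import boto3

ce = boto3.client("ce")
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-05-01", "End": "2024-06-01"},  # illustrative range
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print each service and its cost, largest first.
groups = response["ResultsByTime"][0]["Groups"]
groups.sort(key=lambda g: float(g["Metrics"]["UnblendedCost"]["Amount"]),
            reverse=True)
for g in groups:
    amount = float(g["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{g['Keys'][0]}: ${amount:,.2f}")

Even a report this small makes 'bill shock' discussions concrete: the conversation moves from "the cloud is expensive" to which services actually drive the bill.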

The strong security baseline provided by the cloud is not always enough for industries with strict data-security and compliance requirements. From this perspective, cloud repatriation is a mechanism to mitigate security risk and align with regulations.

Customization, control and performance are limited in a cloud environment, especially for businesses whose needs do not fit a one-size-fits-all approach. The latency introduced by the cloud can affect critical applications and degrade performance, so businesses might prefer to run such applications at the edge or locally to ensure high performance.

Vendor lock-in concerns, together with migration complexity, effort and business disruption, can also cause businesses to rethink their operations and data strategy.

Potential pitfalls of cloud repatriation

The most commonly underestimated component is infrastructure cost. Rebuilding and maintaining an on-premises data centre, including hardware, software and maintenance, can cost much more than initially estimated, and the skilled staff needed to run an on-premises solution may no longer be in-house.

Migrating data back to on-premises is not a simple process, and it is often not planned carefully; underestimating it can lead to data loss, business disruption and additional costs. Cloud platforms also have built-in disaster-recovery capabilities that organizations take for granted. During repatriation, disaster recovery needs to be redesigned, with a direct impact on cost and complexity.
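
Egress cost and transfer time are two migration numbers that can at least be estimated up front. A back-of-the-envelope sketch; the data volume, the $0.09/GB egress rate and the 1 Gbps sustained link are all assumptions, not quotes from any provider:

# Back-of-the-envelope repatriation transfer estimate (all figures assumed).
DATA_TB = 500                    # data to move back on-premises
EGRESS_USD_PER_GB = 0.09         # illustrative egress price, not a real quote
LINK_GBPS = 1.0                  # sustained network throughput

data_gb = DATA_TB * 1024
egress_cost = data_gb * EGRESS_USD_PER_GB

# Transfer time: bits to move divided by the sustained link rate.
seconds = (data_gb * 8) / LINK_GBPS   # GB * 8 = gigabits; Gbps gives seconds
days = seconds / 86_400

print(f"Egress cost: ${egress_cost:,.0f}")
print(f"Transfer time at {LINK_GBPS} Gbps: ~{days:.0f} days")

With these assumed figures, the egress bill is roughly $46,000 and the copy alone takes about 47 days, exactly the kind of number that should be on the table before committing to a migration window.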

Keeping an on-premises solution compliant with regulatory standards can also be challenging. Regular audits increase the running costs and the effort required to maintain the data and storage layer.

We all know that the cloud's cost model is complex and challenging to manage. Still, focusing only on cost and ignoring the business benefits of the cloud can lead to strategic mistakes: the cloud provides speed, scalability, flexibility and fast access to new technologies.

Cloud repatriation evaluation

A cloud repatriation assessment needs to consider all the key aspects that can influence the decision.

Start with a comprehensive cost-benefit analysis that covers spending across all cloud accounts and the investment an on-site data centre would require, including hardware, licenses, upgrades and power.
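
One way to keep that analysis honest is to amortize the on-premises capital outlay over the evaluation period and set it against the recurring cloud bill. A sketch in Python in which every figure is a made-up assumption:

# Three-year cost comparison sketch; every number below is an assumption.
YEARS = 3
MONTHS = YEARS * 12

cloud_monthly = 40_000           # average monthly cloud bill
hardware_capex = 600_000         # servers, storage, network gear
licenses_per_year = 80_000       # software licenses and support
power_cooling_monthly = 6_000    # power, cooling, colocation
staff_monthly = 25_000           # additional operations staff

cloud_total = cloud_monthly * MONTHS
onprem_total = (hardware_capex
                + licenses_per_year * YEARS
                + (power_cooling_monthly + staff_monthly) * MONTHS)

print(f"Cloud over {YEARS} years:   ${cloud_total:,}")    # $1,440,000
print(f"On-prem over {YEARS} years: ${onprem_total:,}")   # $1,956,000

With these particular assumptions the on-premises option comes out more expensive, which illustrates the earlier point: the result swings entirely on the inputs, so they must be gathered per organization rather than borrowed from someone else's case study.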

Evaluating performance against business requirements is equally essential. When factors like latency hurt the user experience or operational efficiency, cloud repatriation could be considered.
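
Latency is worth measuring rather than assuming. A minimal probe, sketched in Python, that times TCP handshakes to a candidate endpoint; the host and port are placeholders:

# Minimal latency probe: time TCP handshakes to an endpoint.
# HOST and PORT are placeholders, not recommendations.
import socket
import statistics
import time

HOST, PORT, SAMPLES = "example.com", 443, 10

samples_ms = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        pass                         # handshake done; close immediately
    samples_ms.append((time.perf_counter() - start) * 1000)
    time.sleep(0.2)                  # small gap between probes

print(f"median {statistics.median(samples_ms):.1f} ms, "
      f"max {max(samples_ms):.1f} ms over {SAMPLES} samples")

Running the same probe from the cloud region, from the office and from a candidate edge location gives comparable numbers to set against the business requirement.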

Vendor lock-in, the need for technology flexibility, and the effort of transitioning to another provider also weigh in favour of repatriation, which can help the customer regain control over their technology choices.

Repatriation provides much better control for businesses that operate in highly regulated industries or face strict data sovereignty requirements; those that need a high level of control and customization would prefer on-premises configurations. At the same time, skill availability during repatriation can be a real problem that influences costs and hiring strategy.

One common question about cloud repatriation is which option is cheaper. Depending on the organization's context and type of business, repatriation can be cheaper or more expensive: it tends to be cheaper when data-transfer costs are significant, workloads are consistent, and a long-term perspective is taken, while high scalability, low maintenance cost and cutting-edge technologies remain easier to access in the cloud.
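
Framed that way, the question often reduces to a break-even point: after how many months does the avoided cloud bill pay back the up-front investment? A sketch reusing the same kind of made-up figures as above:

# Break-even sketch: months until avoided cloud spend repays the capex.
def breakeven_months(capex, cloud_monthly, onprem_monthly):
    """Return months to break even, or None if on-prem running costs
    are not lower than the cloud bill (repatriation never pays off
    on cost alone)."""
    monthly_saving = cloud_monthly - onprem_monthly
    if monthly_saving <= 0:
        return None
    return capex / monthly_saving

# Illustrative numbers only, matching the earlier sketch.
months = breakeven_months(capex=600_000, cloud_monthly=40_000,
                          onprem_monthly=31_000)
print(f"Break-even after ~{months:.0f} months")   # ~67 months

A horizon of roughly five and a half years is exactly why the long-term perspective matters: for a workload that may be re-architected or retired sooner, the investment never pays back.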

Conclusion

Cloud repatriation is not an all-or-nothing decision. A hybrid approach is a much better strategy, where workloads move between on-premises and the cloud in either direction, depending on the business driver. Business needs should drive every decision about moving to the cloud or repatriating.

Each organization must evaluate its unique needs, operational expectations, potential growth, budget and technical constraints, and identify the best investment. Cloud repatriation is part of the cloud journey and reminds us that decisions should not be binary. A hybrid cloud strategy driven by operational needs should anchor the cloud adoption journey, and migration to the cloud or back on-premises should be seen as an ongoing process that evolves over time together with the business.
