For the last decade, the 6 Rs of cloud migration (Rehost, Replatform, Refactor, Repurchase, Retain, and Retire) have described how enterprises should adopt the cloud. However, with AI now central to digital transformation, these Rs alone are no longer sufficient. Cloud migration is just the first step; true AI-Native status requires a deeper cloud-native transformation.
Customers who label their migrations as Cloud-Native often still run applications that behave like on-premises systems, resulting in manual operations, static setups, and locked-away data that hinder AI programs.
This is where a new perspective is required: building AI capabilities on top of the 6 Rs.
A purely cloud-native rebuild is rarely realistic for large enterprises. The practical goal is to identify the gaps in each R and what is needed to prepare for AI integration.
In the next part of the article, each R will be analysed in terms of AI-Native needs.
Rehost
The workloads are moved, but nothing changes inside the system. Rehost does not provide enough ground for structural change, but three things can be achieved easily:
- Telemetry to instrument the platform and gain better insights
- GPU capacity to run AI workloads
- FinOps to track costs and gain visibility into spend
This approach enables current and future workloads to adopt AIOps and lays the groundwork for AI.
AWS and Microsoft Azure already provide these tools, ready to use in a Rehost scenario:
- Azure Application Insights with Smart Detection, or Amazon DevOps Guru, can automatically detect abnormal workload patterns.
- Azure Monitor with dynamic thresholds, or Amazon CloudWatch anomaly detection, can learn from past metrics and use ML to identify and react to anomalies.
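To make this concrete, here is a minimal sketch of the AWS side, assuming a rehosted EC2 instance whose CPU metric we want to watch; the instance ID, alarm name, and region are placeholders, not recommendations.

```python
# Minimal sketch: attach CloudWatch anomaly detection to a rehosted EC2
# instance's CPU metric and alarm on the ML-learned band.
# Instance ID, alarm name and region are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

# Train an anomaly-detection model on the metric's history
cloudwatch.put_anomaly_detector(
    SingleMetricAnomalyDetector={
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        "Stat": "Average",
    }
)

# Alarm when CPU utilisation breaches the upper bound of the learned band
cloudwatch.put_metric_alarm(
    AlarmName="rehosted-app-cpu-anomaly",
    ComparisonOperator="GreaterThanUpperThreshold",
    EvaluationPeriods=3,
    ThresholdMetricId="band",
    Metrics=[
        {
            "Id": "cpu",
            "ReturnData": True,
            "MetricStat": {
                "Metric": {
                    "Namespace": "AWS/EC2",
                    "MetricName": "CPUUtilization",
                    "Dimensions": [
                        {"Name": "InstanceId", "Value": "i-0123456789abcdef0"}
                    ],
                },
                "Period": 300,
                "Stat": "Average",
            },
        },
        {"Id": "band", "Expression": "ANOMALY_DETECTION_BAND(cpu, 2)"},
    ],
)
```

The same idea applies on Azure with a metric alert that uses dynamic thresholds instead of a fixed value.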
Replatform
Workloads, storage, and databases are moved to native services, but without near-real-time integration or a data lake around them. Replatform provides limited capabilities for streaming, event integration, and data fabric, all of which GenAI solutions require.
To be closer to AI-Native, the Replatform should include:
- Streaming and event-based capabilities for AI ingestion (see the sketch after this list)
- Data fabric for analytics and ML operations
- Hooks into existing data stores, using solutions like Azure Synapse Link, to avoid affecting current business flows.
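As a minimal sketch of the streaming item, change events from a replatformed workload could be pushed into Azure Event Hubs for downstream AI ingestion. The connection string, hub name, and event payload below are assumptions for illustration only.

```python
# Hypothetical sketch: publish change events from a replatformed workload to
# Azure Event Hubs so AI pipelines can consume them in near real time.
# Connection string, hub name and payload are placeholders.
import json
import os

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str=os.environ["EVENTHUB_CONNECTION_STRING"],
    eventhub_name="order-changes",
)

change_event = {"order_id": 42, "status": "SHIPPED", "source": "orders-db"}

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(change_event)))
    producer.send_batch(batch)
```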
Refactor
Breaking the monolith is no longer enough for AI. A microservice architecture that is fast and scalable responds to your business needs, but not to AI's. The applications we build need to consume AI services, and expose data to them, natively.
The outcome of the refactoring should be a modular architecture, with APIs and data streams as first-class citizens. That enables AI components to plug into the existing system to observe, predict, and even optimise where possible.
Microsoft provides these capabilities by combining AKS, Azure Functions, and Event Hubs with the Azure OpenAI Service and Azure AI Foundry.
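As an illustration of "consume and expose natively", here is a minimal sketch of a refactored service that exposes a REST endpoint and calls an Azure OpenAI chat deployment behind it. The endpoint path, deployment name, and environment variables are assumptions, not a prescribed design.

```python
# Hypothetical sketch: a refactored microservice that exposes an API and
# consumes Azure OpenAI natively. Endpoint, deployment and variable names
# are placeholders.
import os

from fastapi import FastAPI
from openai import AzureOpenAI
from pydantic import BaseModel

app = FastAPI()
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)


class SummariseRequest(BaseModel):
    text: str


@app.post("/orders/summarise")
def summarise(req: SummariseRequest) -> dict:
    # Delegate the summary to a chat deployment; the rest of the service
    # keeps serving its normal business API.
    completion = client.chat.completions.create(
        model=os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o-mini"),
        messages=[
            {"role": "system", "content": "Summarise order histories for support agents."},
            {"role": "user", "content": req.text},
        ],
    )
    return {"summary": completion.choices[0].message.content}
```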
Repurchase
The system replacing the existing one should not only be SaaS, but also have AI-native capabilities. Depending on its maturity, those capabilities may be limited to hooks via APIs or data streaming. More mature solutions offer vector databases and native integration with AI platforms, enabling us to build flows between the two systems.
Salesforce Einstein GPT is a good example: it can be combined with Azure OpenAI, Azure AI Search, and AI agents to build an intelligent layer.
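A rough sketch of such an intelligent layer, assuming the SaaS data has already been exported into an Azure AI Search index (the index name, field names, and environment variables below are illustrative assumptions):

```python
# Hypothetical sketch: a thin intelligent layer over SaaS data exported into
# an Azure AI Search index. Index name, field names and environment variables
# are assumptions for illustration.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="crm-cases",
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)


def answer(question: str) -> str:
    # Retrieve the most relevant SaaS records, then ground the answer on them.
    hits = search.search(search_text=question, top=5)
    context = "\n".join(hit["description"] for hit in hits)
    completion = llm.chat.completions.create(
        model=os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o-mini"),
        messages=[
            {"role": "system", "content": "Answer using only the CRM context provided."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content
```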
Retain
Not all systems will reside in the cloud, and there are good reasons for this. That does not mean these systems need to be isolated from the rest of the ecosystem and from AI capabilities.
An AI-Native approach, empowered by the cloud, can bring these capabilities closer, using APIs and hybrid gateways that fetch information from on-premises systems and add intelligence on top.
AWS Outposts and Azure Stack HCI, combined with Bedrock Agents or the Azure AI Agent Service, are just two examples of how on-premises systems can be part of your AI strategy.
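As a rough sketch of the hybrid pattern, the snippet below pulls a record from an on-premises API exposed through a hybrid gateway and asks a Bedrock model to reason over it, using the simpler Converse API as a stand-in for a full Bedrock Agents setup. The gateway URL, model ID, and payload shape are assumptions.

```python
# Hypothetical sketch of the hybrid pattern: fetch data from an on-premises
# system through a gateway, then add intelligence with a cloud model.
# The gateway URL, model ID and payload shape are placeholders.
import json

import boto3
import requests

# 1. Fetch operational data from the on-prem system via a hybrid gateway
record = requests.get(
    "https://hybrid-gateway.example.internal/api/inventory/42", timeout=10
).json()

# 2. Ask a Bedrock model to reason over it (Converse API as a simple stand-in
#    for a full Bedrock Agents integration)
bedrock = boto3.client("bedrock-runtime", region_name="eu-west-1")
response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "text": "Flag stock risks in this on-prem inventory record:\n"
                    + json.dumps(record)
                }
            ],
        }
    ],
)
print(response["output"]["message"]["content"][0]["text"])
```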
Retire
There is not much you can do when you retire a system, except one thing: reinvest the savings in AI capabilities and fund AI adoption programs within your organisation. This can trigger a domino effect and generate additional savings or business in the end.
The 6 Rs remain essential and relevant in the AI-Native context; what matters is knowing how to address them from an AI perspective. The shift from managing infrastructure to managing intelligence forces us to rethink areas like resource optimisation, autonomous decision-making, and autonomous reactions to triggers. In the end, we are on a journey to reimagine how the platform behaves and operates.
