Posts

Showing posts from February, 2025

Cloud Modernization for AI: Serverless and Containerization (Pill 3 of 5 / Cloud Pills)

AI is reshaping the way we build and run businesses across all industries. One challenge with AI models is scaling: traditional infrastructure is not built for the dynamic scaling required by AI models during training or operation. To optimise cost and reduce operational overhead, a modern approach combining serverless and microservices is required, providing a flexible, scalable, and efficient workload layer. Microsoft Azure enables these two mechanisms through Azure Functions and Azure Kubernetes Service. Serverless is required for AI deployments, especially because of unpredictable demand. The capability of running a function triggered by an AI agent in response to an event, without the overhead of deployments, is crucial for multi-agent AI solutions. Serverless is needed for real-time image recognition, language translation and dynamic execution of payloads triggered by APIs, data streams and IoT devices. Another advantage of a serverless approach is agility and the abilit...
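
As a minimal sketch of this event-driven pattern, the snippet below shows an Azure Functions (Node.js v4 programming model) function triggered by a storage queue message; the queue name and the inference step are hypothetical placeholders, not part of the original post.

```typescript
import { app, InvocationContext } from "@azure/functions";

// Hypothetical queue-triggered function: runs whenever an AI agent drops a
// task message onto a storage queue, with no infrastructure to pre-provision.
app.storageQueue("handleAgentTask", {
    queueName: "ai-agent-tasks",        // assumed queue name
    connection: "AzureWebJobsStorage",  // standard storage connection setting
    handler: async (message: unknown, context: InvocationContext): Promise<void> => {
        context.log(`Received task: ${JSON.stringify(message)}`);
        // Call the model endpoint here; the function scales to zero when idle
        // and fans out automatically when many events arrive at once.
    },
});
```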

GitHub Copilot Saving in IT Projects

Like other AI-assisted tools for developing applications, GitHub Copilot has become part of our IT solution development lifecycle. It is no longer a question of whether GitHub Copilot brings value to a developer or DevOps engineer; the question is what its impact is on quality attributes, effort and budget. This article assesses the impact of GitHub Copilot from a budget perspective. The intention is to identify how much the cost of building and running a cloud solution is reduced when the technical team has access to GitHub Copilot. Many studies available online have focused on developer experience, the quality of the outcome, and how much of the code generated by GitHub Copilot ends up in production. To estimate the impact on the budget required to develop a greenfield cloud application running on Microsoft Azure, we used these studies as a reference, calculating the impact of an AI-assisted tool in the day-to-day work of a technical perso...
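
As a back-of-the-envelope illustration only, the sketch below applies an assumed productivity gain to the coding share of a project budget; all figures are hypothetical and are not the study results referenced in the post.

```typescript
// Hypothetical model: apply an assumed productivity uplift only to the share
// of the project budget that is hands-on coding work.
function estimateCopilotSaving(
    totalBudget: number,      // total build budget, e.g. in EUR
    codingShare: number,      // fraction of budget spent on coding (assumed)
    productivityGain: number, // assumed effort reduction on coding tasks
): number {
    return totalBudget * codingShare * productivityGain;
}

// Example with made-up numbers: 500k budget, 40% coding effort, 20% gain -> 40k.
console.log(estimateCopilotSaving(500_000, 0.4, 0.2));
```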

Cloud Modernization for AI: Data and Workflows (Pill 2 of 5 / Cloud Pills)

AI is everywhere! To make an AI system work, run and assist us, we need to provide it with fuel. To ensure that you have a working AI system, you must first ensure that your data workflows run smoothly and can scale to the high demand of the AI components. The challenge is not just to provide access to all your data (in a cheap data store). The real challenge is to modernise and optimise the entire lifecycle of your data, from ingestion to storage and processing. Ultimately, you need a system with good performance and controlled (or fewer) bottlenecks. Vendors like Microsoft Azure provide a set of tools, cloud services, and data infrastructure designed to complement AI solutions. One of the first challenges that needs to be addressed is data storage. As we know, AI systems require access to large amounts of data in different formats (structured and unstructured). A storage strategy that can manage the payload generated by AI applicatio...
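
As a minimal ingestion sketch, assuming Azure Blob Storage is used as the landing zone for unstructured data feeding the AI pipeline, the snippet below uploads a raw payload with the @azure/storage-blob SDK; the container and blob names are hypothetical.

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

// Store a raw document in a blob container so downstream processing jobs
// can pick it up later in the data lifecycle.
async function ingestDocument(connectionString: string, content: Buffer): Promise<void> {
    const service = BlobServiceClient.fromConnectionString(connectionString);
    const container = service.getContainerClient("raw-ingest"); // assumed container name
    await container.createIfNotExists();
    const blob = container.getBlockBlobClient(`doc-${Date.now()}.json`); // hypothetical naming
    await blob.uploadData(content);
}
```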

Cloud Modernization for AI: Scalability and Performance Jams (Pill 1 of 5 / Cloud Pills)

Cloud! When we think of the cloud, our mind goes to unlimited scalability and power, an efficient system that can automatically respond to demand. The reality is not like this, mainly because most systems were migrated to the cloud as-is or built by teams with limited cloud skills. Cloud modernization is not part of the lifecycle of systems that run in the cloud, meaning that a system still uses the same cloud services after 5-10 years. Performance bottlenecks emerge when AI is added to these systems because of inefficient resource allocation, architectural decisions, and a lack of understanding of the AI model's demands. The truth is that most of the systems running in the cloud today are not ready for the demands of an AI service. Microsoft, like other vendors, provides paths to navigate the AI journey that require customers to adopt new cloud technologies or change the way the system works. One of the mistakes made by organizations is...

Azure Front Door supports WebSockets

In fast-changing times like today, real-time communication ensures uninterrupted user experiences and keeps business operations running. Low-latency, bi-directional communication keeps users engaged in scenarios such as financial trading, live chat, co-working systems, or online gaming. With Azure Front Door Standard and WebSockets, businesses can scale their real-time applications efficiently around the globe. Azure Front Door acts as the CDN in the cloud, improving performance, security, and reliability. Now, with WebSocket support, it brings new possibilities for enterprises by offering global scalability and high availability, enhanced security, optimized performance and seamless Azure integration. WebSockets operate differently from regular HTTP, enabling communication between clients and servers over a persistent, full-duplex connection on top of TCP, so a new HTTP connection does not have to be established for every exchange. This efficient operation of WebSockets significantly im...
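
As a minimal client-side sketch of that persistent, full-duplex connection, assuming a hypothetical endpoint published behind Azure Front Door and a runtime with a global WebSocket (browser or recent Node.js):

```typescript
// Hypothetical Front Door hostname and path; one handshake, then the same
// connection carries traffic in both directions.
const socket = new WebSocket("wss://contoso-app.azurefd.net/live-quotes");

socket.onopen = () => {
    // Client can push messages at any time over the open connection.
    socket.send(JSON.stringify({ subscribe: "EURUSD" }));
};

socket.onmessage = (event: MessageEvent) => {
    // Server pushes updates without polling or a new HTTP request per message.
    console.log("Server pushed:", event.data);
};
```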