Posts

Showing posts from October, 2025

Why Storage Modernization Matters for AI

AI requires data that is accessible, organized, governed, and secure. When storage is disorganized or exposed to the internet, AI becomes slow, expensive, and risky. When storage is private, structured, and well described with metadata, AI runs faster, costs less, and stays compliant. The focus is not just on sophisticated models; it centers on how we store and move data. The foundation shapes the outcomes. AI changes the risk profile because it consumes data rapidly and broadly. Treat storage as private by default, regularly discover sensitive data, and feed those findings into your indexing rules. Private Endpoints, Defender for Storage for malware scanning, and immutability policies provide the baseline security features for Azure Storage. It's important to implement them from day zero and not to delay them. It is more cost-effective to implement them fr...
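The idea of feeding discovery findings into indexing rules can be sketched in a few lines. This is a hypothetical illustration, not an Azure API: the `Blob` record, the `SENSITIVE` label set, and the `indexable` rule are all invented names standing in for whatever classification your discovery scan produces.

```python
from dataclasses import dataclass

@dataclass
class Blob:
    path: str
    labels: set[str]      # classification labels, e.g. from a sensitivity scan
    has_metadata: bool    # whether the blob is described well enough to index

# Hypothetical label set; a real scan would define its own taxonomy.
SENSITIVE = {"pii", "secret", "financial"}

def indexable(blob: Blob) -> bool:
    """Only well-described, non-sensitive blobs enter the AI index."""
    return blob.has_metadata and not (blob.labels & SENSITIVE)

blobs = [
    Blob("reports/q3.pdf", {"public"}, True),
    Blob("hr/salaries.csv", {"pii", "financial"}, True),
    Blob("raw/dump.bin", set(), False),
]
index = [b.path for b in blobs if indexable(b)]
```

Here only `reports/q3.pdf` qualifies: the salaries file carries sensitive labels, and the raw dump has no metadata at all, so neither should reach the AI layer.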

Copilot & multi-LLM strategy

A few days ago, I listened to an episode of The Cloud Pod podcast, where they mentioned Microsoft's approach to Copilot and how Microsoft does not rely on a single LLM of its own. I started digging deeper into this topic, as it has high potential: train the team on one 'interface' while, behind the scenes, switching between different LLMs. Many think that 'Microsoft Copilot = one big model from OpenAI'. This was true initially, but today Copilot is more like an air‑traffic controller for AI. It can work with several large language models (LLMs), route your prompt to the right place, and bring back an answer grounded in your work data from Microsoft 365. The important part is not which model, but how the Copilot system chooses actions and protects your data.

What multi‑LLM Copilot means

Inside Microsoft 365, Copilot sits on top of an orchestration layer. This orchestrator is the interface between foundation LLMs and the skills and actio...
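The air‑traffic‑controller idea can be made concrete with a toy router. To be clear, this is a sketch of the general pattern, not Microsoft's implementation: the model names, routing rules, and the `orchestrate` function are all invented for illustration, and the grounding step is a stand-in for real retrieval from work data.

```python
def choose_model(prompt: str, needs_reasoning: bool) -> str:
    """Pick a model by prompt traits. Names and thresholds are illustrative."""
    if needs_reasoning:
        return "reasoning-model"
    if len(prompt) > 2000:
        return "long-context-model"
    return "fast-general-model"

def orchestrate(prompt: str, work_data: dict, needs_reasoning: bool = False) -> str:
    """Route the prompt, then ground the answer in (mock) work data."""
    model = choose_model(prompt, needs_reasoning)
    context = work_data.get("relevant_doc", "")  # retrieval stand-in
    return f"[{model}] answer grounded in: {context or 'no context'}"

print(orchestrate("Summarize our Q3 plan", {"relevant_doc": "Q3 plan.docx"}))
```

The point of the pattern is that callers only see one interface (`orchestrate`); which model handles the prompt is a routing decision they never have to make.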

SELECT FROM communities WHERE capabilities JOIN practices

I like to think about this trio (Practice, Capabilities, Communities) like a query: precise inputs, clear joins, and an output that matters to the people and the business.

SELECT

Practice is the formal expression of what we sell. It lives at the intersection of commercial need, company offers (services), and the capabilities we've matured because the market or our strategy demands them. When a client asks for Cloud Migration, AI Enablement, or FinOps at scale, that's the Practice answering. Capability is how the work gets done. It's the skill, tooling, and method layer: architecture patterns, IaC, MLOps, data engineering, and SRE working methods. Capabilities power one or many Practices. Community is the human engine, the real force behind the numbers. It forms around shared interests, people who want to learn, try, and trade notes. Communities are where ideas incubate and capabilities grow long before they're ready to be packaged into a Practice.

FROM

Practices draw fro...
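The metaphor can even be run literally. The sketch below models the trio as three tables in an in-memory SQLite database and joins them the way the post describes: communities grow capabilities, capabilities power practices. Table names and rows are illustrative, not a real org model.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE communities  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE capabilities (id INTEGER PRIMARY KEY, name TEXT, community_id INTEGER);
CREATE TABLE practices    (id INTEGER PRIMARY KEY, name TEXT, capability_id INTEGER);
INSERT INTO communities  VALUES (1, 'Cloud Guild');
INSERT INTO capabilities VALUES (1, 'IaC', 1), (2, 'FinOps tooling', 1);
INSERT INTO practices    VALUES (1, 'Cloud Migration', 1), (2, 'FinOps at scale', 2);
""")

# Each practice traced back through its capability to the community it grew from.
rows = con.execute("""
SELECT p.name, cap.name, com.name
FROM practices p
JOIN capabilities cap ON p.capability_id = cap.id
JOIN communities com ON cap.community_id = com.id
""").fetchall()
```

Running the query yields each practice alongside the capability that powers it and the community that incubated that capability, which is exactly the chain the prose argues for.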