Why Storage Modernization Matters for AI

AI requires data that is accessible, organized, governed, and secure. When storage is disorganized or exposed to the internet, AI becomes slow, expensive, and risky. When storage is private, structured, and well described with metadata, AI operates faster, costs less, and stays compliant. The hard part is not the sophisticated models; it is how we store and move data. The foundation shapes the outcomes.

AI alters the risk profile by consuming data rapidly and broadly. Treat storage as a private system by default, discover sensitive data regularly, and feed those findings into your indexing rules. Private Endpoints, Defender for Storage for malware scanning, and immutability policies provide the baseline security features for Azure Storage. Implement them from day zero; it is more cost-effective to build them in from the beginning than to retrofit them after the initial pilot.


Key Principles for AI's Data Ingestion

1. Use open, analytics-friendly formats (Parquet/JSON).

2. Maintain good metadata (owner, sensitivity, lifecycle); a tagging sketch follows this list.

3. Enforce private network paths and encryption.

4. Plan for tiering and retention to keep costs under control.

5. Incorporate indexing/vector search for retrieval-augmented generation (RAG) and search use cases.
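
To make principle 2 concrete, here is a minimal sketch using the azure-storage-blob Python SDK to attach governance attributes to a blob. The account, container, blob, and tag values are placeholders, not prescriptions.

```python
# A hedged sketch: tagging a blob with owner/sensitivity/lifecycle attributes
# using the azure-storage-blob SDK. All names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

blob = BlobClient(
    account_url="https://<account>.blob.core.windows.net",
    container_name="docs",
    blob_name="contracts/2024/contract-001.pdf",
    credential=DefaultAzureCredential(),  # AAD / managed identity, no account keys
)

# Blob index tags are queryable across the account, which makes them a good
# home for governance attributes that indexing rules can filter on.
blob.set_blob_tags({
    "owner": "finance-team",
    "sensitivity": "confidential",
    "lifecycle": "retain-7y",
})

# Plain metadata travels with the blob and is returned on read.
blob.set_blob_metadata({"source_system": "contract-portal"})
```

Because index tags are queryable, downstream indexing rules can include or exclude paths based on them rather than on folder names alone.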


Why, How, and What

On Azure, Blob Storage/ADLS Gen2 works effectively with Azure AI Search and Azure OpenAI. The native blob indexer in AI Search reads documents, extracts text and metadata, and constructs a search index that your AI copilot can query rapidly. It keeps the architecture efficient and minimizes the need for additional code. Connecting that index to Azure OpenAI provides a straightforward pathway to RAG and enterprise Q&A systems.
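
As a rough sketch of that wiring, assuming an index named docs-index already exists and using placeholder endpoints and keys, the azure-search-documents Python SDK can create the data source and indexer:

```python
# A minimal sketch: pointing the native blob indexer at a container so
# Azure AI Search crawls it into a pre-created index ("docs-index").
# Service names, keys, and the connection string are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    SearchIndexer,
    SearchIndexerDataContainer,
    SearchIndexerDataSourceConnection,
)

client = SearchIndexerClient(
    endpoint="https://<search-service>.search.windows.net",
    credential=AzureKeyCredential("<admin-key>"),
)

# Register the blob container as a data source for the indexer.
data_source = SearchIndexerDataSourceConnection(
    name="docs-datasource",
    type="azureblob",
    connection_string="<storage-connection-string>",
    container=SearchIndexerDataContainer(name="docs"),
)
client.create_or_update_data_source_connection(data_source)

# The indexer extracts text and metadata from each blob and fills the index.
indexer = SearchIndexer(
    name="docs-indexer",
    data_source_name="docs-datasource",
    target_index_name="docs-index",
)
client.create_or_update_indexer(indexer)
client.run_indexer("docs-indexer")
```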

Network isolation is a significant advantage of Azure for sensitive applications. Private Endpoints keep storage access on the Azure backbone, eliminating exposure to the public internet, and because traffic never leaves Microsoft's network, latency tends to be low and predictable. Combined with strict firewall rules on the storage account, it is a straightforward design choice that reduces risk for any AI workload that uses blob data.
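
A minimal sketch of the hardening side, assuming the Private Endpoint itself is already provisioned. This uses azure-mgmt-storage with placeholder names to disable the public endpoint and set the firewall to deny by default:

```python
# A hedged sketch: locking a storage account down to private access only.
# Subscription, resource group, and account names are placeholders; the
# Private Endpoint and DNS wiring are provisioned separately.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import NetworkRuleSet, StorageAccountUpdateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.storage_accounts.update(
    "<resource-group>",
    "<account-name>",
    StorageAccountUpdateParameters(
        public_network_access="Disabled",  # no public endpoint at all
        network_rule_set=NetworkRuleSet(
            default_action="Deny",         # firewall denies by default
            bypass="AzureServices",        # allow trusted Azure services
        ),
    ),
)
```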

The security features surrounding the data need to be robust. Microsoft Defender for Storage provides malware scanning, including scanning on upload, which prevents infected files from entering your AI ecosystem. Azure Blob Storage supports WORM (Write Once, Read Many) immutability policies at the version or container level, which is crucial for regulated environments involving AI.
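
Here is a hedged sketch of the immutability piece, using azure-mgmt-storage with a placeholder container name and retention period:

```python
# A hedged sketch: applying a container-level immutability (WORM) policy.
# Names and the 365-day retention period are placeholders to adapt.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import ImmutabilityPolicy

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Blobs in this container become non-modifiable and non-erasable for the
# retention period once the policy is in effect (permanently once locked).
client.blob_containers.create_or_update_immutability_policy(
    "<resource-group>",
    "<account-name>",
    "audit-records",
    parameters=ImmutabilityPolicy(
        immutability_period_since_creation_in_days=365,
        allow_protected_append_writes=True,  # append-only logs still work
    ),
)
```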

When model training or extensive experimentation is part of your business, Azure ML Datastores provide a clean connection between ML pipelines and storage, using managed identities and consistent paths. This reduces secret-related complexity and keeps data lineage traceable, making it easier to answer when auditors ask how training sets were created.
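
A minimal sketch, assuming the azure-ai-ml SDK and placeholder workspace and account names, of registering a datastore with identity-based access (no keys or SAS tokens):

```python
# A hedged sketch: registering a blob container as an Azure ML datastore
# with identity-based access. All names are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AzureBlobDatastore
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Omitting credentials means access falls back to identity-based auth
# (the workspace or caller identity), so no keys or SAS tokens are stored.
datastore = AzureBlobDatastore(
    name="training_data",
    account_name="<storage-account>",
    container_name="datasets",
)
ml_client.datastores.create_or_update(datastore)
```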

For the retrieval layer, many teams start with Azure AI Search for both text and vector queries, while others opt for Cosmos DB for MongoDB when they want vectors stored alongside operational data. Choose the option that minimizes data movement and operational burden for your team.
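
For the AI Search route, here is a hedged sketch of a hybrid (keyword plus vector) query. The index name, the content_vector field, and the embed helper are assumptions for illustration:

```python
# A hedged sketch: a hybrid query mixing keyword search with a vector query.
# "docs-index", "content_vector", and embed() are illustrative assumptions.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

search = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="docs-index",
    credential=AzureKeyCredential("<query-key>"),
)

query = "data retention policy"
embedding = embed(query)  # hypothetical helper returning the query's embedding

results = search.search(
    search_text=query,  # keyword half of the hybrid query
    vector_queries=[
        VectorizedQuery(
            vector=embedding,
            k_nearest_neighbors=5,
            fields="content_vector",
        )
    ],
    top=5,
)
for doc in results:
    print(doc["@search.score"])
```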


8 Steps for Azure Storage Modernization for AI

1. Select ADLS Gen2 and define a clear folder/prefix taxonomy (by domain/date/region).

2. Secure the network with Private Endpoints and a storage firewall; use RBAC and avoid broad SAS tokens.

3. Encrypt with your own keys where possible: enable customer-managed keys (CMK) stored in Key Vault and set up automatic key rotation.

4. Ensure protection upon upload by activating Defender for Storage for malware scanning and enforce basic content validation.

5. Classify and govern data by scanning with Purview, applying sensitivity labels, and excluding highly sensitive areas from AI indexing.

6. Make the data retrievable by indexing blobs with Azure AI Search (text and vector) and connecting to Azure OpenAI.

7. Control costs and compliance with lifecycle policies (Hot-Cool-Cold-Archive-Delete), enable versioning, and apply immutability (WORM) where needed; a lifecycle sketch follows this list.

8. Use Azure ML Datastores with managed identities instead of secrets (access keys), and send diagnostic logs to Log Analytics for auditing.
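
To make step 7 concrete, here is a hedged sketch of a lifecycle policy using azure-mgmt-storage model classes. The prefix and day thresholds are placeholders to adapt to your retention rules:

```python
# A hedged sketch of step 7: a lifecycle rule that tiers and expires blobs
# under the "raw/" prefix. All names and thresholds are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    DateAfterModification,
    ManagementPolicy,
    ManagementPolicyAction,
    ManagementPolicyBaseBlob,
    ManagementPolicyDefinition,
    ManagementPolicyFilter,
    ManagementPolicyRule,
    ManagementPolicySchema,
)

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

rule = ManagementPolicyRule(
    name="tier-and-expire-raw",
    type="Lifecycle",
    enabled=True,
    definition=ManagementPolicyDefinition(
        filters=ManagementPolicyFilter(
            blob_types=["blockBlob"], prefix_match=["raw/"]
        ),
        actions=ManagementPolicyAction(
            base_blob=ManagementPolicyBaseBlob(
                tier_to_cool=DateAfterModification(days_after_modification_greater_than=30),
                tier_to_archive=DateAfterModification(days_after_modification_greater_than=180),
                delete=DateAfterModification(days_after_modification_greater_than=365),
            )
        ),
    ),
)

# A storage account has a single management policy, always named "default".
client.management_policies.create_or_update(
    "<resource-group>", "<account-name>", "default",
    ManagementPolicy(policy=ManagementPolicySchema(rules=[rule])),
)
```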


My advice is simple: keep your storage clean and private, with clearly defined labels. Add an efficient search/retrieval layer to ensure data is easy to find, then connect your model. With these practices, AI will operate more smoothly and securely on Azure.
