
How to secure access to the Azure Management Portal

Migrating an existing system to Microsoft Azure can be a challenge when strong security requirements need to be in place. There are multiple tools and services that help us provide a secure environment, even when it is not in our own datacenter.
Often we focus on securing the applications and systems running inside Azure and forget about the Azure Portal itself. We can have the most secure application inside Azure, but it does not matter if our subscription is compromised and someone can access it as an admin.
In this post, we will look at some basic mechanisms and policies that enable us to secure access to the Azure Portal and subscription.

Channels
There are two ways to access an Azure subscription and make any kind of change:
1. Azure Portal: The main dashboard used to administer our Azure subscription
2. SMAPI (Service Management API): The REST API exposed by Microsoft. It can be used successfully from command-line tools like PowerShell or batch scripts, or even from custom applications
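As an illustration of the second channel, a Service Management API call authenticates with a management certificate instead of a username and password. A minimal Python sketch, assuming the subscription ID and certificate file paths are placeholders you would replace with your own:

```python
import http.client
import ssl

def build_management_request(subscription_id, resource="hostedservices"):
    """Build host, path, and headers for a classic Service Management API call."""
    host = "management.core.windows.net"
    path = "/{0}/services/{1}".format(subscription_id, resource)
    # SMAPI requires the x-ms-version header on every request.
    headers = {"x-ms-version": "2014-06-01"}
    return host, path, headers

def list_cloud_services(subscription_id, cert_file, key_file):
    """Call SMAPI over TLS, authenticating with a management certificate."""
    context = ssl.create_default_context()
    # The client certificate is the credential; no password is involved.
    context.load_cert_chain(certfile=cert_file, keyfile=key_file)
    host, path, headers = build_management_request(subscription_id)
    conn = http.client.HTTPSConnection(host, context=context)
    conn.request("GET", path, headers=headers)
    return conn.getresponse()
```

The same request shape is what PowerShell and other tooling produce under the hood when they use a management certificate.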

Multi-factor authentication
One of the first things that can be done for the accounts that access an Azure subscription is to activate multi-factor authentication. In addition to the password, the user will be required to complete a second authentication step via SMS, a phone call, and so on.

Smart Cards
Accounts can be secured using a smart card or similar devices based on TPM (Trusted Platform Module) chips. These devices use an asymmetric key pair or a certificate, where the secret (the private key) is stored on the device itself.

Azure Management Certificates
These certificates are generated by Azure (for example, via the publish settings file) and can be used by different systems to access and manage an Azure subscription. They enable us to manage the subscription without having to enter a username and password.
They are very powerful, especially in combination with a Remote Desktop Gateway hosted inside Azure and a VPN.
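A management certificate can also be created locally and then uploaded to the subscription. A sketch using OpenSSL, where the file names and subject are just examples:

```shell
# Create a self-signed certificate and private key, valid for one year
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout azure-mgmt.key -out azure-mgmt.pem \
  -subj "/CN=azure-management"

# Convert to the DER-encoded .cer format expected when uploading to Azure
openssl x509 -in azure-mgmt.pem -outform DER -out azure-mgmt.cer
```

The private key (`azure-mgmt.key`) stays on the machine that manages the subscription; only the public `.cer` file is uploaded.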

Connection Authorization Policies
These policies can be used when a Remote Desktop Gateway is in place. They can ensure that a machine connecting to it is part of a specific domain, and can even check and validate the machine name.

Point-to-Site VPN
To ensure a more secure connection between client machines and Azure, a P2S VPN connection can be established to a machine hosted inside Azure. This gives us a more secure communication channel between them, similar to having a jump-box machine inside Azure.
Don’t forget that all communication with the Azure management endpoints happens over TLS.

Site-to-Site VPN
By using an S2S VPN, your on-premises system is connected to your VNET in Azure. Like a P2S VPN, this offers us a secure tunnel that can be used to communicate with Azure. A jump-box is no longer needed; it is enough to configure traffic to Azure to go over the S2S VPN directly.

NSG (Network Security Groups) and IP Restrictions
When a jump box is used, with or without a VPN connection, NSGs and IP restrictions can be used to limit access to the jump box.
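Conceptually, an NSG rule that restricts access by source IP behaves like the following check. A small sketch in Python, where the office prefix below is purely illustrative:

```python
import ipaddress

def is_source_allowed(source_ip, allowed_prefixes):
    """Mimic an NSG inbound rule: allow only traffic from the given CIDR prefixes."""
    ip = ipaddress.ip_address(source_ip)
    return any(ip in ipaddress.ip_network(prefix) for prefix in allowed_prefixes)

# Example: only the office IP range may reach the jump box (e.g., over RDP)
office_prefixes = ["203.0.113.0/24"]
```

In Azure, the same restriction would be expressed as an inbound NSG rule with a source address prefix and a destination port (3389 for RDP).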

Express Route
Express Route offers a dedicated and secure communication channel between your on-premises system and Azure. The private connection with Azure ensures that there are no man-in-the-middle attacks.

Restrict number of users
Try to keep the number of users that have access to the subscription as low as possible. In many cases, 2-3 people with admin/co-admin rights are more than enough.

Jump-box with Azure Management Certificates
This enables us to provide limited access to the Azure Portal through a Remote Desktop Gateway without sharing the credentials with the user. The certificate installed on the jump-box provides access to the Azure Management API.
Of course, this requires extra security measures at the jump-box level.
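Azure identifies a management certificate by the SHA-1 thumbprint of its DER-encoded bytes, which is also how you can verify which certificate is installed on the jump-box against the list shown in the portal. A small Python sketch (the `.cer` file name in the comment is hypothetical):

```python
import hashlib

def cert_thumbprint(der_bytes):
    """Compute the SHA-1 thumbprint Azure uses to identify a management certificate."""
    return hashlib.sha1(der_bytes).hexdigest().upper()

# der_bytes would come from reading the DER-encoded certificate file, e.g.:
# with open("azure-mgmt.cer", "rb") as f:
#     print(cert_thumbprint(f.read()))
```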

Conclusion
Many features and capabilities enable us to secure access to the Azure Portal and the Management API. Besides these, don’t forget that in most cases people are the breach, not the system itself.
