Azure Services

How to check if a role permission is good or bad in Azure RBAC

Do you need to check whether a role permission is good or bad, or do you just want to know which actions a provider exposes in Azure RBAC?

Get-AzProviderOperation * is your friend and you can always export everything to csv:

Get-AzProviderOperation | select Operation, OperationName, ProviderNamespace, ResourceName, Description, IsDataAction | Export-Csv AzProviderOperation.csv

This command will give you a list of all the operations that you can perform on Azure resources, including the operation name, provider namespace, resource name, description, and whether it is a data action or not. You can use this information to check if a role permission is good or bad, or to find out what actions a provider has in Azure RBAC.
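
If you prefer the Azure CLI, you can query the same operation catalog there; this is a minimal sketch (the namespace and the JMESPath query are just examples) that lists the actions a provider exposes and whether each one is a data action:

az provider operation show --namespace Microsoft.Storage \
  --query "resourceTypes[].operations[].{action:name, isDataAction:isDataAction}" \
  --output table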

Script to check if a role permission is good or bad on tf files

You can use the following script to check if a role permission is good or bad on tf files:

<#
.SYNOPSIS
Script to check if a role permission is good or bad in Azure RBAC using Terraform files.

.DESCRIPTION
This script downloads Azure provider operations to a CSV file, reads the CSV file, extracts text from Terraform (.tf and .tfvars) files, and compares the extracted text with the CSV data to find mismatches.

.PARAMETER csvFilePath
The path to the CSV file where Azure provider operations will be downloaded.

.PARAMETER tfFolderPath
The path to the folder containing Terraform (.tf and .tfvars) files.

.PARAMETER DebugMode
Switch to enable debug mode for detailed output.

.EXAMPLE
.\Check-RBAC.ps1 -csvFilePath ".\petete.csv" -tfFolderPath ".\"

.EXAMPLE
.\Check-RBAC.ps1 -csvFilePath ".\petete.csv" -tfFolderPath ".\" -DebugMode

.NOTES
For more information, refer to the following resources:
- Azure RBAC Documentation: https://docs.microsoft.com/en-us/azure/role-based-access-control/
- Get-AzProviderOperation Cmdlet: https://docs.microsoft.com/en-us/powershell/module/az.resources/get-azprovideroperation
- Export-Csv Cmdlet: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/export-csv
#>

param(
    [string]$csvFilePath = ".\petete.csv",
    [string]$tfFolderPath = ".\",
    [switch]$DebugMode
)

# Download petete.csv using Get-AzProviderOperation
function Download-CSV {
    param(
        [string]$filename,
        [switch]$DebugMode
    )
    if ($DebugMode) { Write-Host "Downloading petete.csv using Get-AzProviderOperation" }
    Get-AzProviderOperation | select Operation, OperationName, ProviderNamespace, ResourceName, Description, IsDataAction | Export-Csv -Path $filename -NoTypeInformation
    if ($DebugMode) { Write-Host "CSV file downloaded: $filename" }
}

# Function to read the CSV file
function Read-CSV {
    param(
        [string]$filename,
        [switch]$DebugMode
    )
    if ($DebugMode) { Write-Host "Reading CSV file: $filename" }
    $csv = Import-Csv -Path $filename
    $csvData = $csv | ForEach-Object {
        [PSCustomObject]@{
            Provider = $_.Operation.Split('/')[0].Trim()
            Operation = $_.Operation
            OperationName = $_.OperationName
            ProviderNamespace = $_.ProviderNamespace
            ResourceName = $_.ResourceName
            Description = $_.Description
            IsDataAction = $_.IsDataAction
        }
    }
    if ($DebugMode) { Write-Host "Data read from CSV:"; $csvData | Format-Table -AutoSize }
    return $csvData
}

# Function to extract text from the Terraform files
function Extract-Text-From-TF {
    param(
        [string]$folderPath,
        [switch]$DebugMode
    )
    if ($DebugMode) { Write-Host "Reading TF and TFVARS files in folder: $folderPath" }
    $tfTexts = @()
    $files = Get-ChildItem -Path (Join-Path $folderPath '*') -Include *.tf, *.tfvars
    foreach ($file in $files) {
        $content = Get-Content -Path $file.FullName
        $tfTexts += $content | Select-String -Pattern '"Microsoft\.[^"]*"' -AllMatches | ForEach-Object { $_.Matches.Value.Trim('"').Trim() }
        $tfTexts += $content | Select-String -Pattern '"\*"' -AllMatches | ForEach-Object { $_.Matches.Value.Trim('"').Trim() }
        $tfTexts += $content | Select-String -Pattern '^\s*\*/' -AllMatches | ForEach-Object { $_.Matches.Value.Trim() }
    }
    if ($DebugMode) { Write-Host "Texts extracted from TF and TFVARS files:"; $tfTexts | Format-Table -AutoSize }
    return $tfTexts
}

# Function to compare extracted text with CSV data
function Compare-Text-With-CSV {
    param(
        [array]$csvData,
        [array]$tfTexts,
        [switch]$DebugMode
    )
    $mismatches = @()
    foreach ($tfText in $tfTexts) {
        if ($tfText -eq "*" -or $tfText -match '^\*/') {
            continue
        }
        $tfTextPattern = $tfText -replace '\*', '.*'
        if (-not ($csvData | Where-Object { $_.Operation -match "^$tfTextPattern$" })) {
            $mismatches += $tfText
        }
    }
    if ($DebugMode) { Write-Host "Mismatches found:"; $mismatches | Format-Table -AutoSize }
    return $mismatches
}

# Main script execution
Download-CSV -filename $csvFilePath -DebugMode:$DebugMode
$csvData = Read-CSV -filename $csvFilePath -DebugMode:$DebugMode
$tfTexts = Extract-Text-From-TF -folderPath $tfFolderPath -DebugMode:$DebugMode
$mismatches = Compare-Text-With-CSV -csvData $csvData -tfTexts $tfTexts -DebugMode:$DebugMode

if ($mismatches.Count -eq 0) {
    Write-Host "All extracted texts match the CSV data."
} else {
    Write-Host "Mismatches found:"
    $mismatches | Format-Table -AutoSize
}

This script downloads Azure provider operations to a CSV file, reads the CSV file, extracts text from Terraform files, and compares the extracted text with the CSV data to find mismatches. You can use this script to check if a role permission is good or bad in Azure RBAC using Terraform files.

I hope this post has given you a good introduction to how you can check if a role permission is good or bad in Azure RBAC and how you can use Terraform files to automate this process.

Happy coding! 🚀

Data threat modeling in Azure storage accounts

Info

I apologize in advance if this is a crazy idea or if there are mistakes! I am just trying to learn and share knowledge.

Azure Storage Account is a service that provides scalable, secure, and reliable storage for data. It is used to store data such as blobs, files, tables, and queues. However, it is important to ensure that the data stored in Azure Storage Account is secure and protected from security threats. In this article, we will discuss how to perform data threat modeling in Azure storage accounts.

What is data threat modeling?

Data threat modeling is a process of identifying and analyzing potential threats to data security. It helps organizations understand the risks to their data and develop strategies to mitigate those risks. Data threat modeling involves the following steps:

  1. Identify assets: Identify the data assets stored in Azure storage accounts, such as blobs, files, tables, and queues.
  2. Identify threats: Identify potential threats to the data assets, such as unauthorized access, data breaches, data leakage, malware, phishing attacks, insider threats, and data loss.
  3. Assess risks: Assess the risks associated with each threat, such as the likelihood of the threat occurring and the impact of the threat on the data assets.
  4. Develop mitigation strategies: Develop strategies to mitigate the risks, such as implementing security controls, access controls, encryption, monitoring, and auditing.

By performing data threat modeling, organizations can identify and address security vulnerabilities in Azure storage accounts and protect their data from security threats.

Identify assets in Azure storage accounts

Azure storage accounts can store various types of data assets, including:

  • Blobs: Binary large objects (blobs) are used to store unstructured data, such as images, videos, and documents.
  • Files: Files are used to store structured data, such as text files, configuration files, and log files.
  • Tables: Tables are used to store structured data in a tabular format, such as customer information, product information, and transaction data.
  • Queues: Queues are used to store messages for communication between applications, such as task messages, notification messages, and status messages.
  • Disks: Disks are used to store virtual machine disks, such as operating system disks and data disks.

Identifying the data assets stored in Azure storage accounts is the first step in data threat modeling. It helps organizations understand the types of data stored in Azure storage accounts and the potential risks to those data assets.

Identify threats to data in Azure storage accounts

There are several threats to data stored in Azure storage accounts, including:

  • Unauthorized access: Unauthorized users gaining access to Azure storage accounts and stealing data.
  • Data breaches: Data breaches can expose sensitive data stored in Azure storage accounts.
  • Data leakage: Data leakage can occur due to misconfigured access controls or insecure data transfer protocols.
  • Data loss: Data loss can occur due to accidental deletion, corruption, or hardware failure.
  • Ransomware: Ransomware can encrypt data stored in Azure storage accounts and demand a ransom for decryption.
  • DDoS attacks: DDoS attacks can disrupt access to data stored in Azure storage accounts.
  • Phishing attacks: Phishing attacks can trick users into providing their login credentials, which can be used to access and steal data.
  • Malware: Malware can be used to steal data from Azure storage accounts and transfer it to external servers.
  • Insider threats: Employees or contractors with access to sensitive data may intentionally or unintentionally exfiltrate data.
  • Data exfiltration: Unauthorized transfer of data from Azure storage accounts to external servers.

For example, the flow of data exfiltration in Azure storage accounts can be summarized as follows:

sequenceDiagram
    participant User
    participant Azure Storage Account
    participant External Server

    User->>Azure Storage Account: Upload data
    Azure Storage Account->>External Server: Unauthorized transfer of data

In this flow, the user uploads data to the Azure Storage Account, and the data is then transferred to an external server without authorization. This unauthorized transfer of data is known as data exfiltration.

Assess risks to data in Azure storage accounts

Assessing the risks associated with threats to data in Azure storage accounts is an important step in data threat modeling. Risks can be assessed based on the likelihood of the threat occurring and the impact of the threat on the data assets. Risks can be categorized as low, medium, or high based on the likelihood and impact of the threat.

For example, the risk of unauthorized access to Azure storage accounts may be categorized as high if the likelihood of unauthorized access is high and the impact of unauthorized access on the data assets is high. Similarly, the risk of data leakage may be categorized as medium if the likelihood of data leakage is medium and the impact of data leakage on the data assets is medium.

By assessing risks to data in Azure storage accounts, organizations can prioritize security measures and develop strategies to mitigate the risks.

For example, the risk of data exfiltration in Azure storage accounts can be assessed as follows:

pie
    title Data Exfiltration Risk Assessment
    "Unauthorized Access" : 30
    "Data Breaches" : 20
    "Data Leakage" : 15
    "Malware" : 10
    "Phishing Attacks" : 10
    "Insider Threats" : 15

Develop mitigation strategies for data in Azure storage accounts

Developing mitigation strategies is an essential step in data threat modeling. Mitigation strategies help organizations protect their data assets from security threats and reduce the risks associated with those threats. Mitigation strategies could include the following:

  1. Implement access controls: Implement access controls to restrict access to Azure storage accounts based on user roles and permissions.
  2. Encrypt data: Encrypt data stored in Azure storage accounts to protect it from unauthorized access.
  3. Monitor and audit access: Monitor and audit access to Azure storage accounts to detect unauthorized access and data exfiltration.
  4. Implement security controls: Implement security controls, such as firewalls, network security groups, and intrusion detection systems, to protect data in Azure storage accounts.
  5. Use secure transfer protocols: Use secure transfer protocols, such as HTTPS, to transfer data to and from Azure storage accounts.
  6. Implement multi-factor authentication: Implement multi-factor authentication to protect user accounts from unauthorized access.
  7. Train employees: Train employees on data security best practices to prevent data exfiltration and other security threats.
  8. Backup data: Backup data stored in Azure storage accounts to prevent data loss due to accidental deletion or corruption.
  9. Update software: Keep software and applications up to date to protect data stored in Azure storage accounts from security vulnerabilities.
  10. Implement data loss prevention (DLP) policies: Implement DLP policies to prevent data leakage and unauthorized transfer of data from Azure storage accounts.

As this is not an easy task, Microsoft provides tools to help us. If we want to follow a security framework, we can use the MCSB (Microsoft cloud security benchmark), a set of guidelines and best practices for securing Azure services, including Azure storage accounts. The MCSB provides recommendations for securing Azure storage accounts, such as enabling encryption, implementing access controls, monitoring access, and auditing activities:

| Control Domain | ASB Control ID | ASB Control Title | Responsibility | Feature Name |
|---|---|---|---|---|
| Asset Management | AM-2 | Use only approved services | Customer | Azure Policy Support |
| Backup and recovery | BR-1 | Ensure regular automated backups | Customer | Azure Backup |
| Backup and recovery | BR-1 | Ensure regular automated backups | Customer | Service Native Backup Capability |
| Data Protection | DP-1 | Discover, classify, and label sensitive data | Customer | Sensitive Data Discovery and Classification |
| Data Protection | DP-2 | Monitor anomalies and threats targeting sensitive data | Customer | Data Leakage/Loss Prevention |
| Data Protection | DP-3 | Encrypt sensitive data in transit | Microsoft | Data in Transit Encryption |
| Data Protection | DP-4 | Enable data at rest encryption by default | Microsoft | Data at Rest Encryption Using Platform Keys |
| Data Protection | DP-5 | Use customer-managed key option in data at rest encryption when required | Customer | Data at Rest Encryption Using CMK |
| Data Protection | DP-6 | Use a secure key management process | Customer | Key Management in Azure Key Vault |
| Identity Management | IM-1 | Use centralized identity and authentication system | Microsoft | Azure AD Authentication Required for Data Plane Access |
| Identity Management | IM-1 | Use centralized identity and authentication system | Customer | Local Authentication Methods for Data Plane Access |
| Identity Management | IM-3 | Manage application identities securely and automatically | Customer | Managed Identities |
| Identity Management | IM-3 | Manage application identities securely and automatically | Customer | Service Principals |
| Identity Management | IM-7 | Restrict resource access based on conditions | Customer | Conditional Access for Data Plane |
| Identity Management | IM-8 | Restrict the exposure of credential and secrets | Customer | Service Credential and Secrets Support Integration and Storage in Azure Key Vault |
| Logging and threat detection | LT-1 | Enable threat detection capabilities | Customer | Microsoft Defender for Service / Product Offering |
| Logging and threat detection | LT-4 | Enable network logging for security investigation | Customer | Azure Resource Logs |
| Network Security | NS-2 | Secure cloud services with network controls | Customer | Disable Public Network Access |
| Network Security | NS-2 | Secure cloud services with network controls | Customer | Azure Private Link |
| Privileged Access | PA-7 | Follow just enough administration (least privilege) principle | Customer | Azure RBAC for Data Plane |
| Privileged Access | PA-8 | Choose approval process for third-party support | Customer | Customer Lockbox |

And part of the MCSB can be complemented with the Azure Well-Architected Framework, which provides guidance on best practices for designing and implementing secure, scalable, and reliable cloud solutions. The Well-Architected Framework includes security best practices for Azure storage accounts, such as implementing security controls, access controls, encryption, monitoring, and auditing:

  1. Enable Microsoft Defender for all your storage accounts: Microsoft Defender for Storage provides advanced threat protection for Azure storage accounts. It helps detect and respond to security threats in real time.
  2. Turn on soft delete for blob data: Soft delete helps protect your blob data from accidental deletion. It allows you to recover deleted data within a specified retention period.
  3. Use Microsoft Entra ID to authorize access to blob data: Microsoft Entra ID, together with Azure RBAC, provides fine-grained access control for Azure storage accounts. It allows you to define and enforce access policies based on user roles and permissions.
  4. Consider the principle of least privilege: When assigning permissions to a Microsoft Entra security principal through Azure RBAC, follow the principle of least privilege. Only grant the minimum permissions required to perform the necessary tasks.
  5. Use managed identities to access blob and queue data: Managed identities provide a secure way to access Azure storage accounts without storing credentials in your code.
  6. Use blob versioning or immutable blobs: Blob versioning and immutable blobs help protect your business-critical data from accidental deletion or modification.
  7. Restrict default internet access for storage accounts: Limit default internet access to Azure storage accounts to prevent unauthorized access.
  8. Enable firewall rules: Use firewall rules to restrict network access to Azure storage accounts. Only allow trusted IP addresses to access the storage account.
  9. Limit network access to specific networks: Limit network access to specific networks or IP ranges to prevent unauthorized access.
  10. Allow trusted Microsoft services to access the storage account: Allow only trusted Microsoft services to access the storage account to prevent unauthorized access.
  11. Enable the Secure transfer required option: Enable the Secure transfer required option on all your storage accounts to enforce secure connections.
  12. Limit shared access signature (SAS) tokens to HTTPS connections only: Limit shared access signature (SAS) tokens to HTTPS connections only to prevent unauthorized access.
  13. Avoid using Shared Key authorization: Avoid using Shared Key authorization to access storage accounts. Use Azure AD or SAS tokens instead.
  14. Regenerate your account keys periodically: Regenerate your account keys periodically to prevent unauthorized access.
  15. Create a revocation plan for SAS tokens: Create a revocation plan and have it in place for any SAS tokens that you issue to clients. This will help you revoke access to the storage account if necessary.
  16. Use near-term expiration times on SAS tokens: Use near-term expiration times on impromptu SAS, service SAS, or account SAS to limit the exposure of the token.

Mixed strategies for data protection in Azure storage accounts

Diagram of the mixed strategies for data protection in Azure storage accounts:

graph LR
    A[Asset Management] -->B(AM-2)
    B --> C[Use only approved services]
    C --> D[Azure Policy] 
    E[Backup and recovery] -->F(BR-1)
    F --> G[Ensure regular automated backups]
    G --> H[Azure Backup]
    G --> I[Service Native Backup Capability]
    I --> I1["Azure Storage Account Configuration"]
    I1 --> I11["Turn on soft delete for blob data"]
    I1 --> I12["Use blob versioning or immutable blobs"]
graph LR
    J[Data Protection] -->K(DP-1)
    K --> L[Discover, classify, and label sensitive data]    
    L --> M[Sensitive Data Discovery and Classification]
    M --> M1["Microsoft Purview"]
    J --> N(DP-2)
    N --> O[Monitor anomalies and threats targeting sensitive data]
    O --> P[Data Leakage/Loss Prevention]
    P --> P1["Microsoft Defender for Storage"]
    J --> Q(DP-3)
    Q --> R[Encrypt sensitive data in transit]
    R --> S[Data in Transit Encryption]
    S --> S1["Azure Storage Account Configuration"]
    S1 --> S2["Enforce minimum TLS version"]
    J --> T(DP-4)
    T --> U[Enable data at rest encryption by default]
    U --> V[Data at Rest Encryption Using Platform Keys]
    V --> WW["Azure Storage Account Configuration"]    
    J --> W(DP-5)
    W --> X[Use customer-managed key option in data at rest encryption when required]
    X --> Y[Data at Rest Encryption Using CMK]
    Y --> WW["Azure Storage Account Configuration"]    
    J --> Z(DP-6)
    Z --> AA[Use a secure key management process]
    AA --> AB[Key Management in Azure Key Vault]
    AB --> AC["DEPRECATED"]
graph LR  
    AC[Identity Management] -->AD(IM-1)
    AD --> AE[Use centralized identity and authentication system]
    AE --> AE1["Microsoft Entra ID"]
    AE --> AF[Microsoft Entra ID Authentication Required for Data Plane Access]
    AF --> AF1["Azure Storage Account Configuration"]
    AF1 --> AF2["Disable Allow Shared Key authorization"]
    AD --> AG[Local Authentication Methods for Data Plane Access]
    AG --> AG1["Azure Storage Account Configuration"]
    AG1 --> AG2["Don't use SFTP if you don't need it"]
    AC --> AH(IM-3)
    AH --> AI[Manage application identities securely and automatically]
    AI --> AJ[Managed Identities]
    AI --> AK[Service Principals]
    AK --> AK1["Rotate or regenerate service principal credentials"]
    AC --> AL(IM-7)
    AL --> AM[Restrict resource access based on conditions]
    AM --> AN[Microsoft Entra Conditional Access for Data Plane]
    AC --> AO(IM-8)
    AO --> AP[Restrict the exposure of credential and secrets]    
    AP --> AQ[Service Credential and Secrets Support Integration and Storage in Azure Key Vault]    
    AQ --> AK1
    click AK1 "https://github.com/Azure-Samples/KeyVault-Rotation-StorageAccountKey-PowerShell" "Open this in a new tab" _blank

graph LR
AR[Logging and threat detection] -->AS(LT-1)
    AS --> AT[Enable threat detection capabilities]
    AT --> AU[Microsoft Defender for Service / Product Offering]
    AU --> AU1["Microsoft Defender for Storage"]
    AR --> AV(LT-4)
    AV --> AW[Enable network logging for security investigation]
    AW --> AX[Azure Resource Logs]
    AX --> AX1["Azure Monitor"]
    AX --> AX2["Azure Activity Log"]
    click AU1 "https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-introduction" "Open this in a new tab" _blank
graph LR
    AY[Network Security] -->AZ(NS-2)
    AZ --> BA[Secure cloud services with network controls]
    BA --> BB["Azure Storage Account Configuration"]
    BB --> BB1[Disable Public Network Access]
    BB --> BB2[Allow trusted Microsoft services to access the storage account]
    BA --> BC[Azure Private Link]
    BC --> BC1["Azure Private Endpoint"]
    BA --> BD[Azure Network]
    BD --> BD1["Azure Service Endpoint"]
    BA --> BE["Network Security Perimeter"]

graph LR
    BD[Privileged Access] -->BE(PA-7)
    BE --> BF["Follow just enough administration(least privilege) principle"]
    BF --> BG[Azure RBAC for Data Plane]
    BG --> BG1["Azure RBAC"]
    BG1 --> BG2["Azure RBAC Roles"]
    BD --> BH(PA-8)
    BH --> BI[Choose approval process for third-party support]
    BI --> BJ[Customer Lockbox]
click BG2 "https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles/storage" "Open this in a new tab" _blank

Example of mixed strategies for data protection in Azure storage accounts

The following example illustrates how to implement mixed strategies for data protection in Azure storage accounts:
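
Here is a minimal sketch using the Azure CLI that combines several of the controls above; the resource names, location, and retention values are assumptions, so adjust them to your environment. It creates a storage account with secure transfer enforced, a minimum TLS version, public blob access and Shared Key authorization disabled, and public network access blocked, and then turns on blob soft delete and versioning. Access would then go through a private endpoint (Azure Private Link) and Azure RBAC data-plane roles, as described in the MCSB table above.

# Create a hardened storage account (names and values are examples)
az storage account create \
  --name stdataprod001 \
  --resource-group rg-data-prod \
  --location westeurope \
  --sku Standard_GRS \
  --kind StorageV2 \
  --https-only true \
  --min-tls-version TLS1_2 \
  --allow-blob-public-access false \
  --allow-shared-key-access false \
  --public-network-access Disabled

# Protect blob data against accidental deletion or modification
az storage account blob-service-properties update \
  --account-name stdataprod001 \
  --resource-group rg-data-prod \
  --enable-delete-retention true \
  --delete-retention-days 14 \
  --enable-versioning true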

Conclusion

In conclusion, data threat modeling is an important process for identifying and addressing security vulnerabilities in Azure storage accounts. By identifying assets, threats, risks, and developing mitigation strategies, organizations can protect their data from security threats and ensure the security and integrity of their data assets. By following best practices and implementing security measures, organizations can prevent and detect data threats in Azure storage accounts and protect their data from security threats.

Tagging best practices in Azure

In this post, I will show you some best practices for tagging resources in Azure.

What are tags?

Tags are key-value pairs that you can assign to your Azure resources to organize and manage them more effectively. Tags allow you to categorize resources in different ways, such as by environment, owner, or cost center, and to apply policies and automation based on these categories.

If you don't know anything about tags, you can read the official documentation to learn more about them.
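
For example, with the Azure CLI you can assign tags when a resource is created and merge additional tags later (the names and tag values below are just examples):

# Create a resource group with tags
az group create --name rg-demo --location westeurope --tags env=dev costCenter=1234 owner=platform-team

# Merge an extra tag into the existing set without touching the others
az tag update --resource-id /subscriptions/<subscription-id>/resourceGroups/rg-demo \
  --operation Merge --tags project=finops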

Why use tags?

There are several reasons to use tags:

  • Organization: Tags allow you to organize your resources in a way that makes sense for your organization. You can use tags to group resources by environment, project, or department, making it easier to manage and monitor them.

  • Cost management: Tags allow you to track and manage costs more effectively. You can use tags to identify resources that are part of a specific project or department, and to allocate costs accordingly.

  • Automation: Tags allow you to automate tasks based on resource categories. You can use tags to apply policies, trigger alerts, or enforce naming conventions, making it easier to manage your resources at scale.

Best practices for tagging resources in Azure

Here are some best practices for tagging resources in Azure:

  • Use consistent naming conventions: Define a set of standard tags that you will use across all your resources. This will make it easier to search for and manage resources, and to apply policies and automation consistently.

  • Apply tags at resource creation: Apply tags to resources when you create them, rather than adding them later. This will ensure that all resources are tagged correctly from the start, and will help you avoid missing or incorrect tags.

  • Use tags to track costs: Use tags to track costs by project, department, or environment. This will help you allocate costs more effectively, and will make it easier to identify resources that are not being used or are costing more than expected.

  • Define tags by hierarchy: Define tags in a hierarchy that makes sense for your organization, from more general at the subscription level to more specific at the resource group level.

  • Use inherited tags: Use inherited tags to apply tags to resources automatically based on their parent resources. This will help you ensure that all resources are tagged consistently, and will reduce the risk of missing or incorrect tags. There are Azure Policy definitions to enforce inherited tags; for example, you can check them all in Assign policy definitions for tag compliance.

  • Don't use tags for policy filtering: If you use Azure Policy, it's highly recommended not to filter on tags in policy rules that relate to security settings; when you filter by tags, resources without the tag show up as compliant. Use Azure Policy exemptions or exclusions instead.

  • Don't use tags to fill naming convention gaps: Tags are not a replacement for naming conventions. Use tags to categorize resources, and use naming conventions to identify resources uniquely.

  • Use tags for automation: Use tags to trigger automation tasks, such as scaling, backup, or monitoring. You can use tags to define policies that enforce specific actions based on resource categories.

  • Don't go crazy adding tags: Don't add too many tags to your resources. Keep it simple and use tags that are meaningful and useful; too many tags can make resources difficult to manage. You can begin with a small set of tags and expand as needed, for example: Minimum Suggested Tags

  • Not all Azure services support tags: Keep in mind that not all Azure services support tags. You can check Tag support for Azure resources to see which services do.

Conclusion

By using tags, you can organize and manage your resources more effectively, track and manage costs more efficiently, and automate tasks based on resource categories. I hope this post has given you a good introduction to tagging best practices in Azure and how you can use tags to optimize your cloud environment.

Restrict managed disks from being imported or exported

In this post, I will show you how to restrict managed disks from being imported or exported in Azure.

What are managed disks?

Azure Managed Disks are block-level storage volumes that are managed by Azure and used with Azure Virtual Machines. Managed Disks are designed for high availability and durability, and they provide a simple and scalable way to manage your storage.

If you don't know anything about Azure Managed Disks, grab a cup of coffee (it will take you a while) and read the official documentation to learn more about them.

Why restrict managed disks from being imported or exported?

There are several reasons to restrict managed disks from being imported or exported:

  • Security: By restricting managed disks from being imported or exported, you can reduce the risk of unauthorized access to your data.
  • Compliance: By restricting managed disks from being imported or exported, you can help ensure that your organization complies with data protection regulations.

How to restrict managed disks from being imported or exported

At deployment time

An example with azcli:

Create a managed disk with public network access disabled
## Create a managed disk with public network access disabled
az disk create --resource-group myResourceGroup --name myDisk --size-gb 128 --location eastus --sku Standard_LRS --no-wait --public-network-access Disabled
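
If the disk already exists, you can apply the same restriction afterwards. A minimal sketch (DenyAll blocks both import and export; use AllowPrivate together with a disk access resource if you still need export over Private Link):

## Restrict an existing managed disk
az disk update --resource-group myResourceGroup --name myDisk --public-network-access Disabled --network-access-policy DenyAll
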
Create a managed disk with public network access disabled and private endpoint enabled

Follow Azure CLI - Restrict import/export access for managed disks with Private Links

At Scale

If you want to restrict managed disks from being imported or exported, you can use Azure Policy to enforce this restriction. Azure Policy is a service in Azure that you can use to create, assign, and manage policies that enforce rules and effects over your resources. By using Azure Policy, you can ensure that your resources comply with your organization's standards and service-level agreements.

To restrict managed disks from being imported or exported using Azure Policy, you can use or create a policy definition that specifies the conditions under which managed disks can be imported or exported. You can then assign this policy definition to a scope, such as a management group, subscription, or resource group, to enforce the restriction across your resources.

In this case, we have a built-in policy definition that restricts managed disks from being imported or exported: Configure disk access resources with private endpoints.
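
As an orientation, assigning that built-in definition with the Azure CLI could look like the sketch below. The assignment name and scope are examples, the --mi-system-assigned flag assumes a recent Azure CLI (DeployIfNotExists policies need a managed identity and a location), and you will also need to pass the parameters the definition requires (such as the subnet for the private endpoints) with --params:

# Look up the built-in definition by its display name and assign it
definition=$(az policy definition list --query "[?displayName=='Configure disk access resources with private endpoints'].name" -o tsv)
az policy assignment create \
  --name restrict-disk-export \
  --policy "$definition" \
  --scope /subscriptions/<subscription-id> \
  --mi-system-assigned \
  --location westeurope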

Conclusion

In this post, I showed you how to restrict managed disks from being imported or exported in Azure. By restricting managed disks from being imported or exported, you can reduce the risk of unauthorized access to your data and help ensure that your organization complies with data protection regulations.

Curiously, restricting managed disks from being imported or exported is not a compliance check in the Microsoft cloud security benchmark, but it is a good practice to follow.

Securely connect Power BI to data sources with a VNet data gateway

In this post, I will show you how to securely connect Power BI to your Azure data services using a Virtual Network (VNet) data gateway.

What is a Virtual Network (VNet) data gateway?

The virtual network (VNet) data gateway helps you to connect from Microsoft Cloud services to your Azure data services within a VNet without the need of an on-premises data gateway. The VNet data gateway securely communicates with the data source, executes queries, and transmits results back to the service.

The Role of a VNet Data Gateway

A VNet data gateway acts as a bridge that allows for the secure flow of data between the Power Platform and external data sources that reside within a virtual network. This includes services such as SQL databases, file storage solutions, and other cloud-based resources. The gateway ensures that data can be transferred securely and reliably, without exposing the network to potential threats or breaches.

How It Works

graph LR
    User([User]) -->|Semantic Models| SM[Semantic Models]
    User -->|"Dataflows (Gen2)"| DF["Dataflows (Gen2)"]
    User -->|Paginated Reports| PR[Paginated Reports]
    SM --> PPVS[Power Platform VNET Service]
    DF --> PPVS
    PR --> PPVS
    PPVS --> MCVG["Managed Container for VNet Gateway"]
    MCVG -->|Interfaces with| SQLDB[(SQL Databases)]
    MCVG -->|Interfaces with| FS[(File Storage)]
    MCVG -->|Interfaces with| CS[(Cloud Services)]
    MCVG -.->|Secured by| SEC{{Security Features}}
    subgraph VNET_123
        SQLDB
        FS
        CS
        SEC
    end
    classDef filled fill:#f96,stroke:#333,stroke-width:2px;
    classDef user fill:#bbf,stroke:#f66,stroke-width:2px,stroke-dasharray: 5, 5;
    class User user
    class SM,DF,PR,PPVS,MCVG,SQLDB,FS,CS,SEC filled

The process begins with a user leveraging Power Platform services like Semantic Models, Dataflows (Gen2), and Paginated Reports. These services are designed to handle various data-related tasks, from analysis to visualization. They connect to the Power Platform VNET Service, which is the heart of the operation, orchestrating the flow of data through the managed container for the VNet gateway.

This managed container is a secure environment specifically designed for the VNet gateway’s operations. It’s isolated from the public internet, ensuring that the data remains protected within the confines of the virtual network. Within this secure environment, the VNet gateway interfaces with the necessary external resources, such as SQL databases and cloud storage, all while maintaining strict security protocols symbolized by the padlock icon in our diagram.

If you need to connect to services on other VNets, you can use VNet peering to connect the VNets, and you can reach on-premises resources using ExpressRoute or other VPN solutions.

The Benefits

By utilizing a VNet data gateway, organizations can enjoy several benefits:

  • Enhanced Security: The gateway provides a secure path for data, safeguarding sensitive information and complying with organizational security policies.
  • Network Isolation: The managed container and the virtual network setup ensure that the data does not traverse public internet spaces, reducing exposure to vulnerabilities.
  • Seamless Integration: The gateway facilitates smooth integration between Power Platform services and external data sources, enabling efficient data processing and analysis.

Getting Started

To set up a VNet data gateway, follow these steps:

Register Microsoft.PowerPlatform as a resource provider

Before you can create a VNet data gateway, you need to register the Microsoft.PowerPlatform resource provider. This can be done using the Azure portal or the Azure CLI.

az provider register --namespace Microsoft.PowerPlatform
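
Registration is asynchronous, so you can check its state before moving on:

az provider show --namespace Microsoft.PowerPlatform --query registrationState --output tsv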

Associate the subnet to Microsoft Power Platform

Create a VNet in your Azure subscription and a subnet where the VNet data gateway will be deployed. Next, you need to delegate the subnet to the Microsoft.PowerPlatform/vnetaccesslinks service.

Note

  • This subnet can't be shared with other services.
  • Five IP addresses are reserved in the subnet for basic functionality. You need to reserve additional IP addresses for the VNet data gateway, and add more IPs for any future gateways.
  • You need a role with the Microsoft.Network/virtualNetworks/subnets/join/action permission

This can be done using the Azure portal or the Azure CLI.

# Create a VNet and address prefix 10.0.0.0/24
az network vnet create --name MyVNet --resource-group MyResourceGroup --location eastus --address-prefixes 10.0.0.0/24


# Create a Network Security Group
az network nsg create --name MyNsg --resource-group MyResourceGroup --location eastus

# Create a subnet with delegation to Microsoft.PowerPlatform/vnetaccesslinks and associate the NSG
az network vnet subnet create --name MySubnet --vnet-name MyVNet --resource-group MyResourceGroup --address-prefixes 10.0.0.0/27 --network-security-group MyNsg --delegations Microsoft.PowerPlatform/vnetaccesslinks

Create a VNet data gateway

Note

A Microsoft Power Platform user with the Microsoft.Network/virtualNetworks/subnets/join/action permission on the VNet is required. The Network Contributor role is not necessary.

  1. Sign in to the Power BI homepage.
  2. In the top navigation bar, select the settings gear icon on the right.
  3. From the drop down, select the Manage connections and gateways page, in Resources and extensions.
  4. Select Create a virtual network data gateway.
  5. Select the license capacity, subscription, resource group, VNet and the Subnet. Only subnets that are delegated to Microsoft Power Platform are displayed in the drop-down list. VNET data gateways require a Power BI Premium capacity license (A4 SKU or higher or any P SKU) or Fabric license to be used (any SKU).
  6. By default, we provide a unique name for this data gateway, but you could optionally update it.
  7. Select Save. This VNet data gateway is now displayed in your Virtual network data gateways tab. A VNet data gateway is a managed gateway that can be used to control access to this resource for Power Platform users.

Conclusion

The VNet data gateway is a powerful tool that enables secure data transfer between the Power Platform and external data sources residing within a virtual network. By leveraging this gateway, organizations can ensure that their data remains protected and compliant with security standards, all while facilitating seamless integration and analysis of data. If you are looking to enhance the security and reliability of your data connections, consider implementing a VNet data gateway in your environment.

FinOps for Azure: Optimizing Your Cloud Spend

In the cloud era, optimizing cloud spend has become a critical aspect of financial management. FinOps, a set of financial operations practices, empowers organizations to get the most out of their Azure investment. This blog post dives into the core principles of FinOps, explores the benefits it offers, and outlines practical strategies for implementing FinOps in your Azure environment.

Understanding the Cloud Cost Challenge

Traditional IT expenditure followed a capital expenditure (capex) model, where businesses purchased hardware and software upfront. Cloud computing introduces a paradigm shift with the operational expenditure (opex) model. Here, businesses pay for resources as they consume them, leading to variable and unpredictable costs.

FinOps tackles this challenge by providing a framework for managing cloud finances. It encompasses three key pillars:

  • People: Establish a FinOps team or designate individuals responsible for overseeing cloud costs. This team should possess a blend of cloud technical expertise, financial acumen, and business process knowledge.
  • Process: Define processes for budgeting, forecasting, and monitoring cloud expenses. This involves setting spending limits, creating chargeback models for different departments, and regularly reviewing cost reports.
  • Tools: Leverage Azure Cost Management, a suite of tools that provides granular insights into your Azure spending. It enables cost allocation by resource, service, department, or any other relevant dimension.

It's essential to adopt a FinOps mindset that encourages collaboration between finance, IT, and business teams to drive cost efficiency and value realization in the cloud.

It's important to note that FinOps is not just about cost-cutting; it's about optimizing cloud spending to align with business objectives and maximize ROI.

Azure Cost Management: Optimizing Your Azure Spending

Azure Cost Management empowers you to analyze your Azure spending patterns and identify cost-saving opportunities. Here's a glimpse into its key functionalities:

  • Cost Views: Generate comprehensive reports that categorize your Azure spending by various attributes like resource group, service, or department.
  • Cost Alerts: Set up proactive alerts to notify you when your spending exceeds predefined thresholds.
  • Reservations: Purchase reserved instances of frequently used Azure resources for significant upfront discounts.
  • Recommendations: Azure Cost Management analyzes your usage patterns and recommends potential cost-saving measures, such as rightsizing resources or leveraging spot instances.

The Power of Tags and Azure Policy

Tags are metadata labels that you can attach to your Azure resources to categorize and track them effectively. They play a pivotal role in FinOps by enabling you to:

  • Associate costs with specific departments, projects, or applications.
  • Identify unused or underutilized resources for potential cost savings.
  • Simplify cost allocation and chargeback processes.

Azure Policy helps enforce tagging standards and governance rules across your Azure environment. You can define policies that mandate specific tags for all resources, ensuring consistent cost allocation and data accuracy.
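
For example, a minimal sketch with the Azure CLI that assigns the built-in "Require a tag on resources" definition at the current subscription scope (the assignment name and tag name are assumptions):

# Find the built-in definition and assign it with the tag to require
definition=$(az policy definition list --query "[?displayName=='Require a tag on resources'].name" -o tsv)
az policy assignment create \
  --name require-costcenter-tag \
  --policy "$definition" \
  --params '{"tagName": {"value": "costCenter"}}'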

Benefits of Implementing FinOps

A well-defined FinOps strategy empowers you to:

  • Gain Visibility into Cloud Spending: Obtain a clear picture of your Azure expenditures, enabling informed budgeting and cost control decisions.
  • Optimize Cloud Costs: Identify and eliminate wasteful spending through cost-saving recommendations and proactive measures.
  • Improve Cloud Governance: Enforce tagging policies and spending limits to ensure responsible cloud resource utilization.
  • Align Cloud Spending with Business Value: Make data-driven decisions about cloud investments that support your business objectives.

Getting Started with FinOps in Azure

Implementing FinOps doesn't necessitate a complex overhaul. Here's a recommended approach:

  1. Establish a FinOps Team: Assemble a cross-functional team comprising representatives from finance, IT, and business departments.
  2. Set Clear Goals and Objectives: Define your FinOps goals, whether it's reducing costs by a specific percentage or improving budget forecasting accuracy.
  3. Leverage Azure Cost Management: Start by exploring Azure Cost Management to understand your current spending patterns.
  4. Implement Basic Tagging Standards: Enforce basic tagging policies to categorize your Azure resources for cost allocation purposes.
  5. Continuously Monitor and Refine: Regularly review your cloud cost reports and identify areas for further optimization.

By following these steps and embracing a FinOps culture, you can effectively manage your Azure expenses and derive maximum value from your cloud investment.

Toolchain for FinOps in Azure

To streamline your FinOps practices in Azure, consider leveraging the following tools:

graph LR
A[Financial Operations Practices] --> B{Cloud Spend Optimization}
B --> C{Cost Visibility}
B --> D{Cost Optimization}
B --> E{Governance}
C --> F{Azure Cost Management}
D --> G{Azure Advisor}
E --> H{Azure Policy}
F --> I{Cost Views}
F --> J{Cost Alerts}
G --> K{Cost Recommendations}
H --> L{Tag Enforcement}

This toolchain combines Azure Cost Management, Azure Advisor, and Azure Policy to provide a comprehensive suite of capabilities for managing your Azure spending effectively.

Highly recommended: check out the FinOps toolkit, a set of tools and best practices to help you implement FinOps in your organization. It includes tools for cost allocation, budgeting, and forecasting, as well as best practices for FinOps implementation.

Conclusion

FinOps is an essential practice for organizations leveraging Azure. It empowers you to make informed decisions about your cloud finances, optimize spending, and achieve your business goals. As an Azure Solutions Architect, I recommend that you establish a FinOps practice within your organization to unlock the full potential of Azure and achieve financial efficiency in the cloud.

This blog post provides a foundational understanding of FinOps in Azure.

By embracing FinOps, you can transform your cloud cost management practices and drive business success in the cloud era.

Reduce your attack surface in Snowflake when using from Azure

When it comes to data security, reducing your attack surface is a crucial step. This post will guide you on how to minimize your attack surface when using Snowflake with Azure or Power BI.

What is Snowflake?

Snowflake is a cloud-based data warehousing platform that allows you to store and analyze large amounts of data. It is known for its scalability, performance, and ease of use. Snowflake is popular among organizations that need to process large volumes of data quickly and efficiently.

What is an Attack Surface?

An attack surface refers to the number of possible ways an attacker can get into a system and potentially extract data. The larger the attack surface, the more opportunities there are for attackers. Therefore, reducing the attack surface is a key aspect of securing your systems.

How to Reduce Your Attack Surface in Snowflake:

  1. Use Azure Private Link: Azure Private Link provides private connectivity from a virtual network to Snowflake, isolating your traffic from the public internet. It significantly reduces the attack surface by ensuring that traffic between Azure and Snowflake doesn't traverse the public internet.

  2. Implement Network Policies: Snowflake allows you to define network policies that restrict access based on IP addresses. By limiting access to trusted IP ranges, you can reduce the potential points of entry for an attacker.

  3. Enable Multi-Factor Authentication (MFA): MFA adds an extra layer of security by requiring users to provide at least two forms of identification before accessing the Snowflake account. This makes it harder for attackers to gain unauthorized access, even if they have your password.

In this blog post, we will show you how to reduce your attack surface when using Snowflake from Azure or Power BI by using Azure Private Link

Default Snowflake architecture

By default, Snowflake is accessible from the public internet: anyone with the right credentials can access your Snowflake account from anywhere in the world. You can limit access to specific IP addresses, but this still exposes your Snowflake account to potential attackers.

The architecture of using Azure Private Link with Snowflake is as follows:

    graph TD
    A[Virtual Network] -->|Private Endpoint| B[Snowflake]
    B -->|Private Link Service| C[Private Link Resource]
    C -->|Private Connection| D[Virtual Network]
    D -->|Subnet| E[Private Endpoint]    

Architecture Components

Requirements

Before you can use Azure Private Link with Snowflake, you need to have the following requirements in place:

  • A Snowflake account with ACCOUNTADMIN privileges
  • Business Critical or higher Snowflake edition
  • An Azure subscription with a Resource Group and privileges to create:
    • Virtual Network
    • Subnet
    • Private Endpoint

Step-by-Step Guide

Note

Replace the placeholders with your actual values; this is an orientation guide.

Step 1: Retrieve Details of your Snowflake Account
USE ROLE ACCOUNTADMIN;
SELECT SYSTEM$GET_PRIVATELINK_CONFIG();
Step 2: Create a Virtual Network

A virtual network is a private network that allows you to securely connect your resources in Azure. To create a virtual network with azcli, follow these steps:

az network vnet create \
  --name myVnet \
  --resource-group myResourceGroup \
  --address-prefixes 10.0.0.0/24 \
  --subnet-name mySubnet \
  --subnet-prefixes 10.0.0.0/27

Private endpoint network policies on the subnet can be adjusted later with az network vnet subnet update if your setup requires it.
Step 3: Create a Private Endpoint

The first step is to create a private endpoint in Azure. A private endpoint is a network interface that connects your virtual network to the Snowflake service. This allows you to access Snowflake using a private IP address, rather than a public one.

To create a private endpoint with azcli, follow these steps:

az network private-endpoint create \
  --name mySnowflakeEndpoint \
  --resource-group myResourceGroup \
  --vnet-name myVnet \
  --subnet mySubnet \
  --private-connection-resource-id /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Network/privateLinkServices/<Snowflake-service-name> \
  --connection-name mySnowflakeConnection

Check the status of the private endpoint:

az network private-endpoint show \
  --name mySnowflakeEndpoint \
  --resource-group myResourceGroup
Step 4: Authorize the Private Endpoint

The next step is to authorize the private endpoint to access the Snowflake service.

Retrieve the Resource ID of the private endpoint:

az network private-endpoint show \
  --name mySnowflakeEndpoint \
  --resource-group myResourceGroup

Create a temporary access token that Snowflake can use to authorize the private endpoint:

az account get-access-token --subscription <subscription-id>

Authorize the private endpoint in Snowflake:

USE ROLE ACCOUNTADMIN;
select SYSTEM$AUTHORIZE_PRIVATELINK('<resource-id>', '<access-token>');

Step 5: Block Public Access

To further reduce your attack surface, you can block public access to your Snowflake account. This ensures that all traffic to and from Snowflake goes through the private endpoint, reducing the risk of unauthorized access.

To block public access to your Snowflake account, you need to use Network Policy, follow these steps:

USE ROLE ACCOUNTADMIN;
CREATE NETWORK RULE allow_access_rule
  MODE = INGRESS
  TYPE = IPV4
  VALUE_LIST = ('192.168.1.99/24');

CREATE NETWORK RULE block_access_rule
  MODE = INGRESS
  TYPE = IPV4
  VALUE_LIST = ('0.0.0.0/0');

CREATE NETWORK POLICY public_network_policy
  ALLOWED_NETWORK_RULE_LIST = ('allow_access_rule')
  BLOCKED_NETWORK_RULE_LIST = ('block_access_rule');

-- The policy only takes effect once it is activated at the account level
ALTER ACCOUNT SET NETWORK_POLICY = public_network_policy;

It's highly recommended to follow the best practices for network policies in Snowflake. You can find more information here: https://docs.snowflake.com/en/user-guide/network-policies#best-practices

Step 6: Test the Connection

To test the connection between your virtual network and Snowflake, you can use the SnowSQL client.

snowsql -a <account_name> -u <username> -r <role> -w <warehouse> -d <database> -s <schema>

Internal Stages with Azure Blob Private Endpoints

If you are using Azure Blob Storage as an internal stage in Snowflake, you can also use Azure Private Link to secure the connection between Snowflake and Azure Blob Storage.

It's recommended to use Azure Blob Storage with Private Endpoints to ensure that your data is secure and that you are reducing your attack surface, you can check the following documentation for more information: Azure Private Endpoints for Internal Stages to learn how to configure Azure Blob Storage with Private Endpoints in Snowflake.

Conclusion

Reducing your attack surface is a critical aspect of securing your systems. By using Azure Private Link with Snowflake, you can significantly reduce the risk of unauthorized access and data breaches. Follow the steps outlined in this blog post to set up Azure Private Link with Snowflake and start securing your data today.

Azure Container Registry: Artifact Cache

Azure Container Registry is a managed, private Docker registry service provided by Microsoft. It allows you to build, store, and manage container images and artifacts in a secure environment.

What is Artifact Caching?

Artifact Cache is a feature in Azure Container Registry that allows users to cache container images in a private container registry. It is available in Basic, Standard, and Premium service tiers.

Benefits of Artifact Cache

  • Reliable pull operations: Faster pulls of container images are achievable by caching the container images in ACR.
  • Private networks: Cached registries are available on private networks.
  • Ensuring upstream content is delivered: Artifact Cache allows users to pull images from the local ACR instead of the upstream registry.

Limitations

Caching only occurs after at least one pull of the source container image has completed.

How to Use Artifact Cache in Azure Container Registry without credentials?

Let's take a look at how you can implement artifact caching in Azure Container Registry.

Step 1: Create a Cache Rule

The first step is to create a cache rule in your Azure Container Registry. This rule specifies the source image that should be cached and the target image that will be stored in the cache.

az acr cache create -r MyRegistry -n MyRule -s docker.io/library/ubuntu -t ubuntu

Check the cache rule:

az acr cache show -r MyRegistry -n MyRule

Step 2: Pull the Image

Next, you need to pull the image from the source registry to the cache. This will download the image and store it in the cache for future use.

docker pull myregistry.azurecr.io/ubuntu:latest

Step 3: Clean up the resources

Finally, you can clean up the resources by deleting the cache rule.

az acr cache delete -r MyRegistry -n MyRule

If you need to check other rules, you can use the following command:

az acr cache list -r MyRegistry

Conclusion

Azure Container Registry's Artifact Cache feature provides a convenient way to cache container images in a private registry, improving pull performance and reducing network traffic. By following the steps outlined in this article, you can easily set up and use artifact caching in your Azure Container Registry.

If you need to use the cache with authentication, you can use the following article: Enable Artifact Cache with authentication.

For more detailed information, please visit the official tutorial on the Microsoft Azure website.

How to create a Management Group diagram with draw.io

I needed to create a diagram of the Management Groups in Azure, and I remembered a project that did something similar but with PowerShell: https://github.com/PowerShellToday/new-mgmgroupdiagram.

Export your Management Group structure from Azure Portal or ask for it

If you can access the Azure Portal, you can export the Management Group structure to a CSV file. To do this, follow these steps:

  1. Go to the Azure portal.
  2. Navigate to Management groups.
  3. Click on Export.
  4. Save the CSV file to your local machine.

If you don't have access to the Azure Portal, you can ask your Azure administrator to export the Management Group structure for you.
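
If you have the Azure CLI available, you can also pull the hierarchy yourself. The output shape differs from the portal CSV export, so you would still need to massage it into the columns described below:

# List the Management Groups you can see
az account management-group list --output table

# Show one Management Group with its children expanded recursively
az account management-group show --name <management-group-id> --expand --recurse --output json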

The exported CSV file has the following columns:

  • id: The unique identifier of the Management Group or subscription.
  • displayName: The name of the Management Group or subscription.
  • itemType: The type of the item (Management Group or subscription).
  • path: The path to the Management Group or subscription, i.e. its parent chain.
  • accessLevel: Your access level.
  • childSubscriptionCount: The number of child subscriptions at this level.
  • totalSubscriptionCount: The total number of subscriptions.

Create a CSV to be imported into draw.io

  1. Import the CSV file into Excel and rename the sheet to "Export_Portal".
  2. Create a second sheet with the following columns:
    • id: reference to the id in the first sheet
    • displayName: reference to the displayName in the first sheet
    • itemType: reference to the itemType in the first sheet
    • ParentId: Use the following formula to get the parent of the current item:
      =IF(ISERROR(FIND(","; Export_Portal!D2)); Export_Portal!D2; TRIM(RIGHT(SUBSTITUTE(Export_Portal!D2; ","; REPT(" "; LEN(Export_Portal!D2))); LEN(Export_Portal!D2))))
      
  3. Export the second sheet to a CSV file.

Import the CSV file into draw.io

  1. Go to draw.io and create a new diagram.
  2. Click on Arrange > Insert > Advanced > CSV.
  3. Insert the header for the columns: id, displayName, itemType, ParentId:

        #label: %displayName%
        #stylename: itemType
        #styles: {"Management Group": "label;image=img/lib/azure2/general/Management_Groups.svg;whiteSpace=wrap;html=1;rounded=1; fillColor=%fill%;strokeColor=#6c8ebf;fillColor=#dae8fc;points=[[0.5,0,0,0,0],[0.5,1,0,0,0]];",\
        #"Subscription": "label;image=img/lib/azure2/general/Subscriptions.svg;whiteSpace=wrap;html=1;rounded=1; fillColor=%fill%;strokeColor=#d6b656;fillColor=#fff2cc;points=[[0.5,0,0,0,0],[0.5,1,0,0,0]];imageWidth=26;"}
        #
        #
        #namespace: csvimport-
        #
        #connect: {"from": "ParentId", "to": "displayName", "invert": true, "style": "curved=1;endArrow=blockThin;endFill=1;fontSize=11;edgeStyle=orthogonalEdgeStyle;"}
        #
        ## Node width and height, and padding for autosize
        #width: auto
        #height: auto
        #padding: -12
        #
        ## ignore: id,image,fill,stroke,refs,manager
        #
        ## Column to be renamed to link attribute (used as link).
        ## link: url
        #
        ## Spacing between nodes, hierarchical levels and parallel connections.
        #nodespacing: 40
        #levelspacing: 100
        #edgespacing: 40
        #
        ## layout: auto
        #layout: verticaltree
        #
        ## ---- CSV below this line. First line are column names. ----
    
  4. Paste the content of the CSV file and click on Import.

You should see a diagram with the Management Groups and Subscriptions.

For example:

This is the common structure for the Management Groups in the Enterprise-Scale Landing Zone (now the Azure Landing Zone Accelerator):

    graph TD
        A[Root Management Group] --> B[Intermediary-Management-Group]
        B --> C[Decommissioned]
        B --> D[Landing Zones]
        B --> E[Platform]
        B --> F[Sandboxes]
        D --> G[Corp]
        D --> H[Online]
        E --> I[Connectivity]
        E --> J[Identity]
        E --> K[Management]        

And this is the CSV file to import into draw.io:

#label: %displayName%
#stylename: itemType
#styles: {"Management Group": "label;image=img/lib/azure2/general/Management_Groups.svg;whiteSpace=wrap;html=1;rounded=1; fillColor=%fill%;strokeColor=#6c8ebf;fillColor=#dae8fc;points=[[0.5,0,0,0,0],[0.5,1,0,0,0]];",\
#"Subscription": "label;image=img/lib/azure2/general/Subscriptions.svg;whiteSpace=wrap;html=1;rounded=1; fillColor=%fill%;strokeColor=#d6b656;fillColor=#fff2cc;points=[[0.5,0,0,0,0],[0.5,1,0,0,0]];imageWidth=26;"}
#
#
#namespace: csvimport-
#
#connect: {"from": "ParentId", "to": "displayName", "invert": true, "style": "curved=1;endArrow=blockThin;endFill=1;fontSize=11;edgeStyle=orthogonalEdgeStyle;"}
#
## Node width and height, and padding for autosize
#width: auto
#height: auto
#padding: -12
#
## ignore: id,image,fill,stroke,refs,manager
#
## Column to be renamed to link attribute (used as link).
## link: url
#
## Spacing between nodes, hierarchical levels and parallel connections.
#nodespacing: 40
#levelspacing: 100
#edgespacing: 40
#
## layout: auto
#layout: verticaltree
#
## ---- CSV below this line. First line are column names. ----
id,displayName,itemType,ParentId
1,Tenant Root Group,Management Group,
2,Intermediary Management Group,Management Group,Tenant Root Group
3,Decommissioned,Management Group,Intermediary Management Group
4,Landing Zones,Management Group,Intermediary Management Group
5,Platform,Management Group,Intermediary Management Group
6,Sandboxes,Management Group,Intermediary Management Group
7,Corp,Management Group,Landing Zones
8,Online,Management Group,Landing Zones
9,Connectivity,Management Group,Platform
10,Identity,Management Group,Platform
11,Management,Management Group,Platform
12,subcr-1,Subscription,Decommissioned
13,subcr-2,Subscription,Sandboxes
14,subcr-3,Subscription,Corp
15,subcr-4,Subscription,Online
16,subcr-5,Subscription,Connectivity
17,subcr-6,Subscription,Identity
18,subcr-7,Subscription,Management

Make your diagram animated and interactive

You can make your diagram animated and interactive by following these steps:

  1. File > Export as > URL
  2. Add p=ex& after the first ? in the URL.

For example, the URL should look like this:

https://viewer.diagrams.net/?p=ex&tags=%7B%7D&highlight=0000ff&layers=1&nav=1&title=MGs.drawio#R7Zxbc5s4FMc%2FjR%2BbAQkEPK7dJHWn3XbW6exMX3ZkkLFakDxCvvXTr7jFxrZi1k0Wg5lxYnR0QfqfHxqOBB7AUbx5FHgx%2F8wDEg2AEWwG8P0AAGAgQ32llm1uMU0X5pZQ0KCw7QwT%2BosUxqJiuKQBSSoFJeeRpIuq0eeMEV9WbFgIvq4Wm%2FGoetYFDsmRYeLj6Nj6Nw3kvBwG8nYZHwgN58WpXeDkGTEuCxcjSeY44Os9E7wfwJHgXOZH8WZEolS9Upe83oMm97ljgjB5osK3hIgv0x%2BpJsCI8FQ5Jis0AHZAk0WEt3%2FimKhU3k7Z7HBLxq7x42mcwOHy%2B0dn9PCRvIOFz%2FYrZqWfCMMsPcFf6TCA8Sj4clE0KEn8tF0UBT9jpoSKCTso9TXr%2Fjgo%2B5YNKcI%2BmStHEbHvvKoMidyWLioGN6Rx7ksah0qFiE7Vf%2FxrKUgqSkgYEVjVfNh15Z%2BsI8ldsgpV9fVcdXmyUOdWbawVzso2l3Eqm6kOVVEWkKBIpSjRKBrxiIusF3aaTNWEw0QK%2FpPssiDyXTKdqZxqFRhg4s58ZV9wymQ2VnuoPsadcsrIqPzZqsLoOc88zFOftFOpfCsiJNloIdnX8pHwmEixVUWKCpYD7goituU1nCfXO%2FqB4ea2%2BR74dmHDhc%2FC57Z3TKqDwoFlco%2FS16fW0lA7ZpKImAQUZ0PXsXkJwS9cED3WDWINnQOsTegec%2B20g2tbw%2FV74vM4pklCOVMuvRjimtdHD3RzQJezcjlJ2%2BiIZtNrB81IQ%2FMnzALKFCLGd4Vz0uPcYZyhZ57nuSWzs6Ph%2BWuE5YyLuAe5wyB7rn0eZKsdILsakCdqYp7yze9MySfn9p7bBu8nAKxwCx3rmFvUDm49DbcjLn4jsuuRvTJkLWieRxa0AlnL0CD7hUWUkR7azkBre955aGE7oDW182y2FE9XVG4vR7d6u9wz2xyzjuN05d7AAhpmx4ESo%2Be1E7wexmBtnmN1G2%2F7TPbEtp1Y00Q1wq92rH9Zul23ZDn1xTvzFK%2BT5TTxBV1IytkpUk9tbLwxr%2FtdentWAzRFNjpmdTabAf8VWc0GXz7HANAr3dPWmG5bEofpdtZydsEl7B6slvXYXgu2wKrOuZYHWsutbg8t5xZewu1utaxH9lqQPVzyajOyum2yHFnrEmT3V8t6aK8FWmRYnYFWtyWWQ2tfAu3xalmP7rWg63hOZ9DV7Yrl6KJL0K0umvXYXgu2nmd0BVtbtzOWY%2Btcgu3h2lkP7rWAa5qu2yS5Ry9ZnGayGNcKR0tSog2qjPhLsXp2IGHBH%2BmrKCo5jbj%2F82lOWW5%2BSB2ZF5pxJovXX8ysUhCSSdEgF3LOQ85wdL%2BzFoKn5V6WW3WML4VPXhhS%2BYKHxCIk8qWC1mkHChJhdQ9T7cl%2FcEZN5UH3lLfqKm83qjy8YeVRo8pbN6y806jydveUR3WVdxtVHt2w8l6jyju3q3z%2BFF5jyrvdU96pq7wmLPiflPduWHnQpPLlrwPcpPKwUeU7GMPadZVvNIZFHYxh3brKNxrDog7GsF5d5RuNYVH3YtjylY3zyjcaw6LuxbDlewfnlW80hkXdi2HLp%2BfPK99oDIu6F8Naddfn7TeKYVVy90tVWd7eD37B%2B38B#%7B%22pageId%22%3A%22UGUHswWqf16rUITyRAQM%22%7D

You can check the example diagram using the URL above.

Moving Management Groups and Subscriptions

Managing your Azure resources efficiently often involves moving management groups and subscriptions. Here's a brief guide on how to do it:

Moving Management Groups

To move a management group, you need the necessary permissions: management group write and role assignment write permissions on the group you want to move (for example, the Owner role), and management group write access on the target parent management group (for example, the Management Group Contributor role).

Here's the step-by-step process:

  1. Navigate to the Azure portal.
  2. Go to Management groups.
  3. Select the management group you want to move.
  4. Click Details.
  5. Under Parent group, click Change.
  6. Choose the new parent group from the list and click Save.

Remember, moving a management group will also move all its child resources including other management groups and subscriptions.
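
The same move can be scripted instead of using the portal. A minimal sketch with the Az PowerShell module, using hypothetical group names and assuming Update-AzManagementGroup from Az.Resources is available (-ParentId takes the full resource ID of the new parent group):

# Minimal sketch: move the management group 'MyChildGroup' under 'MyNewParentGroup' (example names)
Update-AzManagementGroup -GroupName 'MyChildGroup' `
    -ParentId '/providers/Microsoft.Management/managementGroups/MyNewParentGroup'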

Moving Subscriptions

You can move a subscription from one management group to another. To do this, you must have the Owner or Contributor role at the target management group and the Owner role at the subscription level.

Follow these steps:

  1. Go to the Azure portal.
  2. Navigate to Management groups.
  3. Select the management group where the subscription currently resides.
  4. Click on Subscriptions.
  5. Find the subscription you want to move and select "..." (More options).
  6. Click Change parent.
  7. In the pop-up window, select the new parent management group and click Save.
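
This move can also be scripted. A minimal sketch with the Az PowerShell module, using example values and assuming New-AzManagementGroupSubscription from Az.Resources is available; associating the subscription with the target group moves it away from its current parent:

# Minimal sketch: move a subscription to the target management group (example name and ID)
New-AzManagementGroupSubscription -GroupName 'MyTargetGroup' `
    -SubscriptionId '00000000-0000-0000-0000-000000000000'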

Note

Moving subscriptions could affect the resources if there are policies or permissions applied at the management group level. It's important to understand the implications before making the move. Also, keep in mind that you cannot move or delete the Root management group, although its display name can be changed.

In conclusion, moving management groups and subscriptions allows for better organization and management of your Azure resources. However, it should be done carefully considering the impact on resources and compliance with assigned policies.