How to check if a role permission is good or bad in Azure RBAC

Do you need to check whether a role permission is valid, or find out which actions a provider exposes in Azure RBAC?

Get-AzProviderOperation is your friend, and you can always export everything to CSV:

Get-AzProviderOperation | Select-Object Operation, OperationName, ProviderNamespace, ResourceName, Description, IsDataAction | Export-Csv -Path .\AzProviderOperation.csv -NoTypeInformation

This command will give you a list of all the operations that you can perform on Azure resources, including the operation name, provider namespace, resource name, description, and whether it is a data action or not. You can use this information to check if a role permission is good or bad, or to find out what actions a provider has in Azure RBAC.
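
For example, you can filter the list to a single provider before exporting. Here Microsoft.Storage is just an illustration; swap in whichever provider you are reviewing. This sketch assumes the Az.Resources module and an authenticated session (Connect-AzAccount):

```powershell
# Show only the operations exposed by one provider (Microsoft.Storage as an example)
Get-AzProviderOperation Microsoft.Storage/* |
    Select-Object Operation, IsDataAction, Description |
    Sort-Object Operation |
    Format-Table -AutoSize
```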

Script to check if a role permission is good or bad in Terraform files

You can use the following script to check if a role permission is good or bad in your Terraform (.tf and .tfvars) files:

<#
.SYNOPSIS
Script to check if a role permission is good or bad in Azure RBAC using Terraform files.

.DESCRIPTION
This script downloads Azure provider operations to a CSV file, reads the CSV file, extracts text from Terraform (.tf and .tfvars) files, and compares the extracted text with the CSV data to find mismatches.

.PARAMETER csvFilePath
The path to the CSV file where Azure provider operations will be downloaded.

.PARAMETER tfFolderPath
The path to the folder containing Terraform (.tf and .tfvars) files.

.PARAMETER DebugMode
Switch to enable debug mode for detailed output.

.EXAMPLE
.\Check-RBAC.ps1 -csvFilePath ".\petete.csv" -tfFolderPath ".\"

.EXAMPLE
.\Check-RBAC.ps1 -csvFilePath ".\petete.csv" -tfFolderPath ".\" -DebugMode

.NOTES
For more information, refer to the following resources:
- Azure RBAC Documentation: https://docs.microsoft.com/en-us/azure/role-based-access-control/
- Get-AzProviderOperation Cmdlet: https://docs.microsoft.com/en-us/powershell/module/az.resources/get-azprovideroperation
- Export-Csv Cmdlet: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/export-csv
#>

param(
    [string]$csvFilePath = ".\petete.csv",
    [string]$tfFolderPath = ".\",
    [switch]$DebugMode
)

# Download petete.csv using Get-AzProviderOperation
function Download-CSV {
    param(
        [string]$filename,
        [switch]$DebugMode
    )
    if ($DebugMode) { Write-Host "Downloading petete.csv using Get-AzProviderOperation" }
    Get-AzProviderOperation | Select-Object Operation, OperationName, ProviderNamespace, ResourceName, Description, IsDataAction | Export-Csv -Path $filename -NoTypeInformation
    if ($DebugMode) { Write-Host "CSV file downloaded: $filename" }
}

# Function to read the CSV file
function Read-CSV {
    param(
        [string]$filename,
        [switch]$DebugMode
    )
    if ($DebugMode) { Write-Host "Reading CSV file: $filename" }
    $csv = Import-Csv -Path $filename
    $csvData = $csv | ForEach-Object {
        [PSCustomObject]@{
            Provider = $_.Operation.Split('/')[0].Trim()
            Operation = $_.Operation
            OperationName = $_.OperationName
            ProviderNamespace = $_.ProviderNamespace
            ResourceName = $_.ResourceName
            Description = $_.Description
            IsDataAction = $_.IsDataAction
        }
    }
    if ($DebugMode) { Write-Host "Data read from CSV:"; $csvData | Format-Table -AutoSize }
    return $csvData
}

# Function to extract text from the Terraform files
function Extract-Text-From-TF {
    param(
        [string]$folderPath,
        [switch]$DebugMode
    )
    if ($DebugMode) { Write-Host "Reading TF and TFVARS files in folder: $folderPath" }
    $tfTexts = @()
    # -Filter only accepts a single pattern, so use -Include with a wildcard path instead
    $files = Get-ChildItem -Path (Join-Path $folderPath '*') -Include *.tf, *.tfvars
    foreach ($file in $files) {
        $content = Get-Content -Path $file.FullName
        $tfTexts += $content | Select-String -Pattern '"Microsoft\.[^"]*"' -AllMatches | ForEach-Object { $_.Matches.Value.Trim('"').Trim() }
        $tfTexts += $content | Select-String -Pattern '"\*"' -AllMatches | ForEach-Object { $_.Matches.Value.Trim('"').Trim() }
        $tfTexts += $content | Select-String -Pattern '^\s*\*/' -AllMatches | ForEach-Object { $_.Matches.Value.Trim() }
    }
    if ($DebugMode) { Write-Host "Texts extracted from TF and TFVARS files:"; $tfTexts | Format-Table -AutoSize }
    return $tfTexts
}

# Function to compare extracted text with CSV data
function Compare-Text-With-CSV {
    param(
        [array]$csvData,
        [array]$tfTexts,
        [switch]$DebugMode
    )
    $mismatches = @()
    foreach ($tfText in $tfTexts) {
        if ($tfText -eq "*" -or $tfText -match '^\*/') {
            continue
        }
        $tfTextPattern = $tfText -replace '\*', '.*'
        if (-not ($csvData | Where-Object { $_.Operation -match "^$tfTextPattern$" })) {
            $mismatches += $tfText
        }
    }
    if ($DebugMode) { Write-Host "Mismatches found:"; $mismatches | Format-Table -AutoSize }
    return $mismatches
}

# Main script execution
Download-CSV -filename $csvFilePath -DebugMode:$DebugMode
$csvData = Read-CSV -filename $csvFilePath -DebugMode:$DebugMode
$tfTexts = Extract-Text-From-TF -folderPath $tfFolderPath -DebugMode:$DebugMode
$mismatches = Compare-Text-With-CSV -csvData $csvData -tfTexts $tfTexts -DebugMode:$DebugMode

if ($mismatches.Count -eq 0) {
    Write-Host "All extracted texts match the CSV data."
} else {
    Write-Host "Mismatches found:"
    $mismatches | Format-Table -AutoSize
}

This script downloads Azure provider operations to a CSV file, reads the CSV file, extracts text from Terraform files, and compares the extracted text with the CSV data to find mismatches. You can use this script to check if a role permission is good or bad in Azure RBAC using Terraform files.

I hope this post has given you a good introduction to how you can check if a role permission is good or bad in Azure RBAC and how you can use Terraform files to automate this process.

Happy coding! 🚀

terraform import block

Sometimes you need to import existing infrastructure into Terraform. This is useful when you have existing resources that you want to manage with Terraform, or when you want to migrate from another tool to Terraform.

Other times, you may need to import resources that were created outside of Terraform, such as manually created resources or resources created by another tool. For example:

"Error: unexpected status 409 (409 Conflict) with error: RoleDefinitionWithSameNameExists: A custom role with the same name already exists in this directory. Use a different name"

In my case, I had to import a custom role that was created outside of Terraform. Here's how I did it:

  1. Create a new Terraform configuration file for the resource you want to import. In my case, I created a new file called custom_role.tf with the following content:
resource "azurerm_role_definition" "custom_role" {
  name        = "CustomRole"
  scope       = "/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000"
  permissions {
    actions     = [
      "Microsoft.Storage/storageAccounts/listKeys/action",
      "Microsoft.Storage/storageAccounts/read"
    ]

    data_actions = []

    not_data_actions = []
  }
  assignable_scopes = [
    "/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000"
  ]
}
  2. Add an import block to the configuration file with the resource type and name you want to import. In my case, I added the following block to the custom_role.tf file:
import {
  to = azurerm_role_definition.custom_role
  id = "/providers/Microsoft.Authorization/roleDefinitions/11111111-1111-1111-1111-111111111111|/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000"

}
  3. Run the terraform plan command to see the changes that Terraform will make to the resource. In my case, the output looked like this:
.
.
.
Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
  ~ update in-place
.
.
.
  4. Run the terraform apply command to import the resource into Terraform. In my case, the output looked like this after a long 9 minutes:
...
Apply complete! Resources: 1 imported, 0 added, 1 changed, 0 destroyed.
  5. Verify that the resource was imported successfully by running the terraform show command. In my case, the output looked like this:
terraform show

You can use the terraform import command to import existing infrastructure into Terraform too, but I prefer the import block because it's more readable and easier to manage.

With terraform import the command would look like this:

terraform import azurerm_role_definition.custom_role "/providers/Microsoft.Authorization/roleDefinitions/11111111-1111-1111-1111-111111111111|/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000"

Conclusion

That's it! You've successfully imported an existing resource into Terraform. Now you can manage it with Terraform just like any other resource.

Happy coding! 🚀

Microsoft Entra ID Protection

Microsoft Entra offers a comprehensive set of security features to protect your organization's data and resources. One of these features is ID Protection, which helps you secure your users' identities and prevent unauthorized access to your organization's data. Here are some key benefits of using ID Protection in Microsoft Entra:

  • Multi-factor authentication (MFA): ID Protection enables you to enforce multi-factor authentication for all users in your organization. This adds an extra layer of security to your users' accounts and helps prevent unauthorized access.

  • Conditional access policies: With ID Protection, you can create conditional access policies that define the conditions under which users can access your organization's resources. For example, you can require users to use multi-factor authentication when accessing sensitive data or restrict access to certain applications based on the user's location.

  • Risk-based policies: ID Protection uses advanced machine learning algorithms to detect suspicious activities and risky sign-in attempts. You can create risk-based policies that automatically block or allow access based on the risk level associated with the sign-in attempt.

  • Identity protection reports: ID Protection provides detailed reports and insights into your organization's identity security posture. You can use these reports to identify security risks, monitor user activity, and take proactive measures to protect your organization's data.

By using ID Protection in Microsoft Entra, you can enhance the security of your organization's data and resources and protect your users' identities from cyber threats. If you want to learn more about ID Protection and other security features in Microsoft Entra, contact us today!

I hope this helps!

Microsoft Entra Attribute Duplicate Attribute Resiliency

The Microsoft Entra Duplicate Attribute Resiliency feature is being rolled out as the default behavior of Microsoft Entra ID. It reduces the number of synchronization errors seen by Microsoft Entra Connect (as well as other sync clients) by making Microsoft Entra ID more resilient in the way it handles duplicated ProxyAddresses and UserPrincipalName attributes present in on-premises AD environments. This feature does not fix the duplication errors, so the data still needs to be fixed, but it allows provisioning of new objects which are otherwise blocked from being provisioned due to duplicated values in Microsoft Entra ID. It also reduces the number of synchronization errors returned to the synchronization client. If this feature is enabled for your tenant, you will not see the InvalidSoftMatch synchronization errors seen during provisioning of new objects.

Behavior with Duplicate Attribute Resiliency

graph TD
    A[Start] --> B[Provision or Update Object]
    B --> C{Duplicate Attribute?}
    C -- Yes --> D[Quarantine Duplicate Attribute]
    D --> E{Is Attribute Required?}
    E -- Yes --> F[Assign Placeholder Value]
    F --> G[Send Error Report Email]
    E -- No --> H[Proceed with Object Creation/Update]
    H --> G
    G --> I[Export Succeeds]
    I --> J[Sync Client Does Not Log Error]
    J --> K[Sync Client Does Not Retry Operation]
    K --> L[Background Timer Task Every Hour]
    L --> M[Check for Resolved Conflicts]
    M --> N[Remove Attributes from Quarantine]
    C -- No --> H

Differences between B2B Direct Connect and B2B Collaboration

Microsoft Entra offers two ways to collaborate with external users: B2B Direct Connect and B2B Collaboration. Both features allow organizations to share resources with external users while maintaining control over access and security. However, they differ in functionality, access, and integration. Here is a comparison between B2B Direct Connect and B2B Collaboration:

| Feature | B2B Direct Connect | B2B Collaboration |
|---|---|---|
| Definition | Mutual trust relationship between two Microsoft Entra organizations | Invite external users to access resources using their own credentials |
| Functionality | Seamless collaboration using origin credentials and shared channels in Teams | External users receive an invitation and access resources after authentication |
| Applications | Shared channels in Microsoft Teams | Wide range of applications and services within the Microsoft ecosystem |
| Access | Single sign-on (SSO) with origin credentials | Authentication each time resources are accessed, unless direct federation is set up |
| Integration | Deep and continuous integration between two organizations | Flexible way to invite and manage external users |

I hope this helps!

Microsoft Defender for Storage

Microsoft Defender for Storage is part of the Microsoft Defender for Cloud suite of security solutions.

Introduction

Microsoft Defender for Storage is a cloud-native security solution that provides advanced threat protection for your Azure Storage accounts.

Microsoft Defender for Storage provides comprehensive security by analyzing the data plane and control plane telemetry generated by Azure Blob Storage, Azure Files, and Azure Data Lake Storage services. It uses advanced threat detection capabilities powered by Microsoft Threat Intelligence, Microsoft Defender Antivirus, and Sensitive Data Discovery to help you discover and mitigate potential threats.

Defender for Storage includes:

  • Activity Monitoring
  • Sensitive data threat detection (new plan only)
  • Malware Scanning (new plan only)

How it works

Microsoft Defender for Storage uses advanced threat detection capabilities powered by Microsoft Threat Intelligence, Microsoft Defender Antivirus, and Sensitive Data Discovery to help you discover and mitigate potential threats.

Activity Monitoring

Activity Monitoring provides insights into the operations performed on your storage accounts. It helps you understand the access patterns and operations performed on your storage accounts, and provides insights into the data plane and control plane activities.

Sensitive data threat detection

Sensitive data threat detection helps you discover and protect sensitive data stored in your storage accounts. It uses advanced machine learning models to detect sensitive data patterns and provides recommendations to help you protect your sensitive data.

Malware Scanning

Malware Scanning helps you detect and mitigate malware threats in your storage accounts. It uses advanced threat detection capabilities powered by Microsoft Defender Antivirus to scan your storage accounts for malware threats and provides recommendations to help you mitigate these threats.
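
As a sketch of how to turn this on: the new Defender for Storage plan (which includes Malware Scanning) can be enabled at the subscription level with the Azure CLI. The subplan value below is the one documented for the new per-storage-account plan; verify it against the current documentation before relying on it:

```shell
# Enable the new Defender for Storage plan (includes Malware Scanning)
# for the currently selected subscription
az security pricing create \
    --name StorageAccounts \
    --tier Standard \
    --subplan DefenderForStorageV2
```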

Pricing

The pricing for Microsoft Defender for Storage is as follows:

| Resource Type | Resource | Price |
|---|---|---|
| Storage | Microsoft Defender for Storage | €9 per storage account/month |
| Storage | Malware Scanning (add-on to Defender for Storage) | €0.135/GB of data scanned |
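
As a quick back-of-the-envelope example using those list prices (regional prices and currency may differ), a subscription with 10 protected storage accounts that scans 200 GB of uploads in a month would cost:

```powershell
# Hypothetical monthly estimate based on the list prices above
$accounts  = 10      # protected storage accounts
$scannedGB = 200     # GB scanned by Malware Scanning
$monthly   = ($accounts * 9) + ($scannedGB * 0.135)
$monthly   # 117 (EUR/month)
```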

For more information about pricing, see the Microsoft Defender for Cloud pricing page.

Conclusion

Microsoft Defender for Storage is a cloud-native security solution that provides advanced threat protection for your Azure Storage accounts. It uses advanced threat detection capabilities powered by Microsoft Threat Intelligence, Microsoft Defender Antivirus, and Sensitive Data Discovery to help you discover and mitigate potential threats.

For more information about Microsoft Defender for Storage, see the Overview of Microsoft Defender for Storage

markmap

markmap is a visualisation tool that allows you to create mindmaps from markdown files. It is built on top of the D3 library and can be used to create a visual representation of a markdown file.

Installation in mkdocs

To install markmap in mkdocs, you need to install the plugin using pip:

pip install mkdocs-markmap

Then, you need to add the following lines to your mkdocs.yml file:

plugins:
  - markmap

Usage

To use markmap, you need to add the following code block to your markdown file:

```markmap  
# Root

## Branch 1

* Branchlet 1a
* Branchlet 1b

## Branch 2

* Branchlet 2a
* Branchlet 2b
```

And this will generate a mindmap rendering of those headings and bullets.

That is for the future, though, because in my mkdocs it does not work as expected; this is what renders instead:

# Root

## Branch 1

* Branchlet 1a
* Branchlet 1b

## Branch 2

* Branchlet 2a
* Branchlet 2b

Visual Studio Code Extension

There is also a Visual Studio Code extension that allows you to create mindmaps from markdown files. You can install it from the Visual Studio Code marketplace.

    Name: Markdown Preview Markmap Support
    Id: phoihos.markdown-markmap
    Description: Visualize Markdown as Mindmap (A.K.A Markmap) to VS Code's built-in markdown preview
    Version: 1.4.6
    Publisher: phoihos
    VS Marketplace Link: https://marketplace.visualstudio.com/items?itemName=phoihos.markdown-markmap

Conclusion

I don't like this plugin too much because it doesn't work as expected in my mkdocs setup, but it's still a good tool for documentation.

When to use service principals

In Azure, there are two ways to authenticate your applications and services: service principals and managed identities.

Basically, don't use service principals if you can use managed identities. Managed identities are a more secure way to authenticate your applications and services in Azure. However, there are some scenarios where you need to use service principals instead of managed identities. Here are some scenarios where you should use service principals:

flowchart TD
    A[When to use service principals] --> B{Is your application or service running on-premises?}
    B --> |Yes| C{Is Azure Arc supported?}
    C --> |Yes| D[Use Managed Identity]
    C --> |No| E[Use Service Principal]
    B --> |No| F{Do the resources or applications support managed identities?}
    F --> |No| G[Use Service Principal]
    F --> |Yes| H[Use Managed Identity]

Conclusion

In conclusion, you should use managed identities whenever possible. However, if your application or service is running on-premises or if the resources or applications do not support managed identities, you should use service principals instead.

Microsoft Entra Conditional Access

In this post, I will show you how to configure Conditional Access in Microsoft Entra.

What is Conditional Access?

Conditional Access is a feature of Microsoft Entra that allows you to control access to your organization's resources based on specific conditions. With Conditional Access, you can enforce policies that require users to meet certain criteria before they can access resources, such as multi-factor authentication, device compliance, or location-based restrictions.

You have three main components in Conditional Access:

  • Signals: These are the conditions that trigger a policy. Signals can include user sign-in, device state, location, and more.
  • Decisions: These are the actions that are taken when a policy is triggered. Decisions can include requiring multi-factor authentication, blocking access, or granting access with conditions.
  • Enforcement: This is the mechanism that enforces the policy. Enforcement can be done at the application level, the network level, or the device level.

Really, all the Conditional Access policies are based on the following flow:

  1. Assignments: Define who and where the policy applies to.
  2. Access Controls: Define what to do when the who and where conditions are met.

For that reason, we can define the following phases:

  • Phase 1: Collect session details
  • Assignments: Define who and where the policy applies to.

    • users: users and groups selected and excluded
    • Target resources: Control access based on Cloud apps, actions, and authentication context.
    • Cloud apps: Include and exclude. Many of the existing Microsoft cloud applications are included in the list of applications that can be targeted by Conditional Access policies: Office 365, Windows Azure Service Management API, Microsoft Admin Portals and Others
    • User actions: Tasks that a user performs: Register security information and Register or join devices
    • Global Secure Access: Targeting members in your tenant with Global Secure Access (GSA) as a resource enables administrators to define and control how traffic is routed through Microsoft Entra Internet Access and Microsoft Entra Private Access.
    • Authentication context: With this option you can define an authentication context for the policy; for example, you can define that the policy is applied only when the user is accessing Highly Confidential information. You can create labels with IDs c1 to c99 and tag your information with these labels. Not all apps support authentication contexts; you can check the official documentation to see which apps support them.
    • Network: Include and Exclude. Control user access based on their network location or physical location.
      • Any network or location: This option allows you to apply the policy to all network locations.
      • All trusted locations: This option allows you to apply the policy to all trusted locations.
      • All Compliant Network locations: This option allows you to apply the policy to all compliant network locations.
      • Selected network or location: This option allows you to apply the policy to selected network locations: Countries (GPS), Countries (IP), and IP ranges.
    • Conditions: Control access based on signals from conditions.
      • User risk: Control access based on the user risk level calculated by Microsoft Entra Identity Protection. User risk represents the probability that a given identity or account is compromised, for example: Published credentials in Dark Web.
      • Sign-in risk: Control access based on the sign-in risk level calculated by Microsoft Entra Identity Protection. Sign-in risk represents the probability that a given authentication request wasn't made by the identity owner. For example, sign-in from a new location or new device.
      • Insider risk: Control access for users who are assigned specific risk levels from Adaptive Protection, a Microsoft Purview Insider Risk Management feature. Insider risk represents the probability that a given user is engaged in risky data-related activities.
      • Device Platform: Include and Exclude. Control access based on the platform of the device used to sign in.
      • Client apps: Include and Exclude. Control access based on the client apps used to sign in.
      • Filters for devices: Include and Exclude. Control access based on configured filters to apply policy to specific devices
      • Authentication flows: Control access based on the authentication flow used to sign in:
      • Device code flow: Device code flow is a method of authentication that allows users to sign in to a device using a code displayed on the device. This flow is used for devices that don't have a browser or can't use a browser to sign in.
      • Authentication transfer: Authentication transfer is a new flow that offers a seamless way to transfer authenticated state from one device to another.
  • Phase 2: Enforcement

  • Access controls: Define what to do when the who and where conditions are met.
    • Grant: Control access enforcement to block or grant access.
    • Block: Block access to the resource.
    • Grant: Grant access to the resource. You can also define the following options:
      • Require multi-factor authentication: Require users to perform multi-factor authentication.
      • Require authentication strength: Require a combination of authentication methods to access the resource.
      • Require device to be marked as compliant: Require the device to be marked as compliant by Microsoft Intune.
      • Require Hybrid Azure AD joined device: Require the device to be Hybrid Azure AD joined.
      • Require approved client app: Require the use of an approved client app.
      • Require app protection policy: Require that an Intune app protection policy is present on the client app before access is available to the selected applications.
      • Require password change: Require the user to change their password.
    • For multiple controls, you can define the following options:
      • Require all the selected controls: Require all the selected controls to be met.
      • Require one of the selected controls: Require one of the selected controls to be met.
  • Session: Control access based on session controls to enable limited experiences within specific cloud applications.
    • Use app enforced restrictions: Enforce app restrictions to control access based on the app's own restrictions. When selected, the cloud app uses the device information to provide users with a limited or full experience. Limited when the device isn't managed or compliant and full when the device is managed and compliant.
    • Use Conditional Access App Control: Enforce real-time monitoring and control of user sessions. Conditional Access App Control enables user app access and sessions to be monitored and controlled in real time based on access and session policies. Access and session policies are used within the Defender for Cloud Apps portal to refine filters and set actions to take.
    • Sign-in frequency: Enforce sign-in frequency to control how often users are prompted to sign in. Sign-in frequency setting works with apps that implement OAUTH2 or OIDC protocols according to the standards.
    • Persistent browser session: Enforce persistent browser session to control whether users can remain signed in after closing and reopening their browser window.
    • Customize continuous access evaluation: Continuous Access Evaluation (CAE) allows access tokens to be revoked based on critical events and policy evaluation in real time, rather than relying on token expiration based on lifetime.

Example of Conditional Access policy configuration:

  1. Scenario: Block access for all users from all locations except a specific group of users from a specific location.
  2. Assignments:
    • users:
      • Include: All users
      • Exclude: Group_of_excluded_users
    • Target resources:
      • Cloud apps: All cloud apps
      • Network: All trusted locations
  3. Access controls:
    • Grant: Block
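
If you manage Conditional Access as code, the example above could be sketched with the azuread Terraform provider roughly like this; the excluded group object ID is a placeholder, and the policy is created disabled so it can be reviewed before enforcement:

```hcl
# Hedged sketch of the example policy above (azuread provider);
# the group object ID below is a placeholder
resource "azuread_conditional_access_policy" "block_all_except_group" {
  display_name = "Block all users except excluded group"
  state        = "disabled" # switch to "enabled" after review

  conditions {
    client_app_types = ["all"]

    users {
      included_users  = ["All"]
      excluded_groups = ["00000000-0000-0000-0000-000000000000"] # Group_of_excluded_users
    }

    applications {
      included_applications = ["All"]
    }

    locations {
      included_locations = ["AllTrusted"]
    }
  }

  grant_controls {
    operator          = "OR"
    built_in_controls = ["block"]
  }
}
```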

Mindmaps of the Conditional Access policies flow:

# Conditional Access Policy
## Phase 1: Collect session details
### Assignments
#### users
##### Include
###### None
###### All users
###### Select users and groups
##### Exclude
###### Guest or external users
###### Directory roles
###### Users and groups
#### Target resources
##### Cloud apps
###### Include
 - None
 - All cloud apps
 - Select apps
###### Exclude
  - Edit Filter
  - Select excluded cloud apps
##### User actions
  - Register security information
  - Register or join devices
##### Global Secure Access
  - Microsoft 365 traffic
  - Internet traffic
  - Private traffic
##### Authentication context
##### Network
###### Any network or location
###### All trusted locations
###### All Compliant Network locations
###### Selected network or location
#### Conditions
##### User risk
##### Sign-in risk
##### Insider risk
##### Device Platform
##### Client apps
##### Filters for devices
##### Authentication flows
###### Device code flow
###### Authentication transfer
## Phase 2: Enforcement
### Access controls
#### Grant
##### Block
##### Grant
###### Require multi-factor authentication
###### Require authentication strength
###### Require device to be marked as compliant
###### Require Hybrid Azure AD joined device
###### Require approved client app
###### Require app protection policy
###### Require password change
##### For multiple controls
###### Require all the selected controls
###### Require one of the selected controls
#### Session
##### Use app enforced restrictions
##### Use Conditional Access App Control
##### Sign-in frequency
##### Persistent browser session
##### Customize continuous access evaluation
mindmap
    root((Conditional Access Policy))
      (Phase 1: Collect session details)
        (Assignments)        
          [users]
            {{Include}}
              None
              All users
              Select users and groups
            {{Exclude}}
              Guest or external users
              Directory roles
              Users and groups
          [Target resources]
            {{Cloud apps}}
              Include
                None
                All cloud apps
                Select apps
              Exclude
                Edit Filter
                Select excluded cloud apps
            {{User actions}}
              Register security information
              Register or join devices
            {{Global Secure Access}}
              Microsoft 365 traffic
              Internet traffic
              Private traffic
            {{Authentication context}}
            {{Network}}
              Any network or location
              All trusted locations
              All Compliant Network locations
              Selected network or location
          [Conditions]
            {{User risk}}
            {{Sign-in risk}}
            {{Insider risk}}
            {{Device Platform}}
            {{Client apps}}
            {{Filters for devices}}
            {{Authentication flows}}
              Device code flow
              Authentication transfer
      (Phase 2: Enforcement)
        (Access controls)
          [Grant]
            {{Block}}
            {{Grant}}
              Require multi-factor authentication
              Require authentication strength
              Require device to be marked as compliant
              Require Hybrid Azure AD joined device
              Require approved client app
              Require app protection policy
              Require password change
            {{For multiple controls}}
              Require all the selected controls
              Require one of the selected controls
          [Session]
            {{Use app enforced restrictions}}
            {{Use Conditional Access App Control}}
            {{Sign-in frequency}}
            {{Persistent browser session}}
            {{Customize continuous access evaluation}}

Resources

Data threat modeling in Azure storage accounts

Info

I apologize in advance if this is a crazy idea or there is some mistake! I am just trying to learn and share knowledge.

Azure Storage Account is a service that provides scalable, secure, and reliable storage for data. It is used to store data such as blobs, files, tables, and queues. However, it is important to ensure that the data stored in Azure Storage Account is secure and protected from security threats. In this article, we will discuss how to perform data threat modeling in Azure storage accounts.

What is data threat modeling?

Data threat modeling is a process of identifying and analyzing potential threats to data security. It helps organizations understand the risks to their data and develop strategies to mitigate those risks. Data threat modeling involves the following steps:

  1. Identify assets: Identify the data assets stored in Azure storage accounts, such as blobs, files, tables, and queues.
  2. Identify threats: Identify potential threats to the data assets, such as unauthorized access, data breaches, data leakage, malware, phishing attacks, insider threats, and data loss.
  3. Assess risks: Assess the risks associated with each threat, such as the likelihood of the threat occurring and the impact of the threat on the data assets.
  4. Develop mitigation strategies: Develop strategies to mitigate the risks, such as implementing security controls, access controls, encryption, monitoring, and auditing.
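
To make the four steps concrete, here is a minimal Python sketch that represents them as plain data (the blog's other examples use PowerShell, but Python keeps this sketch short and self-contained). All asset, threat, and mitigation names are illustrative examples, not an official taxonomy:

```python
# Step 1: identify the data assets stored in the storage account.
assets = ["blobs", "files", "tables", "queues"]

# Step 2: identify potential threats per asset (example values only).
threats = {
    "blobs": ["unauthorized access", "data exfiltration"],
    "queues": ["data loss"],
}

# Step 3: assess risk as likelihood x impact, each rated 1 (low) to 3 (high).
def risk_score(likelihood: int, impact: int) -> int:
    return likelihood * impact

# Step 4: map each identified threat to a mitigation strategy.
mitigations = {
    "unauthorized access": "Azure RBAC + Conditional Access",
    "data exfiltration": "Private endpoints + Defender for Storage",
    "data loss": "Soft delete + Azure Backup",
}

for asset, asset_threats in threats.items():
    for threat in asset_threats:
        print(f"{asset}: {threat} -> {mitigations[threat]}")
```

The later sections of this article walk through each of these four steps in more detail.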

By performing data threat modeling, organizations can identify and address security vulnerabilities in Azure storage accounts and protect their data from security threats.

Identify assets in Azure storage accounts

Azure storage accounts can store various types of data assets, including:

  • Blobs: Binary large objects (blobs) are used to store unstructured data, such as images, videos, and documents.
  • Files: Azure file shares are used to store files accessible over SMB or NFS, such as configuration files, log files, and shared application data.
  • Tables: Tables are used to store structured data in a tabular format, such as customer information, product information, and transaction data.
  • Queues: Queues are used to store messages for communication between applications, such as task messages, notification messages, and status messages.
  • Disks: Disks are used to store virtual machine disks, such as operating system disks and data disks.

Identifying the data assets stored in Azure storage accounts is the first step in data threat modeling. It helps organizations understand the types of data stored in Azure storage accounts and the potential risks to those data assets.

Identify threats to data in Azure storage accounts

There are several threats to data stored in Azure storage accounts, including:

  • Unauthorized access: Unauthorized users gaining access to Azure storage accounts and stealing data.
  • Data breaches: Data breaches can expose sensitive data stored in Azure storage accounts.
  • Data leakage: Data leakage can occur due to misconfigured access controls or insecure data transfer protocols.
  • Data loss: Data loss can occur due to accidental deletion, corruption, or hardware failure.
  • Ransomware: Ransomware can encrypt data stored in Azure storage accounts and demand a ransom for decryption.
  • DDoS attacks: DDoS attacks can disrupt access to data stored in Azure storage accounts.
  • Phishing attacks: Phishing attacks can trick users into providing their login credentials, which can be used to access and steal data.
  • Malware: Malware can be used to steal data from Azure storage accounts and transfer it to external servers.
  • Insider threats: Employees or contractors with access to sensitive data may intentionally or unintentionally exfiltrate data.
  • Data exfiltration: Unauthorized transfer of data from Azure storage accounts to external servers.

For example, the flow of data exfiltration in Azure storage accounts can be summarized as follows:

sequenceDiagram
    participant User
    participant SA as Azure Storage Account
    participant Ext as External Server

    User->>SA: Upload data
    SA->>Ext: Unauthorized transfer of data
In this flow, the user uploads data to the Azure Storage Account, and the data is then transferred to an external server without authorization. This unauthorized transfer of data is known as data exfiltration.

Assess risks to data in Azure storage accounts

Assessing the risks associated with threats to data in Azure storage accounts is an important step in data threat modeling. Risks can be assessed based on the likelihood of the threat occurring and the impact of the threat on the data assets. Risks can be categorized as low, medium, or high based on the likelihood and impact of the threat.

For example, the risk of unauthorized access to Azure storage accounts may be categorized as high if both its likelihood and its impact on the data assets are high. Similarly, the risk of data leakage may be categorized as medium if both its likelihood and impact are medium.

By assessing risks to data in Azure storage accounts, organizations can prioritize security measures and develop strategies to mitigate the risks.
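
One simple way to make this concrete is a likelihood × impact matrix. The following Python sketch classifies a risk from two ratings; the numeric thresholds are arbitrary choices for illustration, not a standard:

```python
def categorize(likelihood: int, impact: int) -> str:
    """Classify a risk from likelihood and impact, each rated 1 (low) to 3 (high).

    The thresholds below are illustrative, not an industry standard.
    """
    score = likelihood * impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Unauthorized access: high likelihood, high impact -> high risk.
print(categorize(3, 3))  # high
# Data leakage: medium likelihood, medium impact -> medium risk.
print(categorize(2, 2))  # medium
```

Recording a score like this per threat makes it easy to sort the threat list and prioritize mitigations.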

For example, the risk of data exfiltration in Azure storage accounts can be assessed as follows:

pie
    title Data Exfiltration Risk Assessment
    "Unauthorized Access" : 30
    "Data Breaches" : 20
    "Data Leakage" : 15
    "Malware" : 10
    "Phishing Attacks" : 10
    "Insider Threats" : 15

Develop mitigation strategies for data in Azure storage accounts

Developing mitigation strategies is an essential step in data threat modeling. Mitigation strategies help organizations protect their data assets from security threats and reduce the risks associated with those threats. Mitigation strategies could include the following:

  1. Implement access controls: Implement access controls to restrict access to Azure storage accounts based on user roles and permissions.
  2. Encrypt data: Encrypt data stored in Azure storage accounts to protect it from unauthorized access.
  3. Monitor and audit access: Monitor and audit access to Azure storage accounts to detect unauthorized access and data exfiltration.
  4. Implement security controls: Implement security controls, such as firewalls, network security groups, and intrusion detection systems, to protect data in Azure storage accounts.
  5. Use secure transfer protocols: Use secure transfer protocols, such as HTTPS, to transfer data to and from Azure storage accounts.
  6. Implement multi-factor authentication: Implement multi-factor authentication to protect user accounts from unauthorized access.
  7. Train employees: Train employees on data security best practices to prevent data exfiltration and other security threats.
  8. Backup data: Backup data stored in Azure storage accounts to prevent data loss due to accidental deletion or corruption.
  9. Update software: Keep software and applications up to date to protect data stored in Azure storage accounts from security vulnerabilities.
  10. Implement data loss prevention (DLP) policies: Implement DLP policies to prevent data leakage and unauthorized transfer of data from Azure storage accounts.
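
To show how a checklist like this might be applied mechanically, here is a sketch that evaluates a hypothetical storage account configuration against a few of the controls above. The field names in the dictionary are invented for the example; they are not the actual Azure Resource Manager property names:

```python
# Hypothetical storage-account settings; field names are illustrative.
account = {
    "https_only": True,
    "min_tls_version": "TLS1_2",
    "public_network_access": "Disabled",
    "shared_key_enabled": False,
    "soft_delete_enabled": True,
}

# A few of the mitigation strategies above, expressed as automated checks.
checks = {
    "Use secure transfer protocols": lambda a: a["https_only"],
    "Enforce minimum TLS version": lambda a: a["min_tls_version"] == "TLS1_2",
    "Restrict public network access": lambda a: a["public_network_access"] == "Disabled",
    "Avoid Shared Key authorization": lambda a: not a["shared_key_enabled"],
    "Backup data (soft delete)": lambda a: a["soft_delete_enabled"],
}

findings = {name: check(account) for name, check in checks.items()}
failed = [name for name, ok in findings.items() if not ok]
print("non-compliant controls:", failed)  # prints: non-compliant controls: []
```

In practice you would feed the real account properties (from the Azure API, Terraform state, or an export) into the same kind of check table.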

As this is not an easy task, Microsoft provides tools to help. If you want to follow a security framework, you can use the MCSB (Microsoft cloud security benchmark), a set of guidelines and best practices for securing Azure services, including Azure storage accounts. The MCSB provides recommendations for securing Azure storage accounts, such as enabling encryption, implementing access controls, monitoring access, and auditing activities:

| Control Domain | ASB Control ID | ASB Control Title | Responsibility | Feature Name |
|---|---|---|---|---|
| Asset Management | AM-2 | Use only approved services | Customer | Azure Policy Support |
| Backup and recovery | BR-1 | Ensure regular automated backups | Customer | Azure Backup |
| Backup and recovery | BR-1 | Ensure regular automated backups | Customer | Service Native Backup Capability |
| Data Protection | DP-1 | Discover, classify, and label sensitive data | Customer | Sensitive Data Discovery and Classification |
| Data Protection | DP-2 | Monitor anomalies and threats targeting sensitive data | Customer | Data Leakage/Loss Prevention |
| Data Protection | DP-3 | Encrypt sensitive data in transit | Microsoft | Data in Transit Encryption |
| Data Protection | DP-4 | Enable data at rest encryption by default | Microsoft | Data at Rest Encryption Using Platform Keys |
| Data Protection | DP-5 | Use customer-managed key option in data at rest encryption when required | Customer | Data at Rest Encryption Using CMK |
| Data Protection | DP-6 | Use a secure key management process | Customer | Key Management in Azure Key Vault |
| Identity Management | IM-1 | Use centralized identity and authentication system | Microsoft | Azure AD Authentication Required for Data Plane Access |
| Identity Management | IM-1 | Use centralized identity and authentication system | Customer | Local Authentication Methods for Data Plane Access |
| Identity Management | IM-3 | Manage application identities securely and automatically | Customer | Managed Identities |
| Identity Management | IM-3 | Manage application identities securely and automatically | Customer | Service Principals |
| Identity Management | IM-7 | Restrict resource access based on conditions | Customer | Conditional Access for Data Plane |
| Identity Management | IM-8 | Restrict the exposure of credential and secrets | Customer | Service Credential and Secrets Support Integration and Storage in Azure Key Vault |
| Logging and threat detection | LT-1 | Enable threat detection capabilities | Customer | Microsoft Defender for Service / Product Offering |
| Logging and threat detection | LT-4 | Enable network logging for security investigation | Customer | Azure Resource Logs |
| Network Security | NS-2 | Secure cloud services with network controls | Customer | Disable Public Network Access |
| Network Security | NS-2 | Secure cloud services with network controls | Customer | Azure Private Link |
| Privileged Access | PA-7 | Follow just enough administration (least privilege) principle | Customer | Azure RBAC for Data Plane |
| Privileged Access | PA-8 | Choose approval process for third-party support | Customer | Customer Lockbox |

Part of the MCSB can be complemented with the Azure Well-Architected Framework, which provides guidance on best practices for designing and implementing secure, scalable, and reliable cloud solutions. The Well-Architected Framework includes security best practices for Azure storage accounts, such as implementing security controls, access controls, encryption, monitoring, and auditing:

  1. Enable Azure Defender for all your storage accounts: Azure Defender for Storage provides advanced threat protection for Azure storage accounts. It helps detect and respond to security threats in real-time.
  2. Turn on soft delete for blob data: Soft delete helps protect your blob data from accidental deletion. It allows you to recover deleted data within a specified retention period.
  3. Use Microsoft Entra ID to authorize access to blob data: Microsoft Entra ID provides fine-grained access control for Azure storage accounts. It allows you to define and enforce access policies based on user roles and permissions.
  4. Consider the principle of least privilege: When assigning permissions to a Microsoft Entra security principal through Azure RBAC, follow the principle of least privilege. Only grant the minimum permissions required to perform the necessary tasks.
  5. Use managed identities to access blob and queue data: Managed identities provide a secure way to access Azure storage accounts without storing credentials in your code.
  6. Use blob versioning or immutable blobs: Blob versioning and immutable blobs help protect your business-critical data from accidental deletion or modification.
  7. Restrict default internet access for storage accounts: Limit default internet access to Azure storage accounts to prevent unauthorized access.
  8. Enable firewall rules: Use firewall rules to restrict network access to Azure storage accounts. Only allow trusted IP addresses to access the storage account.
  9. Limit network access to specific networks: Limit network access to specific networks or IP ranges to prevent unauthorized access.
  10. Allow trusted Microsoft services to access the storage account: Allow only trusted Microsoft services to access the storage account to prevent unauthorized access.
  11. Enable the Secure transfer required option: Enable the Secure transfer required option on all your storage accounts to enforce secure connections.
  12. Limit shared access signature (SAS) tokens to HTTPS connections only: Limit shared access signature (SAS) tokens to HTTPS connections only to prevent unauthorized access.
  13. Avoid using Shared Key authorization: Avoid using Shared Key authorization to access storage accounts. Use Microsoft Entra ID or SAS tokens instead.
  14. Regenerate your account keys periodically: Regenerate your account keys periodically to prevent unauthorized access.
  15. Create a revocation plan for SAS tokens: Create a revocation plan and have it in place for any SAS tokens that you issue to clients. This will help you revoke access to the storage account if necessary.
  16. Use near-term expiration times on SAS tokens: Use near-term expiration times on any ad hoc SAS, service SAS, or account SAS to limit the exposure of the token.
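
As one concrete instance of item 16, here is a sketch of a near-term expiry check for SAS tokens. The one-hour limit is an arbitrary example policy, not an Azure default, and the function only inspects timestamps (it does not parse a real SAS token):

```python
from datetime import datetime, timedelta, timezone

# Example policy: SAS tokens must expire within one hour of issuance.
MAX_SAS_LIFETIME = timedelta(hours=1)

def sas_lifetime_ok(expiry, now):
    """Return True if a SAS token expiring at `expiry` is within policy."""
    return expiry - now <= MAX_SAS_LIFETIME

issued_at = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
print(sas_lifetime_ok(issued_at + timedelta(minutes=30), issued_at))  # True
print(sas_lifetime_ok(issued_at + timedelta(days=7), issued_at))      # False
```

A check like this can run in CI before tokens are issued, or as part of a periodic audit of issued tokens.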

Mixed strategies for data protection in Azure storage accounts

Diagram of the mixed strategies for data protection in Azure storage accounts:

graph LR
    A[Asset Management] -->B(AM-2)
    B --> C[Use only approved services]
    C --> D[Azure Policy] 
    E[Backup and recovery] -->F(BR-1)
    F --> G[Ensure regular automated backups]
    G --> H[Azure Backup]
    G --> I[Service Native Backup Capability]
    I --> I1["Azure Storage Account Configuration"]
    I1 --> I11["Turn on soft delete for blob data"]
    I1 --> I12["Use blob versioning or immutable blobs"]
graph LR
    J[Data Protection] -->K(DP-1)
    K --> L[Discover, classify, and label sensitive data]    
    L --> M[Sensitive Data Discovery and Classification]
    M --> M1["Microsoft Purview"]
    J --> N(DP-2)
    N --> O[Monitor anomalies and threats targeting sensitive data]
    O --> P[Data Leakage/Loss Prevention]
    P --> P1["Microsoft Defender for Storage"]
    J --> Q(DP-3)
    Q --> R[Encrypt sensitive data in transit]
    R --> S[Data in Transit Encryption]
    S --> S1["Azure Storage Account Configuration"]
    S1 --> S2["Enforce minimum TLS version"]
    J --> T(DP-4)
    T --> U[Enable data at rest encryption by default]
    U --> V[Data at Rest Encryption Using Platform Keys]
    V --> WW["Azure Storage Account Configuration"]    
    J --> W(DP-5)
    W --> X[Use customer-managed key option in data at rest encryption when required]
    X --> Y[Data at Rest Encryption Using CMK]
    Y --> WW["Azure Storage Account Configuration"]    
    J --> Z(DP-6)
    Z --> AA[Use a secure key management process]
    AA --> AB[Key Management in Azure Key Vault]
    AB --> AC["DEPRECATED"]
graph LR  
    AC[Identity Management] -->AD(IM-1)
    AD --> AE[Use centralized identity and authentication system]
    AE --> AE1["Microsoft Entra ID"]
    AE --> AF[Microsoft Entra ID Authentication Required for Data Plane Access]
    AF --> AF1["Azure Storage Account Configuration"]
    AF1 --> AF2["Disable Allow Shared Key authorization"]
    AD --> AG[Local Authentication Methods for Data Plane Access]
    AG --> AG1["Azure Storage Account Configuration"]
    AG1 --> AG2["Don't use SFTP if you don't need it"]
    AC --> AH(IM-3)
    AH --> AI[Manage application identities securely and automatically]
    AI --> AJ[Managed Identities]
    AI --> AK[Service Principals]
    AK --> AK1["Rotate or regenerate service principal credentials"]
    AC --> AL(IM-7)
    AL --> AM[Restrict resource access based on conditions]
    AM --> AN[Microsoft Entra Conditional Access for Data Plane]
    AC --> AO(IM-8)
    AO --> AP[Restrict the exposure of credential and secrets]    
    AP --> AQ[Service Credential and Secrets Support Integration and Storage in Azure Key Vault]    
    AQ --> AK1
    click AK1 "https://github.com/Azure-Samples/KeyVault-Rotation-StorageAccountKey-PowerShell" "Open this in a new tab" _blank

graph LR
AR[Logging and threat detection] -->AS(LT-1)
    AS --> AT[Enable threat detection capabilities]
    AT --> AU[Microsoft Defender for Service / Product Offering]
    AU --> AU1["Microsoft Defender for Storage"]
    AR --> AV(LT-4)
    AV --> AW[Enable network logging for security investigation]
    AW --> AX[Azure Resource Logs]
    AX --> AX1["Azure Monitor"]
    AX --> AX2["Azure Activity Log"]
    click AU1 "https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-introduction" "Open this in a new tab" _blank
graph LR
    AY[Network Security] -->AZ(NS-2)
    AZ --> BA[Secure cloud services with network controls]
    BA --> BB["Azure Storage Account Configuration"]
    BB --> BB1[Disable Public Network Access]
    BB --> BB2[Allow trusted Microsoft services to access the storage account]
    BA --> BC[Azure Private Link]
    BC --> BC1["Azure Private Endpoint"]
    BA --> BD[Azure Network]
    BD --> BD1["Azure Service Endpoint"]
    BA --> BE["Network Security Perimeter"]

graph LR
    BD[Privileged Access] -->BE(PA-7)
    BE --> BF["Follow just enough administration (least privilege) principle"]
    BF --> BG[Azure RBAC for Data Plane]
    BG --> BG1["Azure RBAC"]
    BG1 --> BG2["Azure RBAC Roles"]
    BD --> BH(PA-8)
    BH --> BI[Choose approval process for third-party support]
    BI --> BJ[Customer Lockbox]
click BG2 "https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles/storage" "Open this in a new tab" _blank

Example of mixed strategies for data protection in Azure storage accounts

The following example illustrates how to implement mixed strategies for data protection in Azure storage accounts:
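
As a sketch, one possible "mixed strategy" is simply a selection of one concrete feature per MCSB control domain from the tables and diagrams above. The choices below are examples to illustrate the shape of such a plan, not recommendations for any particular workload:

```python
# One illustrative feature choice per MCSB control domain (examples only).
strategy = {
    "Asset Management (AM-2)": "Azure Policy",
    "Backup and recovery (BR-1)": "Soft delete + blob versioning",
    "Data Protection (DP-3/DP-4)": "Enforce TLS 1.2 + platform-key encryption",
    "Identity Management (IM-1)": "Entra ID auth, Shared Key disabled",
    "Logging and threat detection (LT-1)": "Microsoft Defender for Storage",
    "Network Security (NS-2)": "Private endpoint, public access disabled",
    "Privileged Access (PA-7)": "Azure RBAC data-plane roles",
}

for domain, feature in strategy.items():
    print(f"{domain}: {feature}")
```

The value of writing the strategy down this way is that every control domain gets an explicit, reviewable decision rather than an implicit default.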

Conclusion

In conclusion, data threat modeling is an important process for identifying and addressing security vulnerabilities in Azure storage accounts. By identifying assets and threats, assessing risks, and developing mitigation strategies, organizations can protect their data and ensure its security and integrity. Following best practices and implementing the security measures described above helps prevent and detect data threats in Azure storage accounts.

References