
2024/06

When to use service principals

In Azure, there are two ways to authenticate your applications and services: service principals and managed identities.

In short: don't use service principals if you can use managed identities. Managed identities are the more secure way to authenticate your applications and services in Azure because the platform manages the credentials for you, so there is nothing to store, rotate, or leak. However, there are some scenarios where you need to use service principals instead. The following flow shows when each option applies:

flowchart TD
    A[When to use service principals] --> B{Is your application or service running on-premises?}
    B --> |Yes| C{Is Azure Arc supported?}
    C --> |Yes| D[Use Managed Identity]
    C --> |No| E[Use Service Principal]
    B --> |No| F{Do the resources or applications support managed identities?}
    F --> |No| G[Use Service Principal]
    F --> |Yes| H[Use Managed Identity]
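The decision flow above can be expressed as a small helper function. A minimal sketch in Python (the parameter names are mine, purely illustrative):

```python
def choose_identity(on_premises: bool, arc_supported: bool,
                    supports_managed_identity: bool) -> str:
    """Mirror the decision flow above (illustrative sketch, not an official API)."""
    if on_premises:
        # On-premises workloads can still use managed identities via Azure Arc.
        return "Managed Identity" if arc_supported else "Service Principal"
    # In Azure, prefer managed identities whenever the resource supports them.
    return "Managed Identity" if supports_managed_identity else "Service Principal"
```

For example, an on-premises service without Azure Arc support lands on `"Service Principal"`.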

Conclusion

In conclusion, you should use managed identities whenever possible. However, if your application or service is running on-premises or if the resources or applications do not support managed identities, you should use service principals instead.

Microsoft Entra Conditional Access

In this post, I will show you how to configure Conditional Access in Microsoft Entra.

What is Conditional Access?

Conditional Access is a feature of Microsoft Entra that allows you to control access to your organization's resources based on specific conditions. With Conditional Access, you can enforce policies that require users to meet certain criteria before they can access resources, such as multi-factor authentication, device compliance, or location-based restrictions.

You have three main components in Conditional Access:

  • Signals: These are the conditions that trigger a policy. Signals can include user sign-in, device state, location, and more.
  • Decisions: These are the actions that are taken when a policy is triggered. Decisions can include requiring multi-factor authentication, blocking access, or granting access with conditions.
  • Enforcement: This is the mechanism that enforces the policy. Enforcement can be done at the application level, the network level, or the device level.
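As a mental model, the three components chain together: signals feed a decision, and enforcement applies it. A toy sketch in Python (the signal names and thresholds are illustrative assumptions, not part of Microsoft Entra):

```python
def evaluate_policy(signals: dict) -> str:
    """Toy Conditional Access-style evaluation: signals in, decision out.

    The enforcement layer would then apply the returned decision.
    Signal names and rules here are invented for illustration only.
    """
    if signals.get("sign_in_risk") == "high":
        return "block"
    if not signals.get("device_compliant", False):
        return "require_mfa"
    return "grant"
```

A compliant device with low sign-in risk is granted access; an unmanaged device is challenged for MFA first.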

Really, all the Conditional Access policies are based on the following flow:

  1. Assignments: Define who and where the policy applies to.
  2. Access Controls: Define what to do when the who and where are met.

For that reason, we can define the following phases:

  • Phase 1: Collect session details
  • Assignments: Define who and where the policy applies to.

    • Users: Users and groups selected and excluded.
    • Target resources: Control access based on cloud apps, user actions, and authentication context.
    • Cloud apps: Include and exclude. Many existing Microsoft cloud applications can be targeted by Conditional Access policies, including Office 365, Windows Azure Service Management API, Microsoft Admin Portals, and others.
    • User actions: Tasks that a user performs: Register security information and Register or join devices
    • Global Secure Access: Targeting members in your tenant with Global Secure Access (GSA) as a resource enables administrators to define and control how traffic is routed through Microsoft Entra Internet Access and Microsoft Entra Private Access.
    • Authentication context: With this option you can define an authentication context for the policy; for example, you can apply the policy only when the user is accessing Highly Confidential information. You can create labels with IDs c1 to c99 and tag your information with these labels. Not all apps support authentication contexts; check the official documentation to see which apps support it.
    • Network: Include and Exclude. Control user access based on their network location or physical location.
      • Any network or location: This option allows you to apply the policy to all network locations.
      • All trusted locations: This option allows you to apply the policy to all trusted locations.
      • All Compliant Network locations: This option allows you to apply the policy to all compliant network locations.
      • Selected network or location: This option allows you to apply the policy to selected network locations: Countries(GPS),Countries(IP) and IP ranges.
    • Conditions: Control access based on signals from conditions.
      • User risk: Control access based on the user risk level calculated by Microsoft Entra Identity Protection. User risk represents the probability that a given identity or account is compromised, for example, credentials published on the dark web.
      • Sign-in risk: Control access based on the sign-in risk level calculated by Microsoft Entra Identity Protection. Sign-in risk represents the probability that a given authentication request wasn't made by the identity owner. For example, sign-in from a new location or new device.
      • Insider risk: Control access for users who are assigned specific risk levels from Adaptive Protection, a Microsoft Purview Insider Risk Management feature. Insider risk represents the probability that a given user is engaged in risky data-related activities.
      • Device Platform: Include and Exclude. Control access based on the platform of the device used to sign in.
      • Client apps: Include and Exclude. Control access based on the client apps used to sign in.
      • Filters for devices: Include and Exclude. Control access based on configured filters to apply policy to specific devices
      • Authentication flows: Control access based on the authentication flow used to sign in:
        • Device code flow: A method of authentication that allows users to sign in to a device using a code displayed on the device. This flow is used for devices that don't have a browser or can't use a browser to sign in.
        • Authentication transfer: A new flow that offers a seamless way to transfer authenticated state from one device to another.
  • Phase 2: Enforcement

  • Access controls: Define what to do when the who and where are met.
    • Grant: Control access enforcement to block or grant access.
    • Block: Block access to the resource.
    • Grant: Grant access to the resource. You can also define the following options:
      • Require multi-factor authentication: Require users to perform multi-factor authentication.
      • Require authentication strength: Require a combination of authentication methods to access the resource.
      • Require device to be marked as compliant: Require the device to be marked as compliant in Microsoft Intune.
      • Require Hybrid Azure AD joined device: Require the device to be Hybrid Azure AD joined.
      • Require approved client app: Require the use of an approved client app.
      • Require app protection policy: Require that an Intune app protection policy is present on the client app before access is available to the selected applications
      • Require password change: Require the user to change their password.
    • For multiple controls, you can define the following options:
      • Require all the selected controls: Require all the selected controls to be met.
      • Require one of the selected controls: Require one of the selected controls to be met.
  • Session: Control access based on session controls to enable limited experiences within specific cloud applications.
    • Use app enforced restrictions: Enforce app restrictions to control access based on the app's own restrictions. When selected, the cloud app uses the device information to provide users with a limited or full experience. Limited when the device isn't managed or compliant and full when the device is managed and compliant.
    • Use Conditional Access App Control: Enforce real-time monitoring and control of user sessions. Conditional Access App Control enables user app access and sessions to be monitored and controlled in real time based on access and session policies. Access and session policies are used within the Defender for Cloud Apps portal to refine filters and set actions to take.
    • Sign-in frequency: Enforce sign-in frequency to control how often users are prompted to sign in. The sign-in frequency setting works with apps that implement the OAuth 2.0 or OIDC protocols according to the standards.
    • Persistent browser session: Enforce a persistent browser session to control whether users can remain signed in after closing and reopening their browser window.
    • Customize continuous access evaluation: Continuous Access Evaluation (CAE) allows access tokens to be revoked based on critical events and policy evaluation in real time, rather than relying on token expiration based on lifetime.

Example of Conditional Access policy configuration:

  1. Block access for all users from all trusted locations, except for a specific group of excluded users:
  2. Assignments:
    • Users:
      • Include: All users
      • Exclude: Group_of_excluded_users
    • Target resources:
      • Cloud apps: All cloud apps
    • Network: All trusted locations
  3. Access controls:
    • Grant: Block
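A policy like this can also be sketched as a payload for the Microsoft Graph Conditional Access API. The shape below loosely follows the `conditionalAccessPolicy` resource, but treat the exact field names and values as a simplified, unverified sketch:

```python
# Illustrative sketch of the example policy as a Graph-style payload.
# Field names approximate the conditionalAccessPolicy resource; verify
# against the Microsoft Graph documentation before using them.
policy = {
    "displayName": "Block all users on trusted locations except excluded group",
    "state": "enabled",
    "conditions": {
        "users": {
            "includeUsers": ["All"],
            "excludeGroups": ["Group_of_excluded_users"],
        },
        "applications": {"includeApplications": ["All"]},
        "locations": {"includeLocations": ["AllTrusted"]},
    },
    "grantControls": {"operator": "OR", "builtInControls": ["block"]},
}
```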

Mindmaps of the Conditional Access policies flow:

# Conditional Access Policy
## Phase 1: Collect session details
### Assignments
#### users
##### Include
###### None
###### All users
###### Select users and groups
##### Exclude
###### Guest or external users
###### Directory roles
###### Users and groups
#### Target resources
##### Cloud apps
###### Include
 - None
 - All cloud apps
 - Select apps
###### Exclude
  - Edit Filter
  - Select excluded cloud apps
##### User actions
  - Register security information
  - Register or join devices
##### Global Secure Access
  - Microsoft 365 traffic
  - Internet traffic
  - Private traffic
##### Authentication context
##### Network
###### Any network or location
###### All trusted locations
###### All Compliant Network locations
###### Selected network or location
#### Conditions
##### User risk
##### Sign-in risk
##### Insider risk
##### Device Platform
##### Client apps
##### Filters for devices
##### Authentication flows
###### Device code flow
###### Authentication transfer
## Phase 2: Enforcement
### Access controls
#### Grant
##### Block
##### Grant
###### Require multi-factor authentication
###### Require authentication strength
###### Require device to be marked as compliant
###### Require Hybrid Azure AD joined device
###### Require approved client app
###### Require app protection policy
###### Require password change
##### For multiple controls
###### Require all the selected controls
###### Require one of the selected controls
#### Session
##### Use app enforced restrictions
##### Use Conditional Access App Control
##### Sign-in frequency
##### Persistent browser session
##### Customize continuous access evaluation
mindmap
    root((Conditional Access Policy))
      (Phase 1: Collect session details)
        (Assignments)        
          [users]
            {{Include}}
              None
              All users
              Select users and groups
            {{Exclude}}
              Guest or external users
              Directory roles
              Users and groups
          [Target resources]
            {{Cloud apps}}
              Include
                None
                All cloud apps
                Select apps
              Exclude
                Edit Filter
                Select excluded cloud apps
            {{User actions}}
              Register security information
              Register or join devices
            {{Global Secure Access}}
              Microsoft 365 traffic
              Internet traffic
              Private traffic
            {{Authentication context}}
            {{Network}}
              Any network or location
              All trusted locations
              All Compliant Network locations
              Selected network or location
          [Conditions]
            {{User risk}}
            {{Sign-in risk}}
            {{Insider risk}}
            {{Device Platform}}
            {{Client apps}}
            {{Filters for devices}}
            {{Authentication flows}}
              Device code flow
              Authentication transfer
      (Phase 2: Enforcement)
        (Access controls)
          [Grant]
            {{Block}}
            {{Grant}}
              Require multi-factor authentication
              Require authentication strength
              Require device to be marked as compliant
              Require Hybrid Azure AD joined device
              Require approved client app
              Require app protection policy
              Require password change
            {{For multiple controls}}
              Require all the selected controls
              Require one of the selected controls
          [Session]
            {{Use app enforced restrictions}}
            {{Use Conditional Access App Control}}
            {{Sign-in frequency}}
            {{Persistent browser session}}
            {{Customize continuous access evaluation}}


Data threat modeling in Azure storage accounts

Info

I apologize in advance if this is a crazy idea or there are any mistakes! I am just trying to learn and share knowledge.

Azure Storage Account is a service that provides scalable, secure, and reliable storage for data. It is used to store data such as blobs, files, tables, and queues. However, it is important to ensure that the data stored in Azure Storage Account is secure and protected from security threats. In this article, we will discuss how to perform data threat modeling in Azure storage accounts.

What is data threat modeling?

Data threat modeling is a process of identifying and analyzing potential threats to data security. It helps organizations understand the risks to their data and develop strategies to mitigate those risks. Data threat modeling involves the following steps:

  1. Identify assets: Identify the data assets stored in Azure storage accounts, such as blobs, files, tables, and queues.
  2. Identify threats: Identify potential threats to the data assets, such as unauthorized access, data breaches, data leakage, malware, phishing attacks, insider threats, and data loss.
  3. Assess risks: Assess the risks associated with each threat, such as the likelihood of the threat occurring and the impact of the threat on the data assets.
  4. Develop mitigation strategies: Develop strategies to mitigate the risks, such as implementing security controls, access controls, encryption, monitoring, and auditing.
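The four steps above can be captured in a simple threat register. A minimal sketch in Python (the field names and sample entries are my own, not from any Azure SDK):

```python
from dataclasses import dataclass, field

@dataclass
class Threat:
    asset: str            # e.g. "blobs", "queues"
    description: str      # e.g. "data exfiltration to an external server"
    likelihood: str       # "low" | "medium" | "high"
    impact: str           # "low" | "medium" | "high"
    mitigations: list[str] = field(default_factory=list)

# Steps 1-4 in miniature: identify the asset and threat, assess it, plan mitigations.
register = [
    Threat("blobs", "unauthorized access", "high", "high",
           ["Azure RBAC", "disable Shared Key authorization"]),
    Threat("queues", "data loss", "low", "medium",
           ["soft delete", "backups"]),
]
```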

By performing data threat modeling, organizations can identify and address security vulnerabilities in Azure storage accounts and protect their data from security threats.

Identify assets in Azure storage accounts

Azure storage accounts can store various types of data assets, including:

  • Blobs: Binary large objects (blobs) are used to store unstructured data, such as images, videos, and documents.
  • Files: Azure Files provides managed file shares, used to store files such as text files, configuration files, and log files.
  • Tables: Tables are used to store structured data in a tabular format, such as customer information, product information, and transaction data.
  • Queues: Queues are used to store messages for communication between applications, such as task messages, notification messages, and status messages.
  • Disks: Disks are used to store virtual machine disks, such as operating system disks and data disks.

Identifying the data assets stored in Azure storage accounts is the first step in data threat modeling. It helps organizations understand the types of data stored in Azure storage accounts and the potential risks to those data assets.

Identify threats to data in Azure storage accounts

There are several threats to data stored in Azure storage accounts, including:

  • Unauthorized access: Unauthorized users gaining access to Azure storage accounts and stealing data.
  • Data breaches: Data breaches can expose sensitive data stored in Azure storage accounts.
  • Data leakage: Data leakage can occur due to misconfigured access controls or insecure data transfer protocols.
  • Data loss: Data loss can occur due to accidental deletion, corruption, or hardware failure.
  • Ransomware: Ransomware can encrypt data stored in Azure storage accounts and demand a ransom for decryption.
  • DDoS attacks: DDoS attacks can disrupt access to data stored in Azure storage accounts.
  • Phishing attacks: Phishing attacks can trick users into providing their login credentials, which can be used to access and steal data.
  • Malware: Malware can be used to steal data from Azure storage accounts and transfer it to external servers.
  • Insider threats: Employees or contractors with access to sensitive data may intentionally or unintentionally exfiltrate data.
  • Data exfiltration: Unauthorized transfer of data from Azure storage accounts to external servers.

For example, the flow of data exfiltration in Azure storage accounts can be summarized as follows:

sequenceDiagram
    participant User
    participant Azure Storage Account
    participant External Server

    User->>Azure Storage Account: Upload data
    Azure Storage Account->>External Server: Unauthorized transfer of data

In this flow, the user uploads data to the Azure Storage Account, and the data is then transferred to an external server without authorization. This unauthorized transfer of data is known as data exfiltration.

Assess risks to data in Azure storage accounts

Assessing the risks associated with threats to data in Azure storage accounts is an important step in data threat modeling. Risks can be assessed based on the likelihood of the threat occurring and the impact of the threat on the data assets. Risks can be categorized as low, medium, or high based on the likelihood and impact of the threat.

For example, the risk of unauthorized access to Azure storage accounts may be categorized as high if the likelihood of unauthorized access is high and the impact of unauthorized access on the data assets is high. Similarly, the risk of data leakage may be categorized as medium if the likelihood of data leakage is medium and the impact of data leakage on the data assets is medium.

By assessing risks to data in Azure storage accounts, organizations can prioritize security measures and develop strategies to mitigate the risks.
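The likelihood-and-impact categorization described above can be sketched as a small risk matrix. The scoring thresholds here are illustrative assumptions, not an official methodology:

```python
LEVELS = {"low": 1, "medium": 2, "high": 3}

def assess_risk(likelihood: str, impact: str) -> str:
    """Categorize risk as low/medium/high from a likelihood x impact score.

    The multiplication and cut-offs are example choices for illustration.
    """
    score = LEVELS[likelihood] * LEVELS[impact]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

This reproduces the examples in the text: high likelihood with high impact scores as high risk, medium with medium as medium risk.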

For example, the risk of data exfiltration in Azure storage accounts can be assessed as follows:

pie
    title Data Exfiltration Risk Assessment
    "Unauthorized Access" : 30
    "Data Breaches" : 20
    "Data Leakage" : 15
    "Malware" : 10
    "Phishing Attacks" : 10
    "Insider Threats" : 15

Develop mitigation strategies for data in Azure storage accounts

Developing mitigation strategies is an essential step in data threat modeling. Mitigation strategies help organizations protect their data assets from security threats and reduce the risks associated with those threats. Mitigation strategies could include the following:

  1. Implement access controls: Implement access controls to restrict access to Azure storage accounts based on user roles and permissions.
  2. Encrypt data: Encrypt data stored in Azure storage accounts to protect it from unauthorized access.
  3. Monitor and audit access: Monitor and audit access to Azure storage accounts to detect unauthorized access and data exfiltration.
  4. Implement security controls: Implement security controls, such as firewalls, network security groups, and intrusion detection systems, to protect data in Azure storage accounts.
  5. Use secure transfer protocols: Use secure transfer protocols, such as HTTPS, to transfer data to and from Azure storage accounts.
  6. Implement multi-factor authentication: Implement multi-factor authentication to protect user accounts from unauthorized access.
  7. Train employees: Train employees on data security best practices to prevent data exfiltration and other security threats.
  8. Backup data: Backup data stored in Azure storage accounts to prevent data loss due to accidental deletion or corruption.
  9. Update software: Keep software and applications up to date to protect data stored in Azure storage accounts from security vulnerabilities.
  10. Implement data loss prevention (DLP) policies: Implement DLP policies to prevent data leakage and unauthorized transfer of data from Azure storage accounts.

As this is not an easy task, Microsoft provides tools to help. If you want to follow a security framework, you can use the Microsoft cloud security benchmark (MCSB), a set of guidelines and best practices for securing Azure services, including Azure storage accounts. The MCSB provides recommendations for securing storage accounts, such as enabling encryption, implementing access controls, monitoring access, and auditing activities:

| Control Domain | ASB Control ID | ASB Control Title | Responsibility | Feature Name |
|---|---|---|---|---|
| Asset Management | AM-2 | Use only approved services | Customer | Azure Policy Support |
| Backup and recovery | BR-1 | Ensure regular automated backups | Customer | Azure Backup |
| Backup and recovery | BR-1 | Ensure regular automated backups | Customer | Service Native Backup Capability |
| Data Protection | DP-1 | Discover, classify, and label sensitive data | Customer | Sensitive Data Discovery and Classification |
| Data Protection | DP-2 | Monitor anomalies and threats targeting sensitive data | Customer | Data Leakage/Loss Prevention |
| Data Protection | DP-3 | Encrypt sensitive data in transit | Microsoft | Data in Transit Encryption |
| Data Protection | DP-4 | Enable data at rest encryption by default | Microsoft | Data at Rest Encryption Using Platform Keys |
| Data Protection | DP-5 | Use customer-managed key option in data at rest encryption when required | Customer | Data at Rest Encryption Using CMK |
| Data Protection | DP-6 | Use a secure key management process | Customer | Key Management in Azure Key Vault |
| Identity Management | IM-1 | Use centralized identity and authentication system | Microsoft | Azure AD Authentication Required for Data Plane Access |
| Identity Management | IM-1 | Use centralized identity and authentication system | Customer | Local Authentication Methods for Data Plane Access |
| Identity Management | IM-3 | Manage application identities securely and automatically | Customer | Managed Identities |
| Identity Management | IM-3 | Manage application identities securely and automatically | Customer | Service Principals |
| Identity Management | IM-7 | Restrict resource access based on conditions | Customer | Conditional Access for Data Plane |
| Identity Management | IM-8 | Restrict the exposure of credential and secrets | Customer | Service Credential and Secrets Support Integration and Storage in Azure Key Vault |
| Logging and threat detection | LT-1 | Enable threat detection capabilities | Customer | Microsoft Defender for Service / Product Offering |
| Logging and threat detection | LT-4 | Enable network logging for security investigation | Customer | Azure Resource Logs |
| Network Security | NS-2 | Secure cloud services with network controls | Customer | Disable Public Network Access |
| Network Security | NS-2 | Secure cloud services with network controls | Customer | Azure Private Link |
| Privileged Access | PA-7 | Follow just enough administration (least privilege) principle | Customer | Azure RBAC for Data Plane |
| Privileged Access | PA-8 | Choose approval process for third-party support | Customer | Customer Lockbox |

And part of the MCSB can be complemented with the Azure Well-Architected Framework, which provides guidance on best practices for designing and implementing secure, scalable, and reliable cloud solutions. The Well-Architected Framework includes security best practices for Azure storage accounts, such as implementing security controls, access controls, encryption, monitoring, and auditing:

  1. Enable Azure Defender for all your storage accounts: Azure Defender for Storage provides advanced threat protection for Azure storage accounts. It helps detect and respond to security threats in real-time.
  2. Turn on soft delete for blob data: Soft delete helps protect your blob data from accidental deletion. It allows you to recover deleted data within a specified retention period.
  3. Use Microsoft Entra ID to authorize access to blob data: Microsoft Entra ID provides fine-grained access control for Azure storage accounts. It allows you to define and enforce access policies based on user roles and permissions.
  4. Consider the principle of least privilege: When assigning permissions to a Microsoft Entra security principal through Azure RBAC, follow the principle of least privilege. Only grant the minimum permissions required to perform the necessary tasks.
  5. Use managed identities to access blob and queue data: Managed identities provide a secure way to access Azure storage accounts without storing credentials in your code.
  6. Use blob versioning or immutable blobs: Blob versioning and immutable blobs help protect your business-critical data from accidental deletion or modification.
  7. Restrict default internet access for storage accounts: Limit default internet access to Azure storage accounts to prevent unauthorized access.
  8. Enable firewall rules: Use firewall rules to restrict network access to Azure storage accounts. Only allow trusted IP addresses to access the storage account.
  9. Limit network access to specific networks: Limit network access to specific networks or IP ranges to prevent unauthorized access.
  10. Allow trusted Microsoft services to access the storage account: Allow only trusted Microsoft services to access the storage account to prevent unauthorized access.
  11. Enable the Secure transfer required option: Enable the Secure transfer required option on all your storage accounts to enforce secure connections.
  12. Limit shared access signature (SAS) tokens to HTTPS connections only: Limit shared access signature (SAS) tokens to HTTPS connections only to prevent unauthorized access.
  13. Avoid using Shared Key authorization: Avoid using Shared Key authorization to access storage accounts. Use Azure AD or SAS tokens instead.
  14. Regenerate your account keys periodically: Regenerate your account keys periodically to prevent unauthorized access.
  15. Create a revocation plan for SAS tokens: Create a revocation plan and have it in place for any SAS tokens that you issue to clients. This will help you revoke access to the storage account if necessary.
  16. Use near-term expiration times on SAS tokens: Use near-term expiration times on an ad hoc SAS, service SAS, or account SAS to limit the exposure of the token.

Mixed strategies for data protection in Azure storage accounts

Diagram of the mixed strategies for data protection in Azure storage accounts:

graph LR
    A[Asset Management] -->B(AM-2)
    B --> C[Use only approved services]
    C --> D[Azure Policy] 
    E[Backup and recovery] -->F(BR-1)
    F --> G[Ensure regular automated backups]
    G --> H[Azure Backup]
    G --> I[Service Native Backup Capability]
    I --> I1["Azure Storage Account Configuration"]
    I1 --> I11["Turn on soft delete for blob data"]
    I1 --> I12["Use blob versioning or immutable blobs"]
graph LR
    J[Data Protection] -->K(DP-1)
    K --> L[Discover, classify, and label sensitive data]    
    L --> M[Sensitive Data Discovery and Classification]
    M --> M1["Microsoft Purview"]       
    J --> N(DP-2)
    N --> O[Monitor anomalies and threats targeting sensitive data]
    O --> P[Data Leakage/Loss Prevention]
    P --> P1["Microsoft Defender for Storage"]
    J --> Q(DP-3)
    Q --> R[Encrypt sensitive data in transit]
    R --> S[Data in Transit Encryption]
    S --> S1["Azure Storage Account Configuration"]
    S1 --> S2["Enforce minimum TLS version"]
    J --> T(DP-4)
    T --> U[Enable data at rest encryption by default]
    U --> V[Data at Rest Encryption Using Platform Keys]
    V --> WW["Azure Storage Account Configuration"]    
    J --> W(DP-5)
    W --> X[Use customer-managed key option in data at rest encryption when required]
    X --> Y[Data at Rest Encryption Using CMK]
    Y --> WW["Azure Storage Account Configuration"]    
    J --> Z(DP-6)
    Z --> AA[Use a secure key management process]
    AA --> AB[Key Management in Azure Key Vault]
    AB --> AC["DEPRECATED"]
graph LR  
    AC[Identity Management] -->AD(IM-1)
    AD --> AE[Use centralized identity and authentication system]
    AE --> AE1["Microsoft Entra ID"]
    AE --> AF[Microsoft Entra ID Authentication Required for Data Plane Access]
    AF --> AF1["Azure Storage Account Configuration"]
    AF1 --> AF2["Disable Allow Shared Key authorization"]
    AD --> AG[Local Authentication Methods for Data Plane Access]
    AG --> AG1["Azure Storage Account Configuration"]
    AG1 --> AG2["Don't use SFTP if you don't need it"]
    AC --> AH(IM-3)
    AH --> AI[Manage application identities securely and automatically]
    AI --> AJ[Managed Identities]
    AI --> AK[Service Principals]
    AK --> AK1["Rotate or regenerate service principal credentials"]
    AC --> AL(IM-7)
    AL --> AM[Restrict resource access based on conditions]
    AM --> AN[Microsoft Entra Conditional Access for Data Plane]
    AC --> AO(IM-8)
    AO --> AP[Restrict the exposure of credential and secrets]    
    AP --> AQ[Service Credential and Secrets Support Integration and Storage in Azure Key Vault]    
    AQ --> AK1
    click AK1 "https://github.com/Azure-Samples/KeyVault-Rotation-StorageAccountKey-PowerShell" "Open this in a new tab" _blank

graph LR
AR[Logging and threat detection] -->AS(LT-1)
    AS --> AT[Enable threat detection capabilities]
    AT --> AU[Microsoft Defender for Service / Product Offering]
    AU --> AU1["Microsoft Defender for Storage"]
    AR --> AV(LT-4)
    AV --> AW[Enable network logging for security investigation]
    AW --> AX[Azure Resource Logs]
    AX --> AX1["Azure Monitor"]
    AX --> AX2["Azure Activity Log"]
    click AU1 "https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-storage-introduction" "Open this in a new tab" _blank
graph LR
    AY[Network Security] -->AZ(NS-2)
    AZ --> BA[Secure cloud services with network controls]
    BA --> BB["Azure Storage Account Configuration"]
    BB --> BB1[Disable Public Network Access]
    BB --> BB2[Allow trusted Microsoft services to access the storage account]
    BA --> BC[Azure Private Link]
    BC --> BC1["Azure Private Endpoint"]
    BA --> BD[Azure Network]
    BD --> BD1["Azure Service Endpoint"]
    BA --> BE["Network Security Perimeter"]

graph LR
    BD[Privileged Access] -->BE(PA-7)
    BE --> BF["Follow just enough administration(least privilege) principle"]
    BF --> BG[Azure RBAC for Data Plane]
    BG --> BG1["Azure RBAC"]
    BG1 --> BG2["Azure RBAC Roles"]
    BD --> BH(PA-8)
    BH --> BI[Choose approval process for third-party support]
    BI --> BJ[Customer Lockbox]
click BG2 "https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles/storage" "Open this in a new tab" _blank

Example of mixed strategies for data protection in Azure storage accounts

The following example illustrates how to implement mixed strategies for data protection in Azure storage accounts:
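A hedged sketch: auditing a storage account's properties (here a plain dict) against a few of the controls discussed above. The property names loosely follow the ARM storage account resource model as I understand it; treat them as assumptions and verify against the official reference:

```python
def audit_storage_account(props: dict) -> list[str]:
    """Return a list of findings for controls the account does not meet.

    Property names approximate the ARM storage account resource; confirm
    them against the Azure documentation before relying on this.
    """
    findings = []
    if not props.get("supportsHttpsTrafficOnly", False):
        findings.append("Enable 'Secure transfer required' (HTTPS only)")
    if props.get("minimumTlsVersion") != "TLS1_2":
        findings.append("Enforce a minimum TLS version of 1.2")
    if props.get("allowSharedKeyAccess", True):
        findings.append("Disable Shared Key authorization; prefer Microsoft Entra ID")
    if props.get("publicNetworkAccess") != "Disabled":
        findings.append("Disable public network access or restrict it with firewall rules")
    return findings
```

An account with secure transfer enforced, TLS 1.2, Shared Key disabled, and public network access disabled returns no findings; an unconfigured account surfaces all four.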

Conclusion

In conclusion, data threat modeling is an important process for identifying and addressing security vulnerabilities in Azure storage accounts. By identifying assets, threats, risks, and developing mitigation strategies, organizations can protect their data from security threats and ensure the security and integrity of their data assets. By following best practices and implementing security measures, organizations can prevent and detect data threats in Azure storage accounts and protect their data from security threats.


Tagging best practices in Azure

In this post, I will show you some best practices for tagging resources in Azure.

What are tags?

Tags are key-value pairs that you can assign to your Azure resources to organize and manage them more effectively. Tags allow you to categorize resources in different ways, such as by environment, owner, or cost center, and to apply policies and automation based on these categories.

If you don't know anything about tags, you can read the official documentation to learn more about them.

Why use tags?

There are several reasons to use tags:

  • Organization: Tags allow you to organize your resources in a way that makes sense for your organization. You can use tags to group resources by environment, project, or department, making it easier to manage and monitor them.

  • Cost management: Tags allow you to track and manage costs more effectively. You can use tags to identify resources that are part of a specific project or department, and to allocate costs accordingly.

  • Automation: Tags allow you to automate tasks based on resource categories. You can use tags to apply policies, trigger alerts, or enforce naming conventions, making it easier to manage your resources at scale.

Best practices for tagging resources in Azure

Here are some best practices for tagging resources in Azure:

  • Use consistent naming conventions: Define a set of standard tags that you will use across all your resources. This will make it easier to search for and manage resources, and to apply policies and automation consistently.

  • Apply tags at resource creation: Apply tags to resources when you create them, rather than adding them later. This will ensure that all resources are tagged correctly from the start, and will help you avoid missing or incorrect tags.

  • Use tags to track costs: Use tags to track costs by project, department, or environment. This will help you allocate costs more effectively, and will make it easier to identify resources that are not being used or are costing more than expected.

  • Define tags by hierarchy: Define tags in a hierarchy that makes sense for your organization, from more general tags at the subscription level to more specific tags at the resource group level.

  • Use inherited tags: Use inherited tags to apply tags to resources automatically based on their parent resources. This will help you ensure that all resources are tagged consistently, and will reduce the risk of missing or incorrect tags. Azure Policy definitions exist to enforce inherited tags; you can check them all in Assign policy definitions for tag compliance.

  • Don't use tags for policy filtering: If you use Azure Policy, it's highly recommended not to use tag filtering in your policy rules when the policy relates to a security setting. When you filter by tags, resources without the tag appear as Compliant. Azure Policy exemptions or Azure Policy exclusions are recommended instead.

  • Don't use tags to fill naming convention gaps: Tags are not a replacement for naming conventions. Use tags to categorize resources, and use naming conventions to identify resources uniquely.

  • Use tags for automation: Use tags to trigger automation tasks, such as scaling, backup, or monitoring. You can use tags to define policies that enforce specific actions based on resource categories.

  • Don't go crazy adding tags: Don't add too many tags to your resources. Keep it simple and use tags that are meaningful and useful. Too many tags can make it difficult to manage. You can begin with a small set of tags and expand as needed, for example: Minimum Suggested Tags

  • Not all Azure services support tags: Keep in mind that not all Azure services support tags. You can check in the Tag support for Azure resources to see which services support tags.
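To make some of these practices concrete, here is a hedged sketch using the Azure CLI; the resource names, tag names, and values are illustrative placeholders, and the commands require an authenticated session against a real subscription:

```shell
# Apply a standard tag set at creation time, not afterwards
az group create --name rg-demo --location eastus \
  --tags Environment=Development CostCenter=CC1234 Owner=platform-team

# Tag a resource inside the group with the same standard set
az storage account create --name stdemo20240601 --resource-group rg-demo \
  --sku Standard_LRS \
  --tags Environment=Development CostCenter=CC1234

# List all resources that carry a given tag, e.g. for a cost review
az resource list --tag CostCenter=CC1234 --output table
```

Filtering by tag with `az resource list --tag` is also a quick way to spot resources that are missing a tag from your standard set.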

Conclusion

By using tags, you can organize and manage your resources more effectively, track and manage costs more efficiently, and automate tasks based on resource categories. I hope this post has given you a good introduction to tagging best practices in Azure and how you can use tags to optimize your cloud environment.

Restrict managed disks from being imported or exported

In this post, I will show you how to restrict managed disks from being imported or exported in Azure.

What are managed disks?

Azure Managed Disks are block-level storage volumes that are managed by Azure and used with Azure Virtual Machines. Managed Disks are designed for high availability and durability, and they provide a simple and scalable way to manage your storage.

If you don't know anything about Azure Managed Disks, grab a cup of coffee (it will take you a while) and read the official documentation to learn more about them.

Why restrict managed disks from being imported or exported?

There are several reasons to restrict managed disks from being imported or exported:

  • Security: By restricting managed disks from being imported or exported, you can reduce the risk of unauthorized access to your data.
  • Compliance: By restricting managed disks from being imported or exported, you can help ensure that your organization complies with data protection regulations.

How to restrict managed disks from being imported or exported

At deployment time

An example with azcli:

Create a managed disk with public network access disabled
# Create a managed disk with public network access disabled
az disk create --resource-group myResourceGroup --name myDisk --size-gb 128 --location eastus --sku Standard_LRS --no-wait --public-network-access Disabled
Create a managed disk with public network access disabled and private endpoint enabled

Follow Azure CLI - Restrict import/export access for managed disks with Private Links
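For disks that already exist, the same restriction can be applied after the fact. A minimal sketch, assuming placeholder names (verify the flags against your Azure CLI version):

```shell
# Disable public network access on an existing managed disk
az disk update --resource-group myResourceGroup --name myDisk \
  --public-network-access Disabled

# Or deny all import/export access to the disk entirely
az disk update --resource-group myResourceGroup --name myDisk \
  --network-access-policy DenyAll
```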

At Scale

If you want to restrict managed disks from being imported or exported, you can use Azure Policy to enforce this restriction. Azure Policy is a service in Azure that you can use to create, assign, and manage policies that enforce rules and effects over your resources. By using Azure Policy, you can ensure that your resources comply with your organization's standards and service-level agreements.

To restrict managed disks from being imported or exported using Azure Policy, you can use or create a policy definition that specifies the conditions under which managed disks can be imported or exported. You can then assign this policy definition to a scope, such as a management group, subscription, or resource group, to enforce the restriction across your resources.

In this case we have a Built-in policy definition that restricts managed disks from being imported or exported Configure disk access resources with private endpoints
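As a sketch of how such an assignment could look with the Azure CLI (the policy definition name must be looked up first; the scope and names below are placeholders):

```shell
# Find the built-in definition by display name
az policy definition list \
  --query "[?contains(displayName, 'disk access')].{name:name, displayName:displayName}" \
  --output table

# Assign it at subscription scope, replacing <definition-name> and <subscription-id>
az policy assignment create --name restrict-disk-import-export \
  --policy "<definition-name>" \
  --scope "/subscriptions/<subscription-id>"
```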

Conclusion

In this post, I showed you how to restrict managed disks from being imported or exported in Azure. By restricting managed disks from being imported or exported, you can reduce the risk of unauthorized access to your data and help ensure that your organization complies with data protection regulations.

Curiously, restricting managed disks from being imported or exported is not a compliance check in the Microsoft cloud security benchmark, but it's a good practice to follow.

Microsoft Entra ID Privileged Identity Management

In this post, I will show you how to use Microsoft Entra ID Privileged Identity Management (PIM) to manage and monitor access to privileged roles in your organization.

What is Microsoft Entra ID Privileged Identity Management?

Microsoft Entra ID Privileged Identity Management (PIM) is a service that helps you manage, control, and monitor access within your organization. It provides just-in-time privileged access to Azure AD and Azure resources, along with access reviews and monitoring capabilities.

Reasons to Use Microsoft Entra ID PIM

There are several reasons to use Microsoft Entra ID PIM:

  • Security: Microsoft Entra ID PIM helps you reduce the risk of unauthorized access to privileged roles.
  • Compliance: Microsoft Entra ID PIM helps you meet compliance requirements by providing access reviews and monitoring capabilities.
  • Efficiency: Microsoft Entra ID PIM provides just-in-time access to privileged roles, reducing the risk of unauthorized access.
  • Monitoring: Microsoft Entra ID PIM provides monitoring and alerts that help you detect and respond to suspicious activities.
  • Traceability: Microsoft Entra ID PIM provides an audit trail of all access requests and approvals, helping you track who has access to privileged roles.

Maybe you can't use Microsoft Entra ID PIM to log all changes that users make in the system, but you can use Actor Change for that.

Key Features of Microsoft Entra ID PIM

Microsoft Entra ID PIM offers several key features that help you manage and monitor access to privileged roles:

  • Provide just-in-time privileged access to Microsoft Entra ID and Azure resources.
  • Assign time-bound access to resources using start and end dates.
  • Require approval to activate privileged roles.
  • Enforce multifactor authentication to activate any role.
  • Use justification to understand why users activate.
  • Get notifications when users activate privileged roles.
  • Conduct access reviews to ensure users still need roles.
  • Download audit history for internal or external audit.
  • Prevent removal of the last active Global Administrator and Privileged Role Administrator role assignments.

How to Use Microsoft Entra ID PIM

We can manage three types of "things" in Microsoft Entra ID PIM:

  • Microsoft Entra roles. Microsoft Entra roles are also referred to as directory roles. They include built-in and custom roles to manage Microsoft Entra ID and other Microsoft 365 online services.
  • PIM for Groups. To set up just-in-time access to the member and owner role of a Microsoft Entra security group. PIM for Groups provides an alternative way to set up PIM for Microsoft Entra roles and Azure roles. It also allows you to set up PIM for other permissions across Microsoft online services like Intune, Azure Key Vaults, and Microsoft Entra ID Protection.
  • Azure roles. Azure's role-based access control roles that grant access to management groups, subscriptions, resource groups, and resources.

Microsoft Entra ID roles

Note

If you want to use groups, you need to create eligible groups; for now, the maximum number of eligible groups is 500.

In the Microsoft Entra ID portal, you can manage Microsoft Entra ID roles by selecting the "Roles" tab. Here you can see all the roles that are available to you, as well as the roles that you have activated.

You can take the following actions:

  • Assign: Create assignments for Microsoft Entra ID roles, update existing assignments to ensure that users have just-in-time access to privileged roles.
  • Activate: Activate your eligible assignments to get just-in-time access to privileged roles.
  • Approve: List, approve, or deny activation requests for Microsoft Entra ID roles.
  • Audit: View and export a history of all assignments and activations done in PIM so you can identify attacks and stay compliant.

Ok, let's see how to assign a role to a group.

In the "Roles" tab, click on the role that you want to assign.

In Role settings, edit the settings for the role, modify Activation, Assignment, and Notification settings, and then click "Update". These are the default settings:

Activation

Setting | State
--- | ---
Activation maximum duration (hours) | 8 hour(s)
On activation, require | None
Require justification on activation | Yes
Require ticket information on activation | No
Require approval to activate | No
Approvers | None

Assignment

Setting | State
--- | ---
Allow permanent eligible assignment | Yes
Expire eligible assignments after | -
Allow permanent active assignment | Yes
Expire active assignments after | -
Require Azure Multi-Factor Authentication on active assignment | No
Require justification on active assignment | Yes

Send notifications when members are assigned as eligible to this role:

Type | Default recipients | Additional recipients | Critical emails only
--- | --- | --- | ---
Role assignment alert | Admin | None | False
Notification to the assigned user (assignee) | Assignee | None | False
Request to approve a role assignment renewal/extension | Approver | None | False

Send notifications when members are assigned as active to this role:

Type | Default recipients | Additional recipients | Critical emails only
--- | --- | --- | ---
Role assignment alert | Admin | None | False
Notification to the assigned user (assignee) | Assignee | None | False
Request to approve a role assignment renewal/extension | Approver | None | False

Send notifications when eligible members activate this role:

Type | Default recipients | Additional recipients | Critical emails only
--- | --- | --- | ---
Role activation alert | Admin | None | False
Notification to activated user (requestor) | Requestor | None | False
Request to approve an activation | Approver | None | False

In the "Assignments" tab, click on "Add assignments". In Membership, select the scope type: Application, Device, Directory, Group, Service Principal or User (it depends on each role) and select the identity that you want to assign the role to. We are going to select Directory and an eligible group, and then click "Next".

Info

App registrations are supported in PIM for Microsoft Entra ID roles but only in active assignments.

In the "Settings" tab, you can set the Assignment type: Eligible (I need to do something to activate) or Active (I don't need to do anything to activate my role). You can also configure whether the duration of the assignment is permanent or whether it expires after a certain period of time. Once you're done, click "Assign".

You can set the activation settings, assignment settings, and notification settings for the assignment. Once you're done, click "Assign".

PIM for Groups

In the Microsoft Entra ID portal, you can manage PIM for Groups by selecting the "Groups" tab.

First you need to discover the groups that you want to manage. In the "Groups" tab, click on "Discover groups". In the "Discover groups" pane, you can search for groups by name. Once you've found the group you want to manage, click on "Add to PIM".

Info

You must be an owner of the group or have a Microsoft Entra role that allows you to discover and manage groups in PIM.

Now, in the "Groups" tab, you can see all the groups that you have discovered.

In the "Groups" tab, click on the group that you want to manage. You can take the following actions under Manage:

  • Assign: Create assignments for the member and owner role of the group, update existing assignments to ensure that users have just-in-time access to privileged roles.
  • Activate: Activate your eligible assignments to get just-in-time access to privileged roles.
  • Approve: List, approve, or deny activation requests for the member and owner role of the group.
  • Audit: View and export a history of all assignments and activations done in PIM so you can identify attacks and stay compliant.

Ok, let's see how to assign a role to a group.

In the "Groups" tab, click on the group that you want to assign a role to.

In Role settings, edit the settings for the role, modify Activation, Assignment, and Notification settings for members or users, these settings are the same that we saw in the Microsoft Entra ID roles section.

In the "Assignments" tab, click on "Add assignments". In Select role, select the scope type: Member or Owner, and in Select members select the identity or group that you want to assign the role to; if you select a group, it doesn't need to be an eligible group. Then click "Next".

In the "Settings" tab, you can set the Assignment type: Eligible (I need to do something to activate) or Active (I don't need to do anything to activate my role). You can also configure whether the duration of the assignment is permanent or whether it expires after a certain period of time. Once you're done, click "Assign".

Info

App registrations are supported in PIM for Groups but only in active assignments. I think this is very useful for IaC, for example, because you can assign a role to a service principal and activate it when you need it.

Example of a sequence diagram of the activation process for a group:

sequenceDiagram
    participant User
    participant PIM
    participant Approver
    participant MFA
    participant Group

    User->>PIM: Logs in
    User->>PIM: Navigates to PIM and selects "My roles"
    User->>PIM: Selects eligible group
    User->>PIM: Requests activation
    PIM->>MFA: Requests multi-factor authentication
    MFA-->>User: MFA request
    User->>MFA: Provides MFA credentials
    MFA-->>PIM: MFA confirmation
    PIM->>User: Requests justification
    User->>PIM: Provides justification
    PIM->>Approver: Sends approval request
    Approver-->>PIM: Approval
    PIM-->>User: Activation confirmation
    PIM->>Group: Activates temporary membership
    Group-->>PIM: Activation confirmation
    PIM-->>User: Access granted

Azure roles

In the Microsoft Entra ID portal, you can manage Azure roles by selecting the "Azure resources" tab. Here you can see all the Azure roles that are available to you by scope, as well as the roles that you have activated.

First you need to select the Azure resource that you want to manage: Management groups, Subscriptions, Resource groups or Resources and follow Manage Resource.

Once here, the same as in the other sections, you can take the following actions for this scope:

  • Assign: Create assignments for Azure roles at this scope, update existing assignments to ensure that users have just-in-time access to privileged roles.
  • Activate: Activate your eligible assignments to get just-in-time access to privileged roles.
  • Approve: List, approve, or deny activation requests for Azure roles at this scope.
  • Audit: View and export a history of all assignments and activations done in PIM so you can identify attacks and stay compliant.

Ok, let's see how to assign a role to an identity in a scope.

Info

We already have a scope selected

In the "Assignments" tab, click on "Add assignments". In Select role, select the role that you want to assign, and in Select members select the identity or group that you want to assign the role to; if you select a group, it doesn't need to be an eligible group. Then click "Next".

In the "Settings" tab, you can set the Assignment type: Eligible (I need to do something to activate) or Active (I don't need to do anything to activate my role). You can also configure whether the duration of the assignment is permanent or whether it expires after a certain period of time. Once you're done, click "Assign".

Info

App registrations are supported in PIM for Azure roles but only in active assignments. I think this is very useful for IaC, for example, because you can assign a role to a service principal and activate it when you need it.

Don't forget to review the settings for the role: modify Activation, Assignment, and Notification settings for members or users; these settings are the same as those we saw in the Microsoft Entra ID roles section.

Conclusion

Microsoft Entra ID Privileged Identity Management (PIM) is a powerful service that helps you manage and monitor access to privileged roles in your organization. By using Microsoft Entra ID PIM, you can reduce the risk of unauthorized access, meet compliance requirements, and improve the efficiency of your organization. I hope this post has helped you understand how to use Microsoft Entra ID PIM and how it can benefit your organization.

If you want to play with Microsoft Entra ID PIM, you can test it on a new tenant, or on your current one if you haven't used it before.

Securely connect Power BI to data sources with a VNet data gateway

In this post, I will show you how to securely connect Power BI to your Azure data services using a Virtual Network (VNet) data gateway.

What is a Virtual Network (VNet) data gateway?

The virtual network (VNet) data gateway helps you to connect from Microsoft Cloud services to your Azure data services within a VNet without the need of an on-premises data gateway. The VNet data gateway securely communicates with the data source, executes queries, and transmits results back to the service.

The Role of a VNet Data Gateway

A VNet data gateway acts as a bridge that allows for the secure flow of data between the Power Platform and external data sources that reside within a virtual network. This includes services such as SQL databases, file storage solutions, and other cloud-based resources. The gateway ensures that data can be transferred securely and reliably, without exposing the network to potential threats or breaches.

How It Works

graph LR
    User([User]) -->|Semantic Models| SM[Semantic Models]
    User -->|"Dataflows (Gen2)"| DF["Dataflows (Gen2)"]
    User -->|Paginated Reports| PR[Paginated Reports]
    SM --> PPVS[Power Platform VNET Service]
    DF --> PPVS
    PR --> PPVS
    PPVS --> MCVG["Managed Container for VNet Gateway"]
    MCVG -->|Interfaces with| SQLDB[(SQL Databases)]
    MCVG -->|Interfaces with| FS[(File Storage)]
    MCVG -->|Interfaces with| CS[(Cloud Services)]
    MCVG -.->|Secured by| SEC{{Security Features}}
    subgraph VNET_123
        SQLDB
        FS
        CS
        SEC
    end
    classDef filled fill:#f96,stroke:#333,stroke-width:2px;
    classDef user fill:#bbf,stroke:#f66,stroke-width:2px,stroke-dasharray: 5, 5;
    class User user
    class SM,DF,PR,PPVS,MCVG,SQLDB,FS,CS,SEC filled

The process begins with a user leveraging Power Platform services like Semantic Models, Dataflows (Gen2), and Paginated Reports. These services are designed to handle various data-related tasks, from analysis to visualization. They connect to the Power Platform VNET Service, which is the heart of the operation, orchestrating the flow of data through the managed container for the VNet gateway.

This managed container is a secure environment specifically designed for the VNet gateway’s operations. It’s isolated from the public internet, ensuring that the data remains protected within the confines of the virtual network. Within this secure environment, the VNet gateway interfaces with the necessary external resources, such as SQL databases and cloud storage, all while maintaining strict security protocols symbolized by the padlock icon in our diagram.

If you need to connect to services on other VNets, you can use VNet peering to connect the VNets, and you can reach on-premises resources using ExpressRoute or other VPN solutions.

The Benefits

By utilizing a VNet data gateway, organizations can enjoy several benefits:

  • Enhanced Security: The gateway provides a secure path for data, safeguarding sensitive information and complying with organizational security policies.
  • Network Isolation: The managed container and the virtual network setup ensure that the data does not traverse public internet spaces, reducing exposure to vulnerabilities.
  • Seamless Integration: The gateway facilitates smooth integration between Power Platform services and external data sources, enabling efficient data processing and analysis.

Getting Started

To set up a VNet data gateway, follow these steps:

Register Microsoft.PowerPlatform as a resource provider

Before you can create a VNet data gateway, you need to register the Microsoft.PowerPlatform resource provider. This can be done using the Azure portal or the Azure CLI.

az provider register --namespace Microsoft.PowerPlatform
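Registration can take a few minutes; as a quick check, you can query the provider's state:

```shell
# Prints "Registered" once the provider registration has completed
az provider show --namespace Microsoft.PowerPlatform \
  --query registrationState --output tsv
```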

Associate the subnet to Microsoft Power Platform

Create a VNet in your Azure subscription and a subnet where the VNet data gateway will be deployed. Next, you need to delegate the subnet to the service Microsoft.PowerPlatform/vnetaccesslinks.

Note

  • This subnet can't be shared with other services.
  • Five IP addresses are reserved in the subnet for basic functionality. You need to reserve additional IP addresses for the VNet data gateway, and add more IPs for future gateways.
  • You need a role with the Microsoft.Network/virtualNetworks/subnets/join/action permission

This can be done using the Azure portal or the Azure CLI.

# Create a VNet and address prefix 10.0.0.0/24
az network vnet create --name MyVNet --resource-group MyResourceGroup --location eastus --address-prefixes 10.0.0.0/24


# Create a Network Security Group
az network nsg create --name MyNsg --resource-group MyResourceGroup --location eastus

# Create a subnet with delegation to Microsoft.PowerPlatform/vnetaccesslinks and associate the NSG
az network vnet subnet create --name MySubnet --vnet-name MyVNet --resource-group MyResourceGroup --address-prefixes 10.0.0.0/27 --network-security-group MyNsg --delegations Microsoft.PowerPlatform/vnetaccesslinks

Create a VNet data gateway

Note

A Microsoft Power Platform user with the Microsoft.Network/virtualNetworks/subnets/join/action permission on the VNet is required. The Network Contributor role is not necessary.

  1. Sign in to the Power BI homepage.
  2. In the top navigation bar, select the settings gear icon on the right.
  3. From the drop down, select the Manage connections and gateways page, in Resources and extensions.
  4. Select Create a virtual network data gateway.
  5. Select the license capacity, subscription, resource group, VNet and the Subnet. Only subnets that are delegated to Microsoft Power Platform are displayed in the drop-down list. VNet data gateways require a Power BI Premium capacity license (A4 SKU or higher, or any P SKU) or a Fabric license (any SKU).
  6. By default, we provide a unique name for this data gateway, but you could optionally update it.
  7. Select Save. This VNet data gateway is now displayed in your Virtual network data gateways tab. A VNet data gateway is a managed gateway that can be used to control access to this resource for Power Platform users.

Conclusion

The VNet data gateway is a powerful tool that enables secure data transfer between the Power Platform and external data sources residing within a virtual network. By leveraging this gateway, organizations can ensure that their data remains protected and compliant with security standards, all while facilitating seamless integration and analysis of data. If you are looking to enhance the security and reliability of your data connections, consider implementing a VNet data gateway in your environment.

Implementing policy as code with Open Policy Agent

In this post, I will show you how to implement policy as code with Open Policy Agent (OPA) and Azure.

What is Open Policy Agent?

Open Policy Agent (OPA) is an open-source, general-purpose policy engine that enables you to define and enforce policies across your cloud-native stack. OPA provides a high-level declarative language called Rego that you can use to write policies that are easy to understand and maintain.

Why use Open Policy Agent?

There are several reasons to use OPA:

  • Consistency: OPA allows you to define policies in a single place and enforce them consistently across your cloud-native stack.
  • Flexibility: OPA provides a flexible policy language that allows you to define policies that are tailored to your specific requirements.
  • Auditability: OPA provides a transparent and auditable way to enforce policies, making it easy to understand why a policy decision was made.
  • Integration: OPA integrates with a wide range of cloud-native tools and platforms, making it easy to enforce policies across your entire stack.

Getting started with Open Policy Agent

To get started with OPA, you need to install the OPA CLI and write some policies in Rego.

You can install the OPA CLI by downloading the binary from the OPA GitHub releases page, you can check the installation guide for more details.

Once you have installed the OPA CLI, you can write policies in Rego. Rego is a high-level declarative language that allows you to define policies in a clear and concise way.

Here's a simple example of a policy that enforces a naming convention for Azure resources:

package azure.resources

default allow = false

allow {
    input.resource.type == "Microsoft.Compute/virtualMachines"
    input.resource.name == "my-vm"
}

This policy allows resources of type Microsoft.Compute/virtualMachines with the name my-vm. You can write more complex policies that enforce a wide range of requirements, such as resource tagging, network security, and access control.
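To see the policy in action, you can evaluate it locally with the OPA CLI; the file names below are illustrative, and the policy is assumed to be saved as policy.rego:

```shell
# Sample input describing the resource under evaluation
cat > input.json <<'EOF'
{
  "resource": {
    "type": "Microsoft.Compute/virtualMachines",
    "name": "my-vm"
  }
}
EOF

# Evaluate the allow rule against the input; this input satisfies both conditions
opa eval --data policy.rego --input input.json "data.azure.resources.allow"
```

Changing either the type or the name in input.json makes the rule body fail, so allow falls back to its default of false.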

Integrating Open Policy Agent with Azure

To integrate OPA with Azure, you can use the Azure Policy service, which allows you to define and enforce policies across your Azure resources. You can use OPA to define custom policies that are not supported by Azure Policy out of the box, or to enforce policies across multiple cloud providers.

Conclusion

Open Policy Agent is a powerful tool that allows you to define and enforce policies across your cloud-native stack. By using OPA, you can ensure that your infrastructure is secure, compliant, and consistent, and that your policies are easy to understand and maintain. I hope this post has given you a good introduction to OPA and how you can use it to implement policy as code in your cloud-native environment.

Additional resources

I have created a GitHub repository with some examples of policies written in Rego that you can use as a starting point for your own policies.

FinOps for Azure: Optimizing Your Cloud Spend

In the cloud era, optimizing cloud spend has become a critical aspect of financial management. FinOps, a set of financial operations practices, empowers organizations to get the most out of their Azure investment. This blog post dives into the core principles of FinOps, explores the benefits it offers, and outlines practical strategies for implementing FinOps in your Azure environment.

Understanding the Cloud Cost Challenge

Traditional IT expenditure followed a capital expenditure (capex) model, where businesses purchased hardware and software upfront. Cloud computing introduces a paradigm shift with the operational expenditure (opex) model. Here, businesses pay for resources as they consume them, leading to variable and unpredictable costs.

FinOps tackles this challenge by providing a framework for managing cloud finances. It encompasses three key pillars:

  • People: Establish a FinOps team or designate individuals responsible for overseeing cloud costs. This team should possess a blend of cloud technical expertise, financial acumen, and business process knowledge.
  • Process: Define processes for budgeting, forecasting, and monitoring cloud expenses. This involves setting spending limits, creating chargeback models for different departments, and regularly reviewing cost reports.
  • Tools: Leverage Azure Cost Management, a suite of tools that provides granular insights into your Azure spending. It enables cost allocation by resource, service, department, or any other relevant dimension.

It's essential to adopt a FinOps mindset that encourages collaboration between finance, IT, and business teams to drive cost efficiency and value realization in the cloud.

It's important to note that FinOps is not just about cost-cutting; it's about optimizing cloud spending to align with business objectives and maximize ROI.

Azure Cost Management: Optimizing Your Azure Spending

Azure Cost Management empowers you to analyze your Azure spending patterns and identify cost-saving opportunities. Here's a glimpse into its key functionalities:

  • Cost Views: Generate comprehensive reports that categorize your Azure spending by various attributes like resource group, service, or department.
  • Cost Alerts: Set up proactive alerts to notify you when your spending exceeds predefined thresholds.
  • Reservations: Purchase reserved instances of frequently used Azure resources for significant upfront discounts.
  • Recommendations: Azure Cost Management analyzes your usage patterns and recommends potential cost-saving measures, such as rightsizing resources or leveraging spot instances.
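Budgets and alerts can also be scripted. A hedged sketch with the Azure CLI consumption commands (the name, amount, and dates are illustrative, and the command may require the consumption extension):

```shell
# Create a monthly cost budget of 1000 (in the billing currency) at subscription scope
az consumption budget create --budget-name monthly-cloud-budget \
  --amount 1000 --category cost --time-grain monthly \
  --start-date 2024-06-01 --end-date 2025-06-01
```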

The Power of Tags and Azure Policy

Tags are metadata labels that you can attach to your Azure resources to categorize and track them effectively. They play a pivotal role in FinOps by enabling you to:

  • Associate costs with specific departments, projects, or applications.
  • Identify unused or underutilized resources for potential cost savings.
  • Simplify cost allocation and chargeback processes.
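The chargeback idea above reduces to grouping billed costs by a tag value. Here is a minimal sketch in plain Python; the resource records and the `costCenter` tag name are made up for the example:

```python
from collections import defaultdict

# Hypothetical billing records: (resource name, tags, monthly cost in USD).
billing = [
    ("vm-web-01",  {"costCenter": "marketing"}, 120.0),
    ("sql-db-01",  {"costCenter": "finance"},   300.0),
    ("vm-web-02",  {"costCenter": "marketing"},  95.0),
    ("storage-01", {},                           40.0),  # untagged resource
]

def chargeback_by_tag(records, tag: str) -> dict[str, float]:
    """Sum costs per tag value; untagged resources land in 'unallocated'."""
    totals: dict[str, float] = defaultdict(float)
    for _, tags, cost in records:
        totals[tags.get(tag, "unallocated")] += cost
    return dict(totals)

print(chargeback_by_tag(billing, "costCenter"))
# {'marketing': 215.0, 'finance': 300.0, 'unallocated': 40.0}
```

Note how the untagged resource falls into an "unallocated" bucket — exactly the gap that consistent tagging (and the policy enforcement described next) is meant to close.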

Azure Policy helps enforce tagging standards and governance rules across your Azure environment. You can define policies that mandate specific tags for all resources, ensuring consistent cost allocation and data accuracy.
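To illustrate what such a tagging policy enforces, here is a small sketch in plain Python (not actual Azure Policy syntax) of the rule "flag any resource missing the required tags"; the tag names are example choices, not a prescribed standard:

```python
# Sketch of the rule an Azure Policy "required tags" assignment enforces:
# a resource is non-compliant if any mandated tag is missing.
REQUIRED_TAGS = {"costCenter", "environment", "owner"}  # example standard

def missing_tags(resource_tags: dict) -> set[str]:
    """Return the required tags a resource lacks (empty set = compliant)."""
    return REQUIRED_TAGS - resource_tags.keys()

def is_compliant(resource_tags: dict) -> bool:
    return not missing_tags(resource_tags)

print(is_compliant({"costCenter": "finance", "environment": "prod", "owner": "ops"}))
print(missing_tags({"costCenter": "finance"}))
```

In practice you would express this as a policy definition with a `deny` or `modify` effect and assign it at the subscription or management group scope, so the check runs at deployment time rather than after the fact.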

Benefits of Implementing FinOps

A well-defined FinOps strategy empowers you to:

  • Gain Visibility into Cloud Spending: Obtain a clear picture of your Azure expenditures, enabling informed budgeting and cost control decisions.
  • Optimize Cloud Costs: Identify and eliminate wasteful spending through cost-saving recommendations and proactive measures.
  • Improve Cloud Governance: Enforce tagging policies and spending limits to ensure responsible cloud resource utilization.
  • Align Cloud Spending with Business Value: Make data-driven decisions about cloud investments that support your business objectives.

Getting Started with FinOps in Azure

Implementing FinOps doesn't necessitate a complex overhaul. Here's a recommended approach:

  1. Establish a FinOps Team: Assemble a cross-functional team comprising representatives from finance, IT, and business departments.
  2. Set Clear Goals and Objectives: Define your FinOps goals, whether it's reducing costs by a specific percentage or improving budget forecasting accuracy.
  3. Leverage Azure Cost Management: Start by exploring Azure Cost Management to understand your current spending patterns.
  4. Implement Basic Tagging Standards: Enforce basic tagging policies to categorize your Azure resources for cost allocation purposes.
  5. Continuously Monitor and Refine: Regularly review your cloud cost reports and identify areas for further optimization.
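One concrete form the monitor-and-refine loop can take is a simple month-end projection: extrapolate month-to-date spend and compare it with the budget. A naive linear sketch in Python, with invented figures (real forecasting, as in Azure Cost Management, uses richer models):

```python
# Naive month-end forecast: project month-to-date spend linearly over the
# full month, then compare with the budget. This just illustrates the
# review loop, not Azure's actual forecasting model.

def forecast_month_end(spend_to_date: float, day_of_month: int,
                       days_in_month: int = 30) -> float:
    """Linear projection of total spend for the month."""
    daily_rate = spend_to_date / day_of_month
    return daily_rate * days_in_month

spend = 4_500.0   # hypothetical spend after 15 days
budget = 10_000.0
projected = forecast_month_end(spend, day_of_month=15)
print(projected)            # 9000.0
print(projected <= budget)  # True: on track for this budget
```

Even a crude projection like this, reviewed regularly, surfaces overruns early enough to act on — which is the point of step 5.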

By following these steps and embracing a FinOps culture, you can effectively manage your Azure expenses and derive maximum value from your cloud investment.

Toolchain for FinOps in Azure

To streamline your FinOps practices in Azure, consider leveraging the following tools:

graph LR
A[Financial Operations Practices] --> B{Cloud Spend Optimization}
B --> C{Cost Visibility}
B --> D{Cost Optimization}
B --> E{Governance}
C --> F{Azure Cost Management}
D --> G{Azure Advisor}
E --> H{Azure Policy}
F --> I{Cost Views}
F --> J{Cost Alerts}
G --> K{Cost Recommendations}
H --> L{Tag Enforcement}

This toolchain combines Azure Cost Management, Azure Advisor, and Azure Policy to provide a comprehensive suite of capabilities for managing your Azure spending effectively.

I highly recommend checking out the FinOps toolkit, a set of tools and best practices to help you implement FinOps in your organization. It includes tools for cost allocation, budgeting, and forecasting, as well as guidance for FinOps implementation.

Conclusion

FinOps is an essential practice for organizations leveraging Azure. It empowers you to make informed decisions about your cloud finances, optimize spending, and achieve your business goals. As an Azure Solutions Architect, I recommend that you establish a FinOps practice within your organization to unlock the full potential of Azure and achieve financial efficiency in the cloud.

This blog post provides a foundational understanding of FinOps in Azure.

By embracing FinOps, you can transform your cloud cost management practices and drive business success in the cloud era.

References