Applies to BloodHound Enterprise only

The Microsoft Sentinel integration for BloodHound Enterprise enables security teams to ingest attack path data, audit logs, posture trends, and Tier Zero asset exposure into Microsoft Sentinel for centralized monitoring, investigation, and response. It's available as a data connector that can be deployed to your Azure environment, with pre-built workbooks (dashboards) and analytics rules to visualize and act on the data.

Roles and permissions

To successfully deploy and use the Microsoft Sentinel integration, different Azure roles and permissions are required for various personas involved in the process. The following table outlines the key roles, their responsibilities, and the required permissions for each role:
Installer
  Responsibilities:
  • Deploy the Microsoft Sentinel solution resources from the Azure Marketplace
  Required permissions:
  • Subscription Owner on the target subscription

Admin
  Responsibilities:
  • Manage data connectors, including enable/disable actions and authentication settings
  • Maintain and troubleshoot the integration, including parameters, playbooks, workbooks, and analytics rules
  Required permissions:
  • Microsoft Sentinel Contributor on the Log Analytics workspace
  • Log Analytics Contributor to manage queries, tables, and saved searches
  • Contributor on the target resource group
  Optional:
  • User Access Administrator to assign RBAC roles when needed
  • Contributor at a broader scope for management of underlying Azure resources

User
  Responsibilities:
  • Use the deployed solution in daily operations
  • View dashboards, alerts, incidents, and workbooks
  Required permissions:
  • Microsoft Sentinel Reader for view-only access to incidents and workbooks
  • Log Analytics Reader for read-only access to logs and query results
  • Microsoft Sentinel Responder if the user needs to update incident status, assign incidents, or run playbooks

Prerequisites

Before you begin the installation and configuration process, ensure the following prerequisites are met:
  • Active Azure subscription with permissions to deploy resources
  • Microsoft Sentinel workspace (Log Analytics Workspace) in a target resource group
  • BloodHound Enterprise tenant
  • BloodHound Enterprise non-personal API key/ID pair
  • Microsoft Entra ID application with the Monitoring Metrics Publisher role on the target resource group

Configure the integration

Follow the steps below to deploy and configure the Microsoft Sentinel integration for BloodHound Enterprise. This process involves deploying Azure resources, configuring authentication, and setting up data ingestion.
Step 1: Create a Log Analytics Workspace

Create a Log Analytics Workspace to store the data ingested from BloodHound Enterprise. This workspace will be connected to Microsoft Sentinel for monitoring and analysis.
  1. Log in to the Azure Portal with an account that has the necessary permissions for Microsoft Sentinel and Log Analytics Workspace configurations.
  2. Navigate to the Log Analytics Workspace and click Create.
  3. Select subscription and resource group, then enter a workspace name.
    Create Log Analytics Workspace
  4. Click Review + create, then Create.
  5. Add the Log Analytics Workspace in Sentinel:
    1. Navigate to Sentinel.
    2. Click Create.
    3. Select the newly created Log Analytics Workspace.
    4. Click Add.
Step 2: Register a Microsoft Entra ID application

Register a Microsoft Entra ID application to authenticate the data connector with Microsoft Sentinel. This application will be granted the necessary permissions to publish data to Sentinel.
  1. Open Microsoft Entra ID.
  2. Go to App registrations > New registration.
  3. Enter an app name and choose Accounts in this organizational directory only. No redirect URI is necessary.
  4. Click Register.
  5. Copy the Application (client) ID and Directory (tenant) ID. You’ll need these later.
  6. Under Certificates & secrets, create a client secret and save its value immediately. It will not be shown again.
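
The connector uses this registration for an OAuth 2.0 client-credentials grant against the Microsoft identity platform. As a minimal sketch of the token request shape, assuming the Azure Monitor ingestion scope `https://monitor.azure.com/.default` (verify the scope your deployed connector actually requests):

```python
def client_credentials_request(tenant_id: str, client_id: str,
                               client_secret: str) -> tuple[str, dict]:
    """Return the v2.0 token endpoint URL and form body for an
    OAuth 2.0 client-credentials grant scoped to Azure Monitor ingestion."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,          # Application (client) ID from step 5
        "client_secret": client_secret,  # Client secret value from step 6
        # ".default" requests all application permissions already granted
        "scope": "https://monitor.azure.com/.default",
    }
    return url, body
```

POSTing this body as form data to the returned URL yields a bearer token the connector can present to the Data Collection Endpoint.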
Step 3: Assign the required Azure role to the app

Assign the Monitoring Metrics Publisher role to the Microsoft Entra ID application on your resource group:
  1. Open your resource group.
  2. Go to Access control (IAM) > Add role assignment.
  3. On the Role tab, select Monitoring Metrics Publisher.
  4. On the Members tab, choose User, group, or service principal, then click Select members.
  5. Search for and select the application you registered, then click Select.
  6. Click Review + assign.
Step 4: Deploy the workbook and analytics rules template

Before starting the deployment, go to the Log Analytics Workspace you created and note the name and location of the workspace. You will need this during deployment.
  1. Click the following link to open a preloaded ARM template in the Azure Portal: Deploy to Azure.
  2. Confirm the template opens on the Custom deployment page.
    Customize deployment template
Step 5: Configure deployment parameters

Fill in the deployment parameters for the workbook and analytics rules template using the information from previous steps and your environment.
  1. Select the target subscription and resource group, then enter deployment parameters such as workspace name and workspace location.
  2. Click Review + create.
  3. Click Create to deploy the workbook and data connector resources.
Step 6: Verify the workbook and analytics rules deployment

Verify the workbook and analytics rules deployment before deploying the data connector template:
  1. In the Azure Portal, open Microsoft Sentinel and select the workspace where you deployed the template.
  2. Go to Workbooks under Threat management.
  3. If prompted to continue in Microsoft Defender, select the link to open Microsoft Defender portal.
    Microsoft Defender Portal
  4. If multiple Sentinel workspaces are available, select the integration workspace from the workspace selector in the top-right corner.
    Select Sentinel Workspace
    Microsoft Defender Portal Workspace Selector
  5. In Workbooks, open the Templates tab and verify the BloodHound workbook templates are available.
    Verify Workbooks
  6. Go to Configuration > Analytics > Rule templates and verify the BloodHound analytics rules are available.
    Verify Analytics Rules
  7. Go to Configuration > Data connectors and verify the BloodHound Data Connector is listed. It will show as connected once the data connector is deployed and ingesting data.
    Verify Data Connector
Step 7: Deploy the data connector template

After deploying the workbook and analytics rules template, configure the data connector with your BloodHound Enterprise API credentials and settings.
  1. Log in to the Azure Portal with an account that has the Owner role on the resource group.
  2. Click the following link to open a preloaded ARM template in the Azure Portal: Deploy to Azure.
  3. Confirm the template opens on the Custom deployment page.
    Customize deployment template
Step 8: Configure data connector parameters

Fill in the deployment parameters:
  • Subscription: The Azure subscription to deploy the resources to.
  • Resource Group: The name of the resource group where the resources will be deployed.
  • Function App Name: The name of the Azure Function App. This must be unique across Azure, since each instance requires its own Function App (for example, BloodHoundEnterprise-Maple).
  • Log Analytics Workspace Name: The name of the existing Log Analytics Workspace where you want to create a Data Collection Endpoint (DCE) and Data Collection Rule (DCR) for custom tables.
  • BloodHound Tenant Domain: The URL for the BloodHound Enterprise tenant domain.
  • BloodHound Token ID Secret Value: The value for the BloodHound token ID. This value is stored in an Azure Key Vault secret.
  • BloodHound Token Key Secret Value: The value for the BloodHound token key. This value is stored in an Azure Key Vault secret.
  • Microsoft Entra ID Application App ID: The unique identifier for the Microsoft Entra ID application. This ID, also known as the Client ID, is used to authenticate your application to the Microsoft identity platform.
  • Microsoft Entra ID Application App Secret: A confidential secret generated for your Microsoft Entra ID application. This secret, also known as the Client Secret, is used along with the App ID to prove the application's identity when requesting an access token.
  • Lookup Days: The number of days in the past for which the system should fetch data. A higher value retrieves more historical data, which increases the time and compute resources required on the first run. This parameter sets the default lookback period when no previous timestamp is available.
  • Selected BloodHound Environments: The BloodHound environments from which to fetch data, as comma-separated values (for example, Ghost.Corp, Phantom.Corp). The default value is All.
  • Selected Finding Types: The Finding Types from which to fetch data, as comma-separated values (for example, T0MarkSensitive, T0GenericAll). The default value is All.
  1. Select the target subscription and resource group, then enter the remaining deployment parameters.
  2. Click Review + create.
  3. Click Create to deploy the data connector resources.
Step 9: Deploy the data connector code

The ARM template will deploy the necessary Azure resources for the data connector, but you will also need to deploy the Azure Function code that fetches data from BloodHound Enterprise and ingests it into Microsoft Sentinel.
  1. Download the BloodHoundAzureFunction.zip archive from the GitHub repository.
  2. Open your Function App in Azure Portal.
  3. In the left menu, select Deployment Center under the Deployment section. You will see multiple options for deployment, including:
    • GitHub: Connect your GitHub repository for continuous deployment.
    • Azure Repos: Connect your Azure DevOps repository for continuous deployment.
    • Publish files: Manually upload your function code for one-time deployment.
  4. Select Publish files, select the downloaded BloodHoundAzureFunction.zip archive, and click Save.
    Publish files
    After deployment, you should see the function code in the Functions section of your Function App.
Step 10: Verify the function deployment

After deploying the Azure Function code, manually run each function to verify that it can fetch data from BloodHound Enterprise and push it into the custom tables in your Log Analytics Workspace.
  1. Navigate to the Overview page of your Function App and select one of the deployed functions.
    Test/Run
  2. Click the Code + Test tab.
  3. Click Test/Run.
  4. Click Run to execute the function.
    Run function
  5. Monitor the execution logs to confirm that the function is running successfully and fetching data from BloodHound Enterprise. You should see log entries indicating successful execution and data retrieval for each finding type.
    2025-10-31T08:57:55Z [Information] Collecting asset details for finding type: AzureT0MGGrantAppRoles
    2025-10-31T08:57:55Z [Information] Making GET request to https://<your-tenant>.bloodhoundenterprise.io/api/v2/assets/findings/AzureT0MGGrantAppRoles/title.md
    2025-10-31T08:57:55Z [Information] Response status code: 200
    2025-10-31T08:57:56Z [Information] Making GET request to https://<your-tenant>.bloodhoundenterprise.io/api/v2/assets/findings/AzureT0MGGrantAppRoles/short_description.md
    2025-10-31T08:57:56Z [Information] Response status code: 200
    2025-10-31T08:57:56Z [Information] Making GET request to https://<your-tenant>.bloodhoundenterprise.io/api/v2/assets/findings/AzureT0MGGrantAppRoles/short_remediation.md
    2025-10-31T08:57:57Z [Information] Response status code: 200
    2025-10-31T08:57:57Z [Information] Making GET request to https://<your-tenant>.bloodhoundenterprise.io/api/v2/assets/findings/AzureT0MGGrantAppRoles/long_remediation.md
    2025-10-31T08:57:57Z [Information] Response status code: 200
    
  6. Repeat this one-time manual process for each deployed function to ensure all functions are working correctly.

Validate the integration

Complete the following verification steps before putting the integration into operational use.
Step 1: Verify connector resources

  1. In Azure Portal, open Function App and confirm your deployed app exists and all BloodHound functions are listed.
    Verify Function App
  2. Open Key Vault and confirm the connector vault exists and includes the expected secrets.
    Verify Key Vault
  3. Open Data Collection Endpoints and confirm the BloodHound endpoint exists.
    Verify Data Collection Endpoints
  4. Open Data Collection Rules and confirm the BloodHound rules exist.
    Verify Data Collection Rules
  5. Open your Log Analytics workspace and confirm these custom tables exist:
    • BHEAttackPathsData_CL
    • BHEAttackPathsTimelineData_CL
    • BHEAuditLogsData_CL
    • BHEFindingTrendsData_CL
    • BHEPostureHistoryData_CL
    • BHETierZeroAssetsData_CL
    Verify Custom Tables
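
To spot-check ingestion across all six custom tables at once, you can run a single KQL union query in the workspace's Logs blade. The sketch below assembles that query string in Python (the table names come from the list above; the helper itself is illustrative):

```python
# The six custom tables created by the data connector deployment.
CUSTOM_TABLES = [
    "BHEAttackPathsData_CL",
    "BHEAttackPathsTimelineData_CL",
    "BHEAuditLogsData_CL",
    "BHEFindingTrendsData_CL",
    "BHEPostureHistoryData_CL",
    "BHETierZeroAssetsData_CL",
]


def ingestion_check_query(window: str = "1d") -> str:
    """Build a KQL query that counts recent rows per BloodHound custom table."""
    tables = ", ".join(CUSTOM_TABLES)
    return (
        f"union withsource=TableName {tables}\n"
        f"| where TimeGenerated > ago({window})\n"
        "| summarize Rows = count() by TableName"
    )
```

Paste the resulting query into Logs; any table missing from the results has not received data in the chosen window.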
Step 2: Verify data connector ingestion

Complete these steps to start your Azure Function App and begin ingesting BloodHound Enterprise data into custom tables.
  1. Open your Function App and start it from Overview if it is stopped.
  2. Open your Log Analytics workspace and click Logs.
  3. Verify that you can see custom logs.
    Verify Custom Logs
Step 3: Activate dashboards and analytics rules

  1. In Microsoft Sentinel, go to Workbooks. You must save each workbook before editing or operational use.
    Activate Dashboards
  2. To save each workbook, double-click the workbook, then click Save in the modal that displays.
    Save Workbook
  3. Open each workbook to confirm it loads data correctly. If you see errors, review the function execution logs and ensure the data connector is ingesting data into the custom tables.
    Verify Workbook Data
Step 4: Verify analytics rules

  1. In the Sentinel workspace, navigate to Configuration > Analytics > Rule templates.
    Verify Analytics Rules
  2. To generate incidents, create and save each Analytics rule. Select any rule to open the right-side panel, then click Create rule.
    Save Analytics Rule
  3. Click Next: Set rule logic and keep the default values.
    Set Rule Logic
  4. Click Next: Incident settings and keep the default values.
    Incident Settings
  5. Click Next: Automated response and keep the default values.
    Automated Response
  6. Click Next: Review + Create and keep the default values.
    Review + Create
  7. Click Save.
    Save Analytics Rule
  8. Repeat this process for each Analytics rule. Incidents are generated only after the rules are created and saved.
  9. To check incidents after rules are created, navigate to Investigation & response > Incidents & alerts > Incidents.
    Incidents & Alerts

Next steps

Explore the pre-built workbooks to visualize BloodHound Enterprise data and use the analytics rules to generate incidents for findings.