
Send security logs and events from Tenzir to Microsoft’s cloud, where you can analyze them with Microsoft Sentinel (SIEM), create alerts with Azure Monitor, or query them with KQL.

[Diagram: Tenzir sends events to a Data Collection Endpoint (DCE); a Data Collection Rule (DCR) routes them into standard and custom tables of the Microsoft Sentinel / Log Analytics Workspace. The Analytics tier is hot/interactive, while the Data Lake tier with Auxiliary tables is cold/long-term.]

All logs in Azure land in a Log Analytics Workspace. Microsoft Sentinel and Azure Monitor read from this workspace; they don’t store data themselves.

To get data into a workspace, Azure uses two components:

  1. A Data Collection Endpoint (DCE) receives your data via HTTPS. This is the URL Tenzir sends events to.
  2. A Data Collection Rule (DCR) transforms incoming data and routes it to a specific table in your workspace.

This separation lets you send all data to one DCE while routing different streams to different tables, or even different cost tiers, by configuring multiple DCRs.
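For illustration, here is a minimal sketch of that pattern with placeholder credentials, DCR IDs, file paths, and table names: two separate pipelines point at the same DCE, but each references its own DCR and stream, so events land in different tables (and potentially different cost tiers).

// Pipeline A: alerts into an Analytics-tier table (all values are placeholders).
from_file "/var/log/alerts.json", follow=true
to_azure_log_analytics \
  tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_secret="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
  stream="Alerts_CL"

// Pipeline B: bulk flow logs into an Auxiliary-tier table via a second DCR,
// using the same DCE and credentials.
from_file "/var/log/flows.json", follow=true
to_azure_log_analytics \
  tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_secret="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb",
  stream="Flows_CL"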

Tables in a Log Analytics Workspace can use different pricing plans:

Plan      | Use Case                                          | Trade-offs
Analytics | Real-time alerting, dashboards, frequent queries  | Higher cost per GB ingested
Auxiliary | Compliance, forensics, occasional hunting         | ~90% cheaper, but no scheduled alerts

The Sentinel Data Lake uses Auxiliary tables to store high-volume logs (like NetFlow or firewall allows) cost-effectively for up to 12 years.

Before using this integration, set up the following in the Azure Portal:

  1. Log Analytics Workspace: Create a workspace if you don’t have one.
  2. Entra ID Application: Register an app to get your tenant_id, client_id, and client_secret for authentication.
  3. Data Collection Endpoint: Create a DCE in your region to get the ingestion URL.
  4. Custom Table: Create a table in your workspace to receive the data (e.g., MyLogs_CL).
  5. Data Collection Rule: Create a DCR that routes data from your DCE to your table.
  6. Permissions: Grant your Entra app the Monitoring Metrics Publisher role on the DCR.

Use to_azure_log_analytics to forward Suricata alerts as OCSF Detection Findings for correlation in Sentinel:

from_file "/var/log/suricata/eve.json", follow=true
where event_type == "alert"
suricata::ocsf::map
to_azure_log_analytics \
  tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_secret="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  stream="OCSF_DetectionFinding_CL"
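To avoid hard-coding credentials, recent Tenzir versions can resolve option values from the node's secret store. The following is a sketch under that assumption; the secret names are placeholders you would configure yourself:

// Same pipeline as above, but credentials come from the node's secret store.
// Assumes secrets named "azure-tenant-id", "azure-client-id", and
// "azure-client-secret" exist on your node; the names are placeholders.
from_file "/var/log/suricata/eve.json", follow=true
where event_type == "alert"
suricata::ocsf::map
to_azure_log_analytics \
  tenant_id=secret("azure-tenant-id"),
  client_id=secret("azure-client-id"),
  client_secret=secret("azure-client-secret"),
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  stream="OCSF_DetectionFinding_CL"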

Use the Sentinel Data Lake for High-Volume Logs

For high-volume logs like NetFlow, DNS queries, or firewall allows that you need for compliance or hunting but not real-time alerting, use the cost-optimized Auxiliary tier:

  1. Create a custom table (e.g., NetFlow_CL).
  2. In the Azure Portal, change the table’s plan from Analytics to Auxiliary.
  3. Create a DCR that routes to this table.
  4. Ship data with the same operator:
from_file "netflow.parquet"
to_azure_log_analytics \
  tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_secret="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  stream="NetFlow_CL"

Auxiliary tables store data in Parquet format with retention up to 12 years, making them ideal for historical forensics and ad-hoc KQL queries.
