Send security logs and events from Tenzir to Microsoft’s cloud, where you can analyze them with Microsoft Sentinel (SIEM), create alerts with Azure Monitor, or query them with KQL.
All logs in Azure land in a Log Analytics Workspace. Microsoft Sentinel and Azure Monitor read from this workspace; they don’t store data themselves.
To get data into a workspace, Azure uses two components:
- A Data Collection Endpoint (DCE) receives your data via HTTPS. This is the URL Tenzir sends events to.
- A Data Collection Rule (DCR) transforms incoming data and routes it to a specific table in your workspace.
This separation lets you send all data to one DCE while routing different streams to different tables, or even different cost tiers, by configuring multiple DCRs.
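Behind this routing model sits Azure's Logs Ingestion API: clients POST to a URL composed of the DCE endpoint, the DCR's immutable ID, and a stream name. As a hedged illustration (placeholder names, not Tenzir's implementation), the ingestion URL is assembled like this:

```python
# Sketch of how the Logs Ingestion API addresses a destination table.
# All IDs below are placeholders; the api-version reflects the GA version
# of the API at the time of writing.
def ingestion_url(dce: str, dcr: str, stream: str) -> str:
    """Combine DCE endpoint, DCR immutable ID, and stream name into the POST URL."""
    return f"{dce}/dataCollectionRules/{dcr}/streams/{stream}?api-version=2023-01-01"

url = ingestion_url(
    dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
    dcr="dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    stream="MyLogs_CL",
)
```

Sending a second stream to another table changes only the `dcr` and `stream` parts; the DCE stays the same.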
Storage Tiers
Tables in a Log Analytics Workspace can use different pricing plans:
| Plan | Use Case | Trade-offs |
|---|---|---|
| Analytics | Real-time alerting, dashboards, frequent queries | Higher cost per GB ingested |
| Auxiliary | Compliance, forensics, occasional hunting | ~90% cheaper, but no scheduled alerts |
The Sentinel Data Lake uses Auxiliary tables to store high-volume logs (like NetFlow or firewall allows) cost-effectively for up to 12 years.
Prerequisites
Before using this integration, set up the following in the Azure Portal:
- Log Analytics Workspace: Create a workspace if you don’t have one.
- Entra ID Application: Register an app to get your `tenant_id`, `client_id`, and `client_secret` for authentication.
- Data Collection Endpoint: Create a DCE in your region to get the ingestion URL.
- Custom Table: Create a table in your workspace to receive the data (e.g., `MyLogs_CL`).
- Data Collection Rule: Create a DCR that routes data from your DCE to your table.
- Permissions: Grant your Entra app the Monitoring Metrics Publisher role on the DCR.
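The Entra ID credentials from the steps above feed a standard OAuth2 client-credentials flow against Microsoft's token endpoint, scoped to Azure Monitor. The operator handles this for you; a minimal sketch of the token request it corresponds to (placeholder IDs throughout):

```python
import urllib.parse

# Client-credentials token request against Entra ID (sketch, placeholder IDs).
tenant_id = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

form = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "client_secret": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    # The scope that authorizes ingestion via Azure Monitor:
    "scope": "https://monitor.azure.com/.default",
})
# POSTing `form` to `token_url` returns a bearer token for requests to the DCE.
```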
Examples
Send Suricata Alerts as OCSF to Sentinel

Use `to_azure_log_analytics` to forward Suricata alerts as OCSF Detection Findings for correlation in Sentinel:

```tql
from_file "/var/log/suricata/eve.json", follow=true
where event_type == "alert"
suricata::ocsf::map
to_azure_log_analytics \
  tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_secret="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  stream="OCSF_DetectionFinding_CL"
```
Use the Sentinel Data Lake for High-Volume Logs

For high-volume logs like NetFlow, DNS queries, or firewall allows that you need for compliance or hunting but not real-time alerting, use the cost-optimized Auxiliary tier:

- Create a custom table (e.g., `NetFlow_CL`).
- In the Azure Portal, change the table’s plan from Analytics to Auxiliary.
- Create a DCR that routes to this table.
- Ship data with the same operator:

```tql
from_file "netflow.parquet"
to_azure_log_analytics \
  tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  client_secret="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  dce="https://my-dce.westeurope-1.ingest.monitor.azure.com",
  dcr="dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  stream="NetFlow_CL"
```

Auxiliary tables store data in Parquet format with retention of up to 12 years, making them ideal for historical forensics and ad-hoc KQL queries.
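For context on what travels over the wire: the Logs Ingestion API accepts a JSON array of records whose fields must match the DCR's declared input stream, and the body may be gzip-compressed. A hedged Python sketch with a hypothetical NetFlow-like record (field names are illustrative, not a required schema):

```python
import gzip
import json

# Hypothetical NetFlow-like records; field names are illustrative only.
records = [
    {"TimeGenerated": "2024-01-01T00:00:00Z", "SrcIp": "10.0.0.1", "Bytes": 512},
    {"TimeGenerated": "2024-01-01T00:00:05Z", "SrcIp": "10.0.0.2", "Bytes": 2048},
]

# The request body is a single JSON array, not newline-delimited JSON.
body = json.dumps(records).encode("utf-8")

# Compressing the body pays off at NetFlow volumes.
compressed = gzip.compress(body)
```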