

Sends events via the Microsoft Azure Logs Ingestion API.


azure-log-analytics --tenant-id <tenant-id> --client-id <client-id>
                    --client-secret <client-secret>
                    --dce <data-collection-endpoint>
                    --dcr <data-collection-rule-id>
                    --table <table-name>
                    [--batch-size <batch-size>]


The azure-log-analytics operator uploads events to supported standard tables or to custom tables in Microsoft Azure.

The operator retrieves the required access token itself and refreshes it automatically when it expires.
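Conceptually, the token retrieval follows Microsoft's OAuth2 client-credentials flow. The sketch below is illustrative only (not the operator's actual implementation); it constructs the token request that such a flow sends, using Microsoft's documented token endpoint and the Azure Monitor scope:

```python
def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Return the token endpoint URL and form body for the
    OAuth2 client-credentials grant against Microsoft Entra ID."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # The Logs Ingestion API is part of Azure Monitor, hence this scope.
        "scope": "https://monitor.azure.com/.default",
    }
    return url, body
```

POSTing this body as a form to the returned URL yields a JSON response containing an `access_token`, which is then sent as a bearer token on each upload request until it expires.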

--tenant-id <tenant-id>

The Microsoft Directory (tenant) ID, written as xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx.

--client-id <client-id>

The Microsoft Application (client) ID, written as xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx.

--client-secret <client-secret>

The client secret.

--dce <data-collection-endpoint>

The data collection endpoint URL.

--dcr <data-collection-rule-id>

The data collection rule ID, written as dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.

--table <table-name>

The table to upload events to.

--batch-size <batch-size>

The number of events to include in each upload request. The Azure Logs Ingestion API accepts at most 500MB per minute, so batching events sensibly is a necessity.

Defaults to 8192.
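The batching behavior amounts to splitting the event stream into chunks of at most the configured size, one chunk per upload request. A minimal sketch, assuming events arrive as a list (the function name is illustrative, not part of the operator):

```python
def batches(events, batch_size=8192):
    """Yield successive chunks of at most batch_size events,
    one chunk per upload request."""
    for i in range(0, len(events), batch_size):
        yield events[i : i + batch_size]
```

A smaller batch size trades throughput for smaller, more frequent requests; the default of 8192 keeps requests comfortably under the API's per-minute ingestion limit for typical event sizes.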


Upload custom.mydata events to the table Custom-MyData:

| where #schema == "custom.mydata"
| azure-log-analytics --tenant-id 00a00a00-0a00-0a00-00aa-000aa0a0a000 \
--client-id 000a00a0-0aa0-00a0-0000-00a000a000a0 \
--client-secret xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx \
--dce <data-collection-endpoint> \
--dcr dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx \
--table "Custom-MyData"