Tenzir comes with a wide range of built-in pipeline operators.
Shows the least common values.
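rare dest_port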
Reverses the event order.
Sorts events by the given expressions.
sort name, -abs(transaction)
Groups events and applies aggregate functions to each group.
summarize name, sum(amount)
Shows the most common values.
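top src_ip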
Creates a Bloom filter context.
context::create_bloom_filter "ctx", capacity=1Mi, fp_probability=0.01
Creates a GeoIP context.
context::create_geoip "ctx", db_path="GeoLite2-City.mmdb"
Creates a lookup table context.
context::create_lookup_table "ctx"
Enriches events with data from a context.
context::enrich "ctx", key=x
Removes entries from a context.
context::erase "ctx", key=x
Updates a context with new data.
context::update "ctx", key=x, value=y
Compresses a stream of bytes.
Compresses a stream of bytes using Brotli compression.
compress_brotli, level=10
Compresses a stream of bytes using bz2 compression.
Compresses a stream of bytes using gzip compression.
Compresses a stream of bytes using lz4 compression.
Compresses a stream of bytes using zstd compression.
Decompresses a stream of bytes.
Decompresses a stream of bytes in the Brotli format.
Decompresses a stream of bytes in the Bzip2 format.
Decompresses a stream of bytes in the Gzip format.
Decompresses a stream of bytes in the Lz4 format.
Decompresses a stream of bytes in the Zstd format.
Drops events that violate the given invariant and emits a warning.
assert name.starts_with("John")
Emits a warning if the pipeline does not have the expected throughput.
assert_throughput 1000, within=1s
Removes duplicate events based on a common key.
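deduplicate src_ip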
Limits the input to the first n events.
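head 20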
Dynamically samples events from an event stream.
sample 30s, max_samples=2k
Keeps a range of events within the interval [begin, end), stepping by stride.
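slice begin=10, end=20, stride=2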
Limits the input to the last n events.
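tail 20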
Limits the input to n events per unique schema.
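taste 5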
Keeps only events for which the given predicate is true.
where name.starts_with("John")
Runs a pipeline periodically according to a cron expression.
cron "* */10 * * * MON-FRI" { from "https://example.org" }
Delays events relative to a given start time, with an optional speedup.
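delay ts, speed=10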
Discards all incoming events.
Runs a pipeline periodically at a fixed interval.
every 10s { summarize sum(amount) }
Executes a subpipeline with a copy of the input.
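fork { publish "copy" }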
Routes the data to one of multiple subpipelines.
load_balance $over { publish $over }
Does nothing with the input.
Repeats the input a number of times.
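repeat 100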
Limits the bandwidth of a pipeline.
throttle 100M, within=1min
Uses Tenzir's REST API directly from a pipeline.
Controls the batch size of events.
An in-memory buffer to improve handling of data spikes in upstream operators.
buffer 10M, policy="drop"
An in-memory cache shared between pipelines.
cache "w01wyhTZm3", ttl=10min
Provides a compatibility fallback to TQL1 pipelines.
Forces a pipeline to run locally.
Replaces the input with metrics describing the input.
Forces a pipeline to run remotely at a node.
Makes events available under the /serve REST API endpoint.
Treats all warnings as errors.
Removes ordering assumptions from a pipeline.
unordered { read_ndjson }
Removes fields from the event.
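drop name, metadata.id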
Adds a field with the number of preceding events.
Sends HTTP/1.1 requests and forwards the response.
Moves values from one field to another, removing the original field.
move id=parsed_id, ctx.message=incoming.status
Selects some values and discards the rest.
select name, id=metadata.id
Assigns a value to a field, creating it if necessary.
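set severity = "high"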
Adjusts timestamps relative to a given start time, with an optional speedup.
timeshift ts, start=2020-01-01
Returns a new event for each member of a list or a record in an event, duplicating the surrounding event.
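unroll answers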
Parses bytes in BITZ format.
Parses an incoming Common Event Format (CEF) stream into events.
Reads CSV (Comma-Separated Values) from a byte stream.
Parses an incoming byte stream into events using a string as the delimiter.
Parses an incoming byte stream into events using a regular expression as the delimiter.
read_delimited_regex r"\s+"
Parses an incoming Feather byte stream into events.
Parses an incoming GELF stream into events.
Parses lines of input with a grok pattern.
read_grok "%{IP:client} %{WORD:action}"
Parses an incoming JSON stream into events.
read_json arrays_of_objects=true
Reads key-value pairs from a byte stream.
read_kv r"(\s+)[A-Z_]+:", r":\s*"
Parses an incoming LEEF stream into events.
Parses an incoming byte stream into events.
Parses an incoming NDJSON (newline-delimited JSON) stream into events.
Reads events from a Parquet byte stream.
Reads raw network packets in PCAP file format.
Reads SSV (Space-Separated Values) from a byte stream.
read_ssv header="name count"
Parses an incoming Suricata EVE JSON stream into events.
Parses an incoming Syslog stream into events.
Reads TSV (Tab-Separated Values) from a byte stream.
read_tsv auto_expand=true
Reads XSV from a byte stream.
Parses an incoming YAML stream into events.
Parses an incoming Zeek JSON stream into events.
Parses an incoming Zeek TSV stream into events.
Writes events in BITZ format.
Transforms the input event stream to a CSV (Comma-Separated Values) byte stream.
Transforms the input event stream to a Feather byte stream.
Transforms the input event stream to a JSON byte stream.
Writes events in a key-value format.
Writes the values of an event as a line.
Transforms the input event stream to a Newline-Delimited JSON byte stream.
Transforms the input event stream to a Parquet byte stream.
Transforms the input event stream to a PCAP byte stream.
Transforms the input event stream to an SSV (Space-Separated Values) byte stream.
Transforms the input event stream to a TQL notation byte stream.
Transforms the input event stream to a TSV (Tab-Separated Values) byte stream.
Transforms the input event stream to an XSV byte stream.
Transforms the input event stream to a YAML byte stream.
Transforms the input event stream to a Zeek Tab-Separated Values byte stream.
Loads a byte stream via AMQP messages.
Loads bytes from Azure Blob Storage.
load_azure_blob_storage "abfs://container/file"
Loads the contents of the file at the given path as a byte stream.
load_file "/tmp/data.json"
Loads a byte stream via FTP.
load_ftp "ftp.example.org"
Loads bytes from a Google Cloud Storage object.
load_gcs "gs://bucket/object.json"
Subscribes to a Google Cloud Pub/Sub subscription and obtains bytes.
load_google_cloud_pubsub project_id="my-project"
Loads a byte stream via HTTP.
load_http "example.org", params={n: 5}
Loads a byte stream from an Apache Kafka topic.
load_kafka topic="example"
Loads bytes from a network interface card (NIC).
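load_nic "eth0"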
Loads from an Amazon S3 object.
load_s3 "s3://my-bucket/obj.csv"
Loads bytes from Amazon SQS queues.
Accepts bytes from standard input.
Loads bytes from a TCP or TLS connection.
load_tcp "0.0.0.0:8090" { read_json }
Loads bytes from a UDP socket.
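load_udp "0.0.0.0:514"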
Receives ZeroMQ messages.
Obtains events from a URI, inferring the source, compression and format.
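from "https://example.org/data.json"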
Reads one or multiple files from a filesystem.
from_file "s3://data/**.json"
Receives events via Fluent Bit.
from_fluent_bit "opentelemetry"
Sends and receives HTTP/1.1 requests.
Receives events via the OpenSearch Bulk API.
Submits VQL to a Velociraptor server and returns the response as events.
from_velociraptor subscribe="Windows"
Retrieves diagnostic events from a Tenzir node.
Retrieves metrics events from a Tenzir node.
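metrics "cpu"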
Shows the node's OpenAPI specification.
Shows all available plugins and built-ins.
Shows the current version.
Retrieves events from a Tenzir node.
Retrieves all fields stored at a node.
Imports events into a Tenzir node.
Retrieves metadata about events stored at a node.
partitions src_ip == 1.2.3.4
Retrieves all schemas for events stored at a node.
Saves a byte stream via AMQP messages.
Saves bytes to Azure Blob Storage.
save_azure_blob_storage "abfs://container/file"
Saves bytes through an SMTP server.
save_email "user@example.org"
Writes a byte stream to a file.
save_file "/tmp/out.json"
Saves a byte stream via FTP.
save_ftp "ftp.example.org"
Saves bytes to a Google Cloud Storage object.
save_gcs "gs://bucket/object.json"
Publishes to a Google Cloud Pub/Sub topic.
save_google_cloud_pubsub project_id="my-project"
Sends a byte stream via HTTP.
save_http "example.org/api"
Saves a byte stream to an Apache Kafka topic.
save_kafka topic="example"
Saves bytes to an Amazon S3 object.
save_s3 "s3://my-bucket/obj.csv"
Saves bytes to Amazon SQS queues.
Writes a byte stream to standard output.
Saves bytes to a TCP or TLS connection.
save_tcp "0.0.0.0:8090", tls=true
Saves bytes to a UDP socket.
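save_udp "127.0.0.1:514"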
Sends bytes as ZeroMQ messages.
Saves to a URI, inferring the destination, compression and format.
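to "s3://my-bucket/out.json"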
Sends events to Amazon Security Lake (ASL).
Sends events to the Microsoft Azure Logs Ingestion API.
to_azure_log_analytics tenant_id="...", workspace_id="..."
Sends events to a ClickHouse table.
to_clickhouse table="my_table"
Sends events via Fluent Bit.
to_fluent_bit "elasticsearch" …
Sends events to Google Cloud Logging.
to_google_cloud_logging …
Sends unstructured events to a Google SecOps Chronicle instance.
Writes events to a URI using hive partitioning.
to_hive "s3://…", partition_by=[x]
Sends events to an OpenSearch-compatible Bulk API.
to_opensearch "localhost:9200", …
Sends events to a Snowflake database.
to_snowflake account_identifier="…
Sends events to a Splunk HTTP Event Collector (HEC).
to_splunk "localhost:8088", …