Azure Data Explorer Telegraf Output Plugin
Powerful performance with an easy integration, powered by Telegraf, the open source data connector built by InfluxData.
Powerful Performance, Limitless Scale
Collect, organize, and act on massive volumes of high-velocity data. Any data is more valuable when you think of it as time series data with InfluxDB, the #1 time series platform built to scale with Telegraf.
See Ways to Get Started
Azure Data Explorer is a fully managed data analytics service built for large volumes of streaming data. It scales automatically, works well with time series data, and includes built-in tools to analyze data and surface trends and anomalies.
Why use a Telegraf plugin for Azure Data Explorer?
The Azure Data Explorer Telegraf Output Plugin sends data collected by any Telegraf input plugin to Azure Data Explorer, making it simple to land your data in an existing Azure Data Explorer database. The plugin also gives you control over how metrics are grouped into tables before ingestion, so you can use the database more efficiently.
How to send data to Azure Data Explorer using the Telegraf plugin
To use this plugin, you need to create an Azure Data Explorer cluster and database and set endpoint_url to the URI property of the Azure Data Explorer resource on Azure. Then you need to set database to a previously created Azure Data Explorer database and timeout to the operations timeout you choose.
There are two choices for grouping metrics, TablePerMetric and SingleTable. If you choose to group all of your metrics into a single table, you also need to set table_name.
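Putting those settings together, a minimal Telegraf configuration for this output might look like the sketch below. The endpoint_url and database values are placeholders, and metrics_grouping_type is assumed here to be the name of the option that selects between TablePerMetric and SingleTable; check the plugin's sample configuration for the exact option names.
[[outputs.azure_data_explorer]]
  ## URI property of your Azure Data Explorer cluster (placeholder value)
  endpoint_url = "https://mycluster.westeurope.kusto.windows.net"
  ## Name of an existing Azure Data Explorer database (placeholder value)
  database = "telegrafdb"
  ## Timeout for ingestion operations
  timeout = "20s"
  ## Grouping strategy: "TablePerMetric" or "SingleTable" (option name assumed)
  metrics_grouping_type = "SingleTable"
  ## Only required when grouping all metrics into a single table
  table_name = "telegraf-metrics"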
Azure Data Explorer Tables Schema
The schema of the Azure Data Explorer table will match the structure of the Telegraf Metric object. Here’s an example Azure Data Explorer command generated by the plugin:
.create-merge table ['table-name'] (['fields']:dynamic, ['name']:string, ['tags']:dynamic, ['timestamp']:datetime)
The corresponding table mapping would look like the following:
.create-or-alter table ['table-name'] ingestion json mapping 'table-name_mapping' '[{"column":"fields", "Properties":{"Path":"$[\'fields\']"}},{"column":"name", "Properties":{"Path":"$[\'name\']"}},{"column":"tags", "Properties":{"Path":"$[\'tags\']"}},{"column":"timestamp", "Properties":{"Path":"$[\'timestamp\']"}}]'
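With the table and mapping in place, the dynamic fields and tags columns can be unpacked at query time. The Kusto query below is a sketch under assumptions: 'table-name' matches the example above, and cpu / usage_idle are hypothetical metric and field names standing in for whatever your input plugins actually emit.
['table-name']
| where timestamp > ago(1h)
// 'cpu' is a hypothetical Telegraf metric name
| where name == 'cpu'
// pull a single field out of the dynamic 'fields' column
| extend usage_idle = todouble(fields['usage_idle'])
| summarize avg(usage_idle) by bin(timestamp, 5m)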