Fluentd and Redis Integration
Powerful performance with an easy integration, powered by Telegraf, the open source data connector built by InfluxData.
5B+ Telegraf downloads
#1 time series database (source: DB Engines)
1B+ downloads of InfluxDB
2,800+ contributors
Powerful Performance, Limitless Scale
Collect, organize, and act on massive volumes of high-velocity data. Any data is more valuable when you think of it as time series data with InfluxDB, the #1 time series platform built to scale with Telegraf.
See Ways to Get Started
Input and output integration overview
The Fluentd Input Plugin gathers metrics from Fluentd’s in_monitor plugin endpoint. It provides insights into various plugin metrics while allowing for custom configurations to reduce series cardinality.
The Redis Time Series output plugin is designed to publish metrics efficiently to a RedisTimeSeries server.
Integration details
Fluentd
This plugin gathers metrics from the Fluentd plugin endpoint provided by the in_monitor plugin. It reads data from the /api/plugins.json resource and allows exclusion of specific plugins based on their type.
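The endpoint above only exists if Fluentd itself exposes its monitoring API. A minimal sketch of the Fluentd side, assuming the default port 24220 and that binding to all interfaces is acceptable:

<source>
  @type monitor_agent
  bind 0.0.0.0
  port 24220
</source>

With this source in place, Telegraf can poll http://localhost:24220/api/plugins.json on the same host.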
Redis
The RedisTimeSeries output plugin writes metrics to a Redis server running the RedisTimeSeries module.
Configuration
Fluentd
[[inputs.fluentd]]
  ## This plugin reads information exposed by fluentd (using /api/plugins.json endpoint).
  ##
  ## Endpoint:
  ## - only one URI is allowed
  ## - https is not supported
  endpoint = "http://localhost:24220/api/plugins.json"

  ## Define which plugins have to be excluded (based on "type" field - e.g. monitor_agent)
  exclude = [
    "monitor_agent",
    "dummy",
  ]
Redis
[[outputs.redistimeseries]]
  ## The address of the RedisTimeSeries server.
  address = "127.0.0.1:6379"

  ## Redis ACL credentials
  # username = ""
  # password = ""
  # database = 0

  ## Timeout for operations such as ping or sending metrics
  # timeout = "10s"

  ## Enable attempt to convert string fields to numeric values
  ## If "false" or in case the string value cannot be converted the string
  ## field will be dropped.
  # convert_string_fields = true

  ## Optional TLS Config
  # tls_ca = "/etc/telegraf/ca.pem"
  # tls_cert = "/etc/telegraf/cert.pem"
  # tls_key = "/etc/telegraf/key.pem"
  # insecure_skip_verify = false
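Both plugins can run in the same Telegraf agent, so Fluentd metrics flow straight into RedisTimeSeries. A minimal end-to-end sketch, assuming Fluentd and Redis run locally on their default ports (adjust the addresses for your environment):

[agent]
  ## Collect and flush every 10 seconds
  interval = "10s"
  flush_interval = "10s"

[[inputs.fluentd]]
  ## Poll Fluentd's monitoring endpoint
  endpoint = "http://localhost:24220/api/plugins.json"
  exclude = ["monitor_agent", "dummy"]

[[outputs.redistimeseries]]
  ## Write the collected series to the local Redis instance
  address = "127.0.0.1:6379"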
Input and output integration examples
Fluentd
- Basic Configuration: Set up the Fluentd Input Plugin to gather metrics from your Fluentd instance’s monitoring endpoint, ensuring you are able to track performance and usage statistics.
- Excluding Plugins: Use the exclude option to ignore specific plugins’ metrics that are not necessary for your monitoring needs, streamlining data collection and focusing on what matters.
- Custom Plugin ID: Implement the @id parameter in your Fluentd configuration to maintain a consistent plugin_id, which helps avoid issues with high series cardinality during frequent restarts (see the sketch after this list).
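As a sketch of the custom plugin ID idea: the hypothetical in_forward_main identifier below pins plugin_id to a stable value; without an explicit @id, Fluentd generates an object ID that changes on every restart and inflates series cardinality.

<source>
  @type forward
  # hypothetical id; any stable, unique string works
  @id in_forward_main
  port 24224
</source>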
Redis
- Metrics Storage: Utilize the Redis output plugin to store time-series metrics collected from various sources directly into a Redis database for quick retrieval and analysis.
- Dynamic Configuration: Adjust the address and other settings dynamically to publish metrics to different Redis instances based on the deployment environment (see the sketch after this list).
- String Field Conversion: Leverage the convert_string_fields option to automatically convert string metrics to numeric formats, ensuring that data is stored in the desired type for analytics.
Feedback
Thank you for being part of our community! If you have any general feedback or found any bugs on these pages, we welcome and encourage your input. Please submit your feedback in the InfluxDB community Slack.
Related Integrations
HTTP and InfluxDB Integration
The HTTP plugin collects metrics from one or more HTTP(S) endpoints. It supports various authentication methods and configuration options for data formats.
View Integration
Kafka and InfluxDB Integration
This plugin reads messages from Kafka and allows the creation of metrics based on those messages. It supports various configurations including different Kafka settings and message processing options.
View Integration
Kinesis and InfluxDB Integration
The Kinesis plugin allows for reading metrics from AWS Kinesis streams. It supports multiple input data formats and offers checkpointing features with DynamoDB for reliable message processing.
View Integration