Dear all,
I am new to InfluxDB and Telegraf, and we rely almost entirely on MS Azure PaaS services.
For some time now, Microsoft Azure Event Hubs has offered Kafka protocol support (1.0 and later): Data streaming with Event Hubs using the Kafka protocol.
To receive events and forward them to InfluxDB, I configured the Telegraf [[inputs.kafka_consumer]] plugin as follows.
[[inputs.kafka_consumer]]
  ## kafka servers
  brokers = ["<namespace>.servicebus.windows.net:9093"]

  ## topic(s) to consume
  topics = ["<Event Hub>"]

  ## Add topic as tag if topic_tag is not empty
  # topic_tag = ""

  ## Optional Client id
  client_id = "AzEvHub"

  ## Set the minimal supported Kafka version. Setting this enables the use of new
  ## Kafka features and APIs. Of particular interest, lz4 compression
  ## requires at least version 0.10.0.0.
  ## ex: version = "1.1.0"
  version = "1.1.0"

  ## Optional TLS Config
  # tls_ca = "/etc/telegraf/ca.pem"
  # tls_cert = "/etc/telegraf/cert.pem"
  # tls_key = "/etc/telegraf/key.pem"
  ## Use TLS but skip chain & host verification
  # insecure_skip_verify = false

  ## Optional SASL Config
  sasl_username = "<username>"
  sasl_password = "<password>"

  ## the name of the consumer group
  consumer_group = "telegraf"

  ## Offset (must be either "oldest" or "newest")
  offset = "oldest"

  ## Maximum length of a message to consume, in bytes (default 0/unlimited);
  ## larger messages are dropped
  max_message_len = 1000000

  ## Maximum messages to read from the broker that have not been written by an
  ## output. For best throughput set based on the number of metrics within
  ## each message and the size of the output's metric_batch_size.
  ##
  ## For example, if each message from the queue contains 10 metrics and the
  ## output metric_batch_size is 1000, setting this to 100 will ensure that a
  ## full batch is collected and the write is triggered immediately without
  ## waiting until the next flush_interval.
  # max_undelivered_messages = 1000

  ## Data format to consume.
  ## Each data format has its own unique set of configuration options, read
  ## more about them here:
  ## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_INPUT.md
  data_format = "influx"
Whatever I try for brokers, with or without the port, I always receive this error:
Service for input inputs.kafka_consumer failed to start: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
Can you try with Telegraf 1.12.0-rc1? This plugin has been updated to use a newer version of the Sarama Kafka library. If you still run into problems, run Telegraf with the --debug flag for additional logging.
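For Event Hubs specifically, two things are easy to miss: its Kafka endpoint only accepts TLS connections on port 9093, and it authenticates with SASL PLAIN where the username is the literal string $ConnectionString and the password is the namespace connection string. Trying a plaintext connection against a TLS-only endpoint is often exactly what produces the "run out of available brokers" error. Below is a minimal sketch of the Event-Hubs-specific settings, assuming placeholder values for the namespace, hub, policy, and shared access key that you would replace with your own; enable_tls is available in newer builds of the plugin, while on older ones pointing tls_ca at the system CA bundle has the same effect.

[[inputs.kafka_consumer]]
  ## Event Hubs' Kafka endpoint listens on port 9093 only
  brokers = ["<namespace>.servicebus.windows.net:9093"]
  topics = ["<Event Hub>"]

  ## Event Hubs requires TLS. Newer plugin versions expose enable_tls;
  ## on older ones, set tls_ca to the system CA bundle instead
  ## (the path below is typical for Debian/Ubuntu and is only an example).
  enable_tls = true
  # tls_ca = "/etc/ssl/certs/ca-certificates.crt"

  ## SASL PLAIN: the username is the literal string "$ConnectionString",
  ## the password is the namespace (or hub-level) connection string
  sasl_username = "$ConnectionString"
  sasl_password = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"

  consumer_group = "telegraf"
  offset = "oldest"
  data_format = "influx"

Running Telegraf with --debug, as suggested above, should then make it easier to see whether the TLS/SASL handshake succeeds.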
The version is not selectable currently; however, it looks like the default is V0 anyway. Could you open a new issue on the Telegraf GitHub page for the errors connecting to Event Hubs?