I have the following scenario: IoT devices send data to an Azure IoT Hub, the Telegraf agent collects the data and sends it to a bucket in InfluxDB Cloud 2.
The problem I face is that Telegraf seems to connect to the Azure IoT Hub and collect the data (I see spikes of outgoing data in the monitoring), but it does not send anything to InfluxDB Cloud.
[[outputs.influxdb_v2]]
## The URLs of the InfluxDB cluster nodes.
##
## Multiple URLs can be specified for a single cluster, only ONE of the
## urls will be written to each interval.
urls = ["https://westeurope-1.azure.cloud2.influxdata.com"]
## Token for authentication.
token = "XXXXItsASecret"
## Organization is the name of the organization you wish to write to; must exist.
organization = "XXXX@XXXX.com"
## Destination bucket to write into.
bucket = "paul"
[[inputs.eventhub_consumer]]
## The default behavior is to create a new Event Hub client from environment variables.
## This requires one of the following sets of environment variables to be set:
##
## 1) Expected Environment Variables:
#EVENTHUB_NAMESPACE = ""
#EVENTHUB_NAME = ""
#EVENTHUB_CONNECTION_STRING = ""
##
## 2) Expected Environment Variables:
## - "EVENTHUB_NAMESPACE"
## - "EVENTHUB_NAME"
## - "EVENTHUB_KEY_NAME"
## - "EVENTHUB_KEY_VALUE"
## Uncommenting the option below will create an Event Hub client based solely on the connection string.
## This can either be the associated environment variable or hard coded directly.
connection_string = "Endpoint=sb://iothub-ns-iotpaulshu-6039161-c0b5109dc8.servicebus.windows.net/;SharedAccessKeyName=iothubowner;SharedAccessKey=XXXItsASecret;EntityPath=iotpaulshub"
## Set persistence directory to a valid folder to use a file persister instead of an in-memory persister
# persistence_dir = ""
## Change the default consumer group
# consumer_group = "telegraf"
## By default the event hub receives all messages present on the broker, alternative modes can be set below.
## The 3 options below only apply if no valid persister is read from memory or file (e.g. first run).
# from_timestamp =
# latest = true
## Set a custom prefetch count for the receiver(s)
# prefetch_count = 1000
## Add an epoch to the receiver(s)
# epoch = 0
## Change to set a custom user agent, "telegraf" is used by default
# user_agent = "telegraf"
## To consume from a specific partition, set the partition_ids option.
## An empty array will result in receiving from all partitions.
# partition_ids = ["0","1"]
## Max undelivered messages
# max_undelivered_messages = 1000
## Set either option below to true to use a system property as timestamp.
## You have the choice between EnqueuedTime and IoTHubEnqueuedTime.
## It is recommended to use this setting when the data itself has no timestamp.
# enqueued_time_as_ts = true
# iot_hub_enqueued_time_as_ts = true
## Tags or fields to create from keys present in the application property bag.
## These could for example be set by message enrichments in Azure IoT Hub.
# application_property_tags = []
# application_property_fields = []
## Tag or field name to use for metadata
## By default all metadata is disabled
# sequence_number_field = "SequenceNumber"
# enqueued_time_field = "EnqueuedTime"
# offset_field = "Offset"
# partition_id_tag = "PartitionID"
# partition_key_tag = "PartitionKey"
# iot_hub_device_connection_id_tag = "IoTHubDeviceConnectionID"
# iot_hub_auth_generation_id_tag = "IoTHubAuthGenerationID"
# iot_hub_connection_auth_method_tag = "IoTHubConnectionAuthMethod"
# iot_hub_connection_module_id_tag = "IoTHubConnectionModuleID"
# iot_hub_enqueued_time_field = "IoTHubEnqueuedTime"
## Data format to consume.
## Each data format has its own unique set of configuration options, read
## more about them here:
# data_format = "influx"
data_format = "json"
Hello @goodvirus,
To debug Telegraf I recommend using ./telegraf --config telegraf.conf --test
or writing the data to stdout:
[[outputs.file]]
## Files to write to, "stdout" is a specially handled file.
files = ["stdout"]
## Data format to output.
## Each data format has its own unique set of configuration options, read
## more about them here:
## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_OUTPUT.md
data_format = "influx"
to check whether your line protocol is correct and then go from there. Can you share your lines?
Dez 18 08:38:54 goodvirus telegraf[1081]: 2020-12-18T07:38:54Z W! [outputs.influxdb] Metric buffer overflow; 44 metrics have been dropped
Dez 18 08:38:54 goodvirus telegraf[1081]: 2020-12-18T07:38:54Z E! [outputs.influxdb] When writing to [localhost:8086]: Post "localhost:8086/write?db=telegraf": dial tcp 127.0.0.1:8086: connect: connection refused
Dez 18 08:38:54 goodvirus telegraf[1081]: 2020-12-18T07:38:54Z E! [agent] Error writing to outputs.influxdb: could not write any address
So it seems that the agent wants to write to localhost and is ignoring the URL specified in my [[outputs.influxdb_v2]] section: urls = ["https://westeurope-1.azure.cloud2.influxdata.com"]
I also tried running the following command, with the same result: telegraf --config telegraf.config https://westeurope-1.azure.cloud2.influxdata.com/api/v2/telegrafs/06be34c13cff6000
thank you very much for your help!
Funny thing is, when I take the same output configuration and another input (for example CPU usage), it works, so it has to be something with the Event Hub plugin. For example data I'm using the Raspberry Pi Azure IoT Web Simulator, as specified in the example for the plugin.
Just an update: I reinstalled Telegraf and redid the configuration. Now I don't get the error message about localhost, but I still don't get any data sent.
I don’t think I was very clear, can you please include
[[outputs.file]]
## Files to write to, "stdout" is a specially handled file.
files = ["stdout"]
## Data format to output.
## Each data format has its own unique set of configuration options, read
## more about them here:
## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_OUTPUT.md
data_format = "influx"
in your configuration, so that we can see what the line protocol looks like before you send it to Influx? I expect to see your line protocol in stdout.
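For reference, if the JSON messages were being parsed successfully, each one would appear on stdout as a line of InfluxDB line protocol, roughly of this shape (the field names and values here are hypothetical, based on the Raspberry Pi simulator payload):

```
eventhub_consumer,host=goodvirus temperature=27.2,humidity=62.3 1608277134000000000
```

Seeing nothing at all on stdout while the input reports activity usually means the parser is discarding the messages before they reach any output.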
You were clear, and I did what you asked for, but nothing is showing (with journalctl -f; I also tried Telegraf on Windows, nothing in the logs).
But I can see in Azure that the messages are going out.
@Anaisdg: No, same problem as before, and I'm out of ideas…
I see outgoing traffic, and Telegraf says it's connected to the Event Hub and InfluxDB Cloud, but it seems to collect the messages without processing them.
Would it help if I message you the configuration with the secret keys so you could try it out yourself?
I fixed the issue. The problem was that the input from the Event Hub was JSON and Telegraf couldn't parse it, so it threw it away without a message. The solution is in the GitHub issue.
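For anyone landing here with the same symptom: Telegraf's JSON parser extracts only numeric values by default and silently discards anything else, so string fields vanish without an error in the logs. A minimal sketch of the parser options that typically resolve this, assuming a payload like the Raspberry Pi simulator's (deviceId and messageId are example keys; adjust them to your actual messages):

```toml
[[inputs.eventhub_consumer]]
  connection_string = "Endpoint=sb://...;EntityPath=..."
  data_format = "json"
  ## Keep string values, which the JSON parser drops by default
  json_string_fields = ["deviceId"]
  ## Promote selected keys to tags instead of discarding them
  tag_keys = ["messageId"]
```

With at least one numeric field (or an explicitly listed string field) per message, the metrics survive parsing and show up in the stdout output and in InfluxDB.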