Send data between local buckets using Telegraf

Hello!

Newbie to InfluxDB here.
I have a Raspberry Pi that collects accelerometer data via a Python program and sends it to a bucket in InfluxDB; let’s call that bucket “gyro_data”. This bucket fills up with a lot of data since the sample rate is around 200+ Hz. The data is eventually going to be sent to the cloud, but to reduce the volume I want to downsample it to the mean value every second before uploading it.
So I was thinking I could use a second local bucket which stores this downsampled data before it is sent to the cloud; let’s call that bucket “filtered_gyro_data”.

Now here is the problem: I cannot figure out how to send the data between these buckets.
I have set up Telegraf on my Raspberry Pi and configured the outputs section, and it is currently running fine. This is what the outputs section of my configuration looks like:

# Configuration for sending metrics to InfluxDB 2.0
[[outputs.influxdb_v2]]
  ## The URLs of the InfluxDB cluster nodes.
  ##
  ## Multiple URLs can be specified for a single cluster, only ONE of the
  ## urls will be written to each interval.
  ##   ex: urls = ["https://us-west-2-1.aws.cloud2.influxdata.com"]
  urls = ["http://<pi's ip here>:8086"]

  ## Token for authentication.
  token = "<valid token here>"

  ## Organization is the name of the organization you wish to write to.
  organization = "test"

  ## Destination bucket to write into.
  bucket = "filtered_gyro_data"

  ## The value of this tag will be used to determine the bucket.  If this
  ## tag is not set the 'bucket' option is used as the default.
etc.....

I was googling around and found out that I may need to configure an inputs section as well to actually read the data from the source bucket. But whenever I did that, Telegraf would not start because of some problem with the configuration file (I assume a syntax error).

Here is some other information, if needed: InfluxDB v2.6.1, Telegraf v1.26, and the Pi is running Debian 11. I am editing the configuration file directly on my Pi; I am not using the Telegraf configuration stored in the InfluxDB web interface.

This feels like an easy task to accomplish, but with my limited knowledge so far I just can’t get it to work. Does anyone have hints or suggestions on how I can solve this problem? I really appreciate any help! :blush:
Thanks in advance!

Hi,

Telegraf does not currently have an input plugin that can read data back out of an InfluxDB bucket.

Depending on your version of InfluxDB, if you have access to tasks, you might look at this post for an example of downsampling data and copying it to a new bucket.

Otherwise, you might consider having two influxdb outputs in your original config, one that writes the raw data and another that writes a downsampled copy (see the sketch below).
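
For what it's worth, here is a minimal sketch of that layout. It assumes the raw accelerometer metrics already flow through Telegraf, and it uses the basicstats aggregator plus a marker tag to tell the two outputs apart; the tag name "aggregated" and the placeholders are only illustrative, so double-check the aggregator and metric-filtering options against the docs for your Telegraf version.

# Compute a per-second mean of every numeric field. The raw metrics keep
# flowing because drop_original defaults to false.
[[aggregators.basicstats]]
  period = "1s"
  stats = ["mean"]
  ## Mark the aggregated metrics so the outputs can tell them apart.
  tags = { aggregated = "true" }

# Raw 200+ Hz data -> local "gyro_data" bucket.
[[outputs.influxdb_v2]]
  urls = ["http://<pi's ip here>:8086"]
  token = "<valid token here>"
  organization = "test"
  bucket = "gyro_data"
  ## Skip the aggregated metrics here.
  [outputs.influxdb_v2.tagdrop]
    aggregated = ["true"]

# One-second means -> local "filtered_gyro_data" bucket.
[[outputs.influxdb_v2]]
  urls = ["http://<pi's ip here>:8086"]
  token = "<valid token here>"
  organization = "test"
  bucket = "filtered_gyro_data"
  ## Only accept the aggregated metrics here.
  [outputs.influxdb_v2.tagpass]
    aggregated = ["true"]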

Another option is to use the client libraries to read the data out, do whatever manipulation of the data you want, and then write it back into a new bucket.

@tobhag there are a few ways to achieve what you want IMO:

  1. Use the basicstats aggregator to compute your downsampled data directly, and add a bucket tag (choose a name wisely) set to filtered_gyro_data. Then, in the influxdb output, use bucket_tag = "bucket" to send the aggregated data to the filtered_gyro_data bucket (see the sketch after this list).
  2. Do the downsampling as described in 1., but send the aggregated data directly to InfluxDB Cloud by adding another influxdb_v2 output plugin. You should use some metric filtering, e.g. tagpass, in combination with a custom tag added in the aggregator (the same pattern as the sketch in the reply above).
  3. Use InfluxDB’s tasks to do the downsampling in the database itself.
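
Here is a rough sketch of option 1, again assuming the raw metrics flow through Telegraf. The tag name "bucket" is just the name chosen for this example, and exclude_bucket_tag is my assumption of how to strip the routing tag before the write, so verify both against the influxdb_v2 output docs for your version.

[[aggregators.basicstats]]
  period = "1s"
  stats = ["mean"]
  ## Attach the destination bucket name as a tag on the aggregated metrics.
  tags = { bucket = "filtered_gyro_data" }

[[outputs.influxdb_v2]]
  urls = ["http://<pi's ip here>:8086"]
  token = "<valid token here>"
  organization = "test"
  ## Fallback bucket for metrics without the routing tag (the raw data).
  bucket = "gyro_data"
  ## Route by the value of the "bucket" tag and drop it before writing.
  bucket_tag = "bucket"
  exclude_bucket_tag = true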

Hope that helps!

Hello!

Thank you both for your answers; I had kind of forgotten about this post.
Since my last post I found a solution using tasks, which you both mentioned.
This was the easiest way for me to do it, and I even managed to set up a task that sends the filtered data to the cloud.

My problem is solved :slight_smile:
Thank you!