I have a streaming app that generates statsd-like statistics. Every couple of minutes it generates a data set of around 50,000 rows. Each row is essentially a bunch of tags and a couple of values with a timestamp, so it's easy to convert to InfluxDB line protocol.
My question is: what's the best way to get this into InfluxDB? I figure my options are:
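For context, here's a minimal sketch of what I mean by "easy to convert" — each row rendered as a line-protocol string (the measurement, tag, and field names below are made up; real line protocol also needs escaping and an `i` suffix on integer fields, which I've skipped):

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Render one row as a simplified InfluxDB line-protocol string:
    measurement,tag1=v1,tag2=v2 field1=v1,field2=v2 timestamp"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "requests",                                 # hypothetical measurement name
    {"host": "app01", "region": "us-east"},     # tags
    {"count": 42, "latency_ms": 3.1},           # field values
    1465839830100400200,                        # timestamp in nanoseconds
)
# -> 'requests,host=app01,region=us-east count=42,latency_ms=3.1 1465839830100400200'
```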
- Send individual, unbatched HTTP requests across the network (probably too slow)
- Send HTTP requests in batches of 5,000 rows
- Send UDP messages to Telegraf (one per row) and let Telegraf deal with batching (can it keep up?)
- Write everything to a file and use the influx CLI's -import command
Is there a best practice for this?
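To make the second option concrete, here's roughly what I had in mind — split the rows into 5,000-line chunks and POST each chunk as one newline-delimited body (the URL, database name, and auth details are placeholders):

```python
import urllib.request

def chunks(lines, size=5000):
    """Split the full list of line-protocol rows into batches of `size`."""
    for i in range(0, len(lines), size):
        yield lines[i:i + size]

def send_batches(lines, url="http://influx.example:8086/write?db=metrics"):
    # Placeholder URL; a real setup would add auth, retries, gzip, etc.
    for batch in chunks(lines):
        payload = "\n".join(batch).encode("utf-8")
        req = urllib.request.Request(
            url, data=payload,
            headers={"Content-Type": "text/plain"},
        )
        urllib.request.urlopen(req)  # one POST per 5,000-row batch
```

So 50,000 rows would become ten POSTs every couple of minutes, which seems manageable — but I don't know how that compares to the UDP/Telegraf or -import routes in practice.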