The Dutch government has an Open Data website from which you can download the data as a JSON file (for example: JSON URL). I'm not sure what tool to use to make a nice graph of this data. The JSON is always a complete list of all the data.
Is it better to use Telegraf to grab the data from the site once a day, or can I just use Grafana for this? Do I then have to drop the table every time I load the data, since it is always the complete data set?
I was looking at the inputs.http plugin, but I don't understand how to tell it to dump the data into InfluxDB.
I have not looked at the URL and data from the website you mentioned.
But apart from that: if you have fetched the data with Telegraf's inputs.http plugin, you also need an output plugin. The data has to go somewhere.
If you want the data to go into InfluxDB, you also need the outputs.influxdb plugin (for InfluxDB 1.x) or the outputs.influxdb_v2 plugin (for InfluxDB 2.x) in Telegraf.
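A minimal example could look something like this sketch. I haven't tested it against your URL; the URL, measurement name, and database name are placeholders, and it assumes the endpoint serves flat JSON that the default json parser can handle:

```toml
# Untested minimal sketch: poll a JSON endpoint and write to InfluxDB 1.x.
# The URL, measurement name, and database name are placeholders.
[[inputs.http]]
  ## The Open Data endpoint (placeholder URL)
  urls = ["https://data.overheid.nl/example.json"]
  method = "GET"
  ## Parse the response body with the JSON parser
  data_format = "json"
  ## Name the resulting measurement instead of the default "http"
  name_override = "opendata"

[[outputs.influxdb]]
  ## Local InfluxDB 1.x instance
  urls = ["http://127.0.0.1:8086"]
  database = "opendata"
```

With something like that in place, Telegraf polls the URL and writes whatever the parser produces into the opendata database.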
I have no experience with Grafana, but there are JSON plugins for Grafana. With a bit of luck you can configure one of them to access the URL with the JSON data directly; then you wouldn't need InfluxDB at all. But this is just an idea, I don't know if it would work. Maybe ask in the Grafana forum?
I don't know exactly what you have in mind, but I would say yes, because otherwise you would be loading the same data over and over again into the same database.
I already had an output configured for the vSphere data I'm collecting with Telegraf, and I now see the JSON is written to InfluxDB as well. But I notice that every 10 seconds a new record is written containing just the last line of the JSON from the external website. I also noticed that in InfluxDB I ONLY have that last record from the external JSON (written 1700+ times).
My questions:
1. How can I change the polling interval, since this data is only updated once a day? (See the config sketch right after this list.)
2. What would be best: importing the complete JSON once by hand, or automatically reading all the data every time? In the latter case I would first have to drop the database before reading the external JSON again.
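For question 1, Telegraf supports a per-plugin interval override, so only this input would be affected; would something like this sketch be the way to do it? (The URL is a placeholder again.)

```toml
# Sketch: override the polling interval for this input only, so the rest of
# the Telegraf config (vSphere etc.) keeps its default 10 s interval.
[[inputs.http]]
  urls = ["https://data.overheid.nl/example.json"]
  data_format = "json"
  ## The site only updates once a day, so poll once a day
  interval = "24h"
```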
I didn’t quite understand it, but I suspect a problem with the timestamp.
There must be a timestamp in the JSON, and you must also configure where to find it.
Maybe the format of the timestamp doesn’t fit either.
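If every record ends up with the ingestion time instead of its own timestamp, the points all overwrite each other and only the last one survives, which would match what you are seeing. In Telegraf's JSON parser you can point at the timestamp field with json_time_key and json_time_format. This sketch assumes a field called "Datum" with dates like 2021-06-01; you'd have to adapt both to the actual JSON:

```toml
# Sketch: tell the JSON parser which field carries each record's timestamp.
# "Datum" and its layout are guesses; adapt them to the actual JSON.
[[inputs.http]]
  urls = ["https://data.overheid.nl/example.json"]
  data_format = "json"
  ## JSON key containing each record's timestamp
  json_time_key = "Datum"
  ## Layout in Go reference-time notation (here: YYYY-MM-DD)
  json_time_format = "2006-01-02"
```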