Log file aggregation on a centralized log server

I have 3 servers that each generate a daily log file of about 12 GB (12 × 3 = 36 GB per day).

How can I use Telegraf to gather these files on a centralized log server?



This depends on a number of factors :slight_smile: For example: what kind of log files are they (plain text, gzip), what is the data format (XML, JSON, a logging format), and how often are the logs updated (constantly vs. all at once)? Additionally, what is your goal for an output? Are you trying to load these into InfluxDB, or some other output?

A general approach with Telegraf is to use the tail input plugin combined with one of the parsers.
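As a minimal sketch of that approach, a Telegraf configuration could look like the following. The file paths, grok pattern, and InfluxDB URL/credentials below are placeholders you would adapt to your environment:

```toml
# Tail log files as new lines are appended and parse each line.
[[inputs.tail]]
  files = ["/var/log/myapp/*.log"]   # adjust to your log locations
  from_beginning = false             # only read lines written after startup
  data_format = "grok"
  grok_patterns = ["%{COMBINED_LOG_FORMAT}"]  # assumes Apache-style access logs

# Send the parsed metrics to InfluxDB.
[[outputs.influxdb_v2]]
  urls = ["http://influxdb.example.com:8086"]
  token = "$INFLUX_TOKEN"
  organization = "my-org"
  bucket = "logs"
```

With `from_beginning = false`, tail only picks up lines appended after Telegraf starts, so each client pushes new log entries as they are written rather than re-reading whole files.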

If you are sending these logs to InfluxDB, then I find logs of this size are better suited for parsing and sending with the InfluxDB client libraries. These require some code on your side, but they make it much easier to bulk-read and import log files.

4 - I want to aggregate the raw files on a centralized server.
5 - Is it reasonable to load raw log data into InfluxDB?
6 - How exactly does tail track files on the clients? Do I need to define on the server side exactly what gets sent to the server?
7 - I'm looking for a simple and robust solution.

Any idea?

4 - I want to aggregate the raw files on a centralized server.

What do you plan on doing with the aggregated files?

Telegraf is a metrics collection tool. It can collect from systems, log files, services, etc. The primary use case is to generate metrics from whatever it is reading and push those metrics to an output, like InfluxDB.

I am still not sure Telegraf is what you are looking for.

In this scenario I want to archive the raw files and index them via Telegraf.