Hi
I have lots of log lines with the exact same timestamp. When I use Logstash to parse them and send them to InfluxDB 2, some lines get aggregated (by either Logstash or InfluxDB)!
For example, in the sample lines below, the I[847676] lines are aggregated:
2023-11-20 14:05:49:787 INFO T[SHR1I7783] APP R[GW] I[675940] U[461301]
2023-11-20 14:05:49:787 INFO T[SHR5I7787] APP R[GW] I[847676] U[432408]
2023-11-20 14:05:49:787 INFO T[SHR5I7787] APP R[GW] I[847676] U[935351]
2023-11-20 14:05:49:787 INFO T[SHR5I7787] APP R[GW] I[847676] U[533755]
2023-11-20 14:05:49:787 INFO T[SHR0I7782] APP R[GW] I[50888] U[600770]
2023-11-20 14:05:49:787 INFO T[SHR5I7787] APP S[GW] I[847676] U[432409]
1- I tried setting I as a field, but the lines are still aggregated.
2- I tried setting I as a tag; that works, but it causes performance problems due to high cardinality.
3- I tried generating a UID for each event, but the points are still aggregated.
4- U is an almost unique field; I tried using it, but the data points are still aggregated.
5- If I use Telegraf instead of Logstash, what is the correct configuration for the pattern I mentioned?
FYI1: I don’t want to use metrics, because it is important to be able to group by T, I, R and S.
FYI2: R means received, S means sent.
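For what it’s worth, the aggregation happens because InfluxDB stores at most one point per measurement + tag set + timestamp; a later point with the same key silently overwrites the earlier one, and fields play no part in the key. One tool-agnostic workaround (whatever ingests the logs) is to bump colliding timestamps by a few nanoseconds before writing, so same-millisecond lines stay distinct. A minimal Python sketch of the idea — the tag strings are made up from the sample lines above:

```python
# Workaround sketch: make duplicate (timestamp, tag set) pairs unique by
# adding an incrementing nanosecond offset, so InfluxDB does not overwrite
# points that collide at millisecond precision.
from collections import defaultdict

def dedupe_timestamps(points):
    """points: list of (timestamp_ns, tag_set) tuples in arrival order.
    Returns the timestamps, bumped by +1 ns per duplicate key."""
    seen = defaultdict(int)
    out = []
    for ts, tags in points:
        key = (ts, tags)
        out.append(ts + seen[key])   # 0 for the first occurrence, then +1, +2, ...
        seen[key] += 1
    return out

# Three lines logged in the same millisecond with identical tags:
ts = 1700489149787000000  # 2023-11-20 14:05:49.787 (UTC assumed), in nanoseconds
pts = [(ts, "T=SHR5I7787,R=GW")] * 3
print(dedupe_timestamps(pts))
# → [1700489149787000000, 1700489149787000001, 1700489149787000002]
```

The offset stays well below one millisecond for any realistic burst, so the stored timestamps still round back to the original log time.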
Hmm, if you are open to it, bypassing Logstash completely might be the best way to go. (I can’t see any Logstash-to-InfluxDB output plugins that appear to be actively maintained at the moment.) Note that the aggregation itself is InfluxDB’s data model at work: points that share a measurement, tag set, and timestamp overwrite each other, which is why making I or U a field (or adding a UID field) doesn’t help — fields are not part of a point’s identity.
The Telegraf tail input plugin plus its grok parser looks like the current best method in the Influx ecosystem. Take a look here.
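As a starting point, here is a sketch of such a Telegraf configuration. The file path, measurement name, output credentials, and the tag/field split are all assumptions — adjust them to your setup. T and the R/S direction are kept as tags (bounded cardinality), while I and U stay as fields, since making I a tag caused your cardinality problem:

```toml
[[inputs.tail]]
  files = ["/var/log/app/app.log"]   # assumed log location
  from_beginning = false
  name_override = "applog"           # assumed measurement name
  data_format = "grok"

  grok_patterns = ['%{APP_LINE}']
  grok_custom_patterns = '''
APP_TS \d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}
APP_LINE %{APP_TS:timestamp:ts-"2006-01-02 15:04:05"}:%{INT:millis:int} %{LOGLEVEL:level:tag} T\[%{NOTSPACE:thread:tag}\] APP %{WORD:direction:tag}\[%{WORD:channel:tag}\] I\[%{INT:ident:int}\] U\[%{INT:uid:int}\]
'''
  # Telegraf's custom time layouts use Go's reference time, which (as far as
  # I can tell) cannot express a colon before the milliseconds, so the millis
  # are captured as a separate integer field here and the timestamp is parsed
  # at second precision. The grok parser's grok_unique_timestamp option
  # (default "auto") should then keep colliding points apart by appending
  # incrementing nanoseconds to duplicate timestamps.
  # grok_unique_timestamp = "auto"
  grok_timezone = "Local"            # assumed; set to the logs' timezone

[[outputs.influxdb_v2]]
  urls = ["http://localhost:8086"]   # assumed
  token = "$INFLUX_TOKEN"
  organization = "my-org"            # assumed
  bucket = "logs"                    # assumed
```

Keeping the thread tag T may itself grow cardinality over time, so it’s worth checking the series count after running this for a while.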