How to parse mixed JSON/CSV logs using grok or any other parser

I am very new to the TICK stack, so please help me with this. Here are sample logs that I want to parse using Telegraf. There are two types of logs that I am getting from the same log file:

2019-02-03 23:52:11,940 | [V4,0833364F,533.330,0,0,533.330,0,0,0,0,-0.849,0,0,-0.849,628.064,0,0,628.064,432.013,
431.847,433.645,430.547,249.423,247.824,251.089,249.355,1.055,0,0,3.164,49.975,38982176.000,0
,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,19/02/03,23:52:12]

2019-02-03 23:52:11,950 | {"MACID":"8550815F","ID":"1","SS":"16","FW":"V5.1.11","TSRC":"R","SN":"29370","PCK":{"M28":"AQPI6H9GvTkpRnbaJ8aO9Pm/JTnoQ9QPU0N18aZCBEurQkhsdkYU+jJFwDq0xd9sQ78m6xhD0v+hQ3QWqEIbljNF5V16RaoOfcWX9xW/PfNsQ9Pm/UNz+W9B8Ca6Re0apkWBH2vFxV2BvwvPM0PVR1pDdoMlQfbR5k9WhEJPVGdYTF7Ras0jAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH/1kkB/7/pAAAAAAAAAE0o9AEDGC2nP+tVrz/12WA/3sAAxHnAAMR5wADEeXylfp1cV6T4"},"RTC":"19/02/03,23:51:57"}

Thank you in advance for any help parsing these logs.

You should be able to use the grok parser with whatever input plugin you use to consume the logs. My assumption is you would define multiple grok_patterns.

e.g.: grok_patterns = ["%{LOGFMT_1}", "%{LOGFMT_2}"]
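Put together, that could look something like the sketch below. This assumes a tail input; the file path and the LOGFMT_1/LOGFMT_2 pattern bodies are placeholders that you would adapt to your real lines, not something taken from your logs:

```toml
# Sketch only: /var/log/mix.log is a placeholder path, and the two
# custom pattern bodies are rough guesses at your line shapes.
[[inputs.tail]]
  files = ["/var/log/mix.log"]
  data_format = "grok"

  # Patterns are tried in order; the first one that matches a line wins,
  # so list the more specific pattern first.
  grok_patterns = ["%{LOGFMT_1}", "%{LOGFMT_2}"]

  # Custom pattern definitions: plain regex mixed with built-in patterns.
  grok_custom_patterns = '''
LOGFMT_1 %{TIMESTAMP_ISO8601:logtime} \| \[%{GREEDYDATA:payload}\]
LOGFMT_2 %{TIMESTAMP_ISO8601:logtime} \| %{GREEDYDATA:json_payload}
'''
```

Because the patterns are attempted in order per line, a list like this already behaves roughly like an if/else-if chain.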

Thanks for your valuable reply. Is there any conditional support here, something like if/else-if/else? I am getting multiple types of logs from the same source file, and the values inside the logs are not fixed, meaning the positions of the values vary. For example:


[M1,V4.1.9,0822284F,537272.063,499854.906,121809.109,
0.930,11770.710,6786.713,26.398,50.073,187234.688,
176989.594,23256.539,0.945,11725.495,6798.036,
27.542,164736.391,152346.094,41038.895,0.925,11782.277,
6759.561,24.371,185578.172,170519.484,57513.629,0.919,
11804.357,6802.546,27.281,2540343840.000,2432279360.000,
526516192.000,-1170295.375,0,0,0,0,0,0,0,19/02/03,23:52:17]


[M2,V4.1.8,0833174F,0,0,0,1107617.250,0,0,19/02/03,23:58:41]


[M10,1885517F,0,0,0,0,0,0,18.817,19/02/04,00:00:11]


[M14,v4.5.3,9397282F,17.155,6.900,4.726,5.527,
-90.457,-30.303,-30.706,-29.450,-0.185,-0.221,
-0.151,-0.183,91.901,30.931,31.027,29.970,0,421.129
,418.544,421.021,242.621,243.263,242.224,242.377,0,
127.739,128.246,123.559,50.069,5412456.000,7057992.000
,91584.000,2396025.250,0,0,0,0,0,0,0,0,4.288,0,0,0,0,0,0,0,0,0,1549239797]


[M19,GV4.8.2,4205577F,1308582.000,-623884.188,638114.375,46583.328,38661.109,-343905.688,-366844.188,749411.000,-0.999,
-0.876,0.867,-0.062,1309153.000,712392.188,736046.625,750857.375,437.104,436.600,437.758,436.955,252.388,253.133
,252.109,251.921,2908.599,2823.226,2921.755,2980.815,50.076,4480122880.000,4480352320.000,16744643.000
,-14671176.000,0,0,0,0,1.000,8823884.000,0,0,176.197,5.276,4.354,5.362,2.964,2.855,2.707,1297101.375,0
,1449575.500,19/02/03,23:55:52]

2019-02-03 23:59:02,939 | [S1,V4,0360520F,INVALID DATAMAINS_SUPPLY,25982]


2019-02-03 23:58:11,544 | [S1,OTA,V1.4,7172877F,SOFT RESET,9273]


[V2,7365462F,339.50156, 0.34368,987.82996,456.14764,310.89011,287.22711,416.81348,423.88892,412.90961,413.64191,240.81787,242.50154,243.31425,236.63782, 1.64484, 2.01518, 1.50014, 1.41919,50.08651,206997568.00000]

[V4,9449478F,16358.686,5813.755,-5231.535,-5313.388,0,0,0,0,-0.997,0.810,0.941,
0.827,16409.059,7178.353,5561.770,6422.307,410.950,411.008,412.751,409.092,237.264,236.027,238.177,
237.586,31.704,35.247,29.400,30.466,50.025,20354322.000,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,19/02/03,23:51:35]


2019-02-03 23:53:19,765 |
{8569904F,25460.49023, 0.49105,161.10310,107.31907,50.10833,7921411840.00000}


2019-02-03 23:52:33,259 |
0639719F,V4, INVALID DATA,19/02/03,23:52:22

2019-02-03 23:55:45,932 | 0639719F,V4, SOFT RESET,14525

2019-02-03 23:52:11,950 | {"MACID":"8550815F","ID":"1","SS":"16","FW":"V5.1.11","TSRC":"R","SN":"29370","PCK":{"M28":"AQPI6H9GvTkpRnbaJ8aO9Pm/JTnoQ9QPU0N18aZCBEurQkhsdkYU+jJFwDq0xd9sQ78m6xhD0v+hQ3QWqEIbljNF5V16RaoOfcWX9xW/PfNsQ9Pm/UNz+W9B8Ca6Re0apkWBH2vFxV2BvwvPM0PVR1pDdoMlQfbR5k9WhEJPVGdYTF7Ras0jAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAH/1kkB/7/pAAAAAAAAAE0o9AEDGC2nP+tVrz/12WA/3sAAxHnAAMR5wADEeXylfp1cV6T4"},"RTC":"19/02/03,23:51:57"}

Above are all the log types that I am getting from the same source file, and I want to parse these logs field by field. The issue is that the values (like "MACID":"8550815F", 0639719F, etc.) are not in fixed positions in the log.
My use case is to parse the logs in such a way that I can apply aggregation on every field, or, if that is not possible, at least on particular fields like the MAC ID, which appears in all log types as a value ending with F (8569904F, 0639719F, etc.), and similarly for the other values.

Please help me with this. Thanks in advance.
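Since grok cannot label positionally varying values, one workable compromise is to extract only the field that appears in every log type (the device id ending in F) and keep the rest of the line as a raw string. A minimal sketch, assuming a tail input; DEVID and ANYLOG are pattern names I invented, and the path is a placeholder:

```toml
# Sketch: tag every line with the device id (7 hex chars followed by "F"),
# regardless of where in the line it appears, and keep the remainder raw.
[[inputs.tail]]
  files = ["/var/log/mix.log"]
  data_format = "grok"
  grok_patterns = ["%{ANYLOG}"]
  grok_custom_patterns = '''
DEVID [0-9A-Fa-f]{7}F
ANYLOG ^(%{TIMESTAMP_ISO8601:logtime} \| )?.*?%{DEVID:macid:tag}%{GREEDYDATA:rest}$
'''
```

With macid captured as a tag rather than a field, downstream aggregator plugins (or InfluxDB GROUP BY queries) can aggregate per device. Parsing every numeric value by position would still need one dedicated pattern per log variant (M1, M2, M10, ...), since their field counts differ.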