Hi,
I'm running Telegraf 1.5.2 on Ubuntu 14.04/16.04 nodes with the logparser plugin enabled:
[[inputs.logparser]]
  files = ["/var/log/freeradius/radius.log"]
  from_beginning = false
  [inputs.logparser.grok]
    patterns = ["%{TS_ANSIC:ts:ts-ansic} : Auth: \\(%{NUMBER:number:drop}\\) %{DATA:status}:%{GREEDYDATA:greedy:drop} cli %{IPV4:ip}"]
    measurement = "freeradius_log"
    custom_patterns = '''
TS_ANSIC %{DAY} %{MONTH} \s?%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{YEAR}
'''
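For reference, here is roughly what the pattern is meant to match, as a Python regex equivalent of the grok pattern above. The sample log line is one I made up for illustration, not copied from a real server:

```python
import re

# Rough Python equivalent of the grok pattern; the sample line is illustrative.
line = ("Mon Feb 12 10:15:32 2018 : Auth: (1234) Login OK: "
        "[alice] (from client nas port 0) cli 192.168.1.10")

pattern = re.compile(
    r"(?P<ts>\w{3} \w{3} \s?\d{1,2} \d{2}:\d{2}:\d{2} \d{4})"  # TS_ANSIC
    r" : Auth: \((?P<number>\d+)\) "                            # dropped field
    r"(?P<status>[^:]+):"                                       # e.g. "Login OK"
    r"(?P<greedy>.*) cli "                                      # dropped field
    r"(?P<ip>\d{1,3}(?:\.\d{1,3}){3})"                          # client IP
)

m = pattern.match(line)
print(m.group("status"), m.group("ip"))  # Login OK 192.168.1.10
```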
The plugin parses the log from the beginning even though from_beginning = false is set.
I tried DROP MEASUREMENT freeradius_log, but after some time the old data shows up again in SELECT * FROM freeradius_log.
I wouldn't mind, except it has consequences.
Here is my TICKscript:
dbrp "telegraf"."autogen"

var data = stream
    |from()
        .measurement('freeradius_log')
        .groupBy('host')
    |window()
        .period(10m)
        .every(5m)

var total = data
    |count('status')

var error = data
    |where(lambda: "status" != 'Login OK')
    |count('status')

error
    |join(total)
        .fill(0)
        .as('errors', 'totals')
    |eval(lambda: 100.0 * float("errors.count") / float("totals.count"))
        .as('value')
    |alert()
        .id('{{ .TaskName }}')
        .crit(lambda: "value" > 30)
        .message('Value: {{ index .Fields "value" }}, {{ index .Tags "host" }}')
        .slack()
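For context, the eval node just computes the percentage of non-"Login OK" points per window; in plain Python the math is simply:

```python
def error_rate(error_count: int, total_count: int) -> float:
    """Percentage of failed logins in a window (mirrors the eval node above)."""
    return 100.0 * float(error_count) / float(total_count)

print(error_rate(1, 2))  # one failure out of two points -> 50.0
```

The round values below (0, 50, 100) would be consistent with windows containing only one or two points each.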
After I enable it, the Slack channel gets spammed like this:
Value: 50, example.com
Value: 0, example.com
Value: 100, example.com
Value: 0, example.com
Value: 50, example.com
Value: 50, example.com
Value: 0, example.com
Value: 50, example.com
Value: 0, example.com
Value: 100, example.com
Value: 50, example.com
Value: 100, example.com
Value: 100, example.com
Value: 100, example.com
There are many notifications per second.
And for the example.com node, the database shows a continuous stream of "Login OK" entries.
What could be wrong?