Hi,
the goal: parse a postfix logfile looking for failed logins (those lines always contain ‘SASL’), write a counter into InfluxDB, and build a Grafana dashboard showing failed logins over the last 5 minutes.
Example line from the logfile:
Jan 25 18:30:47 mail postfix/smtpd[60487]: warning: unknown[80.94.95.228]: SASL LOGIN authentication failed: authentication failure, sasl_username=maurice@domain.tld
What I've got so far is:
[global_tags]

[agent]
  interval = "10s"
  hostname = "mail.domain"
  round_interval = true
  flush_interval = "10s"
  flush_jitter = "0s"
  collection_jitter = "0s"
  metric_batch_size = 1000
  metric_buffer_limit = 10000
  quiet = false
  debug = false
  omit_hostname = false

[[outputs.influxdb]]
  urls = ["https://10.0.0.50:8086"]
  database = "experimental"
  username = "experimental"
  password = "secret"
  insecure_skip_verify = true
  timeout = "0s"
  retention_policy = ""
  skip_database_creation = true

[[outputs.file]]
  files = ["stdout", "/tmp/metrics.out"]
  data_format = "influx"

[[inputs.postfix]]
  queue_directory = "/var/spool/postfix"
[[inputs.tail]]
  files = ["/var/log/maillog"]
  from_beginning = false
  name_override = "postfix_log"
  grok_patterns = ["%{CUSTOM_LOG}"]
  # capture the SASL warning text in a single named group;
  # a preceding %{GREEDYDATA:message} would swallow the whole line
  # and duplicate the "message" capture name
  grok_custom_patterns = '''CUSTOM_LOG %{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST} %{DATA:program}(?:\[%{POSINT}\])?: (?<message>.*SASL.*)'''
  data_format = "grok"
  fieldinclude = ["message"]
#[[processors.strings]]
#  [[processors.strings.replace]]
#    tag = "message"
#    old = ".*"
#    new = "1"
[[processors.regex]]
  namepass = ["postfix_log"]
  # grok captures end up as fields, not tags,
  # so this has to be a .fields sub-table
  [[processors.regex.fields]]
    key = "message"
    pattern = ".*"
    replacement = "1"
The [[inputs.tail]] part works, but I don't want to bloat the database with all that text; I'd rather replace it with a plain "1", which would also make it much easier to work with in Grafana.
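One thing to watch out for: even after a regex replacement, the value of message is still the string "1", which InfluxDB stores as a string field. A sketch of a follow-up step, assuming the converter processor is available in your Telegraf version, to turn it into an integer that Grafana can aggregate:

```toml
# Convert the string field "message" (now "1") into an integer
# so it can be summed / counted in Grafana queries.
[[processors.converter]]
  namepass = ["postfix_log"]
  [processors.converter.fields]
    integer = ["message"]
```

Since every point is one failed login, a Grafana panel could then just count points, e.g. SELECT count("message") FROM "postfix_log" WHERE $timeFilter GROUP BY time(5m) (InfluxQL; measurement and field names taken from the config above) — the actual field value hardly matters then.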
Any help is very much appreciated
Gav