Understanding window() and sigma()

Hello all,

I am new here and exploring the TICK stack for performance monitoring of my systems. I have InfluxDB and Telegraf installed and working fine. The next step is to get Kapacitor to generate alerts based on the Telegraf data. I have gone through the TICKscript docs and I mostly understand them.

As a start I am toying with cpu_alert_stream.tick from the examples. I defined and enabled the task. To test it I used `while true; do i=0; done` to generate CPU load, but no alerts are written to /tmp/cpu_alert_log.txt even after running it for about half an hour.
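For reference, the script is essentially the stock example; I am paraphrasing from memory here, so the exact field names and thresholds below may differ from the actual file:

```
stream
    // Telegraf's cpu measurement, one series per host.
    |from()
        .measurement('cpu')
        .groupBy('host')
    // Buffer points and emit them periodically (this is the part I am unsure about).
    |window()
        .period(1m)
        .every(1m)
    // Reduce each emitted window to a single point holding the mean idle CPU.
    |mean('usage_idle')
    // Score that mean by how far it sits from "normal".
    |eval(lambda: sigma("mean"))
        .as('sigma')
        .keep('mean', 'sigma')
    |alert()
        .info(lambda: "sigma" > 2.0)
        .warn(lambda: "sigma" > 3.0)
        .crit(lambda: "sigma" > 3.5)
        .log('/tmp/cpu_alert_log.txt')
```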

Before debugging this, I need to fully understand what the script does. I think the parts I am missing are how window() and the sigma() function behave.

Does window() collect all the points for one .period() and emit them as a single batch, or is it a sliding window that, for every point it receives, emits all the points from the last .period()? I am probably wrong on both counts, but I am just trying to understand.
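To make the two readings concrete, here is how I understand .period() and .every() from the docs (please correct me if this is wrong):

```
// Reading 1: tumbling window; each point lands in exactly one batch.
|window()
    .period(1m)
    .every(1m)

// Reading 2: overlapping/sliding window; a 1m buffer is emitted
// every 10s, so consecutive batches share 50s of points.
|window()
    .period(1m)
    .every(10s)
```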

I am assuming mean() takes a bunch of points and outputs their mean value, and sigma() calculates the standard deviation. How many points does sigma() keep in memory when computing the standard deviation?
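My reading of the docs is that sigma() is stateful and returns how many standard deviations the current value is from a running mean, so something like this should fire once a value strays three deviations out (my guess at the semantics, not verified):

```
stream
    |from()
        .measurement('cpu')
    |alert()
        // If I understand correctly, sigma() keeps a running mean and
        // standard deviation of the values it has seen and returns the
        // distance of the current value in standard deviations.
        .crit(lambda: sigma("usage_idle") > 3.0)
        .log('/tmp/cpu_alert_log.txt')
```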

Enough questions for now :slight_smile: it's just a reflection of my confused mind!

Hoping to get my alerting working as I expect soon!