My packet loss sensor shows a graph with downtime, ping time, minimum, maximum, and packet loss (%), and the packet loss axis only ranges from 0 to 1%. I need to know how this packet loss value is calculated and how it should be interpreted. My upper warning limit is set to 10% and the upper error limit to 20%, but in the graph all I ever see is packet loss between 0.1% and a maximum of 1%.
Any explanation please?
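For context, my understanding (an assumption, not something taken from the sensor's documentation) is that a ping-based sensor sends a fixed number of echo requests per scanning interval and reports the fraction that got no reply. The numbers below are made up for illustration:

```python
# Hypothetical sketch of how a ping sensor typically derives packet loss:
# send N echo requests per interval, count the replies, report the lost fraction.
packets_sent = 100      # assumed number of pings per scanning interval
packets_received = 99   # replies received before the timeout

packet_loss_pct = (packets_sent - packets_received) / packets_sent * 100
print(f"Packet loss: {packet_loss_pct:.1f}%")  # 1 lost out of 100 -> 1.0%
```

If that is how the sensor works, a graph topping out at 1% would simply mean at most 1 in 100 pings was ever lost, well below the 10% warning limit.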