Certain nodes in our cable plant get hit with DDoS attacks (typical for cable-modem customers). The inflated traffic on those downstream channels skews the data sets behind our reports, often causing values to go negative when they shouldn't.
The basic calculation we use in a factory to generate these reports, for an eight-channel modem:

|304000 - (node1 + ... + node8)|

The theoretical maximum is 42Mb per channel, but that's never reached in a real-world cable plant.
We could use 42Mb and it would most likely fix this issue, but we'd rather not, since we want the calculation to reflect our actual average, which is 38Mb per channel (hence the 304000 constant for eight channels).
Is it possible to remove outliers from data sets to prevent this from happening?
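For illustration, here is one common way to do this: filter each channel's readings with the Tukey IQR rule before running the |304000 - sum| calculation. This is a minimal stdlib-only sketch, not your actual pipeline; the function names, the 1.5 fence multiplier, and the idea of scaling the 304000 expectation down to the channels that survive filtering are all my assumptions.

```python
# Hypothetical sketch: drop per-channel readings outside the Tukey IQR
# fences, then compute |expected - sum(channels)|, scaling the expected
# total to however many channels survived so dropping a DDoS-inflated
# channel doesn't itself skew the report.

def iqr_filter(values, k=1.5):
    """Return only the values inside [Q1 - k*IQR, Q3 + k*IQR]."""
    s = sorted(values)
    n = len(s)

    def quantile(q):
        # Linear interpolation between order statistics.
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return s[lo] + (s[hi] - s[lo]) * (pos - lo)

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    return [v for v in values if q1 - k * iqr <= v <= q3 + k * iqr]

def report_value(channels, expected=304000, num_channels=8):
    """|expected - sum| over the surviving channels only."""
    kept = iqr_filter(channels)
    per_channel = expected / num_channels   # 38000 per channel, per the post
    return abs(per_channel * len(kept) - sum(kept))
```

With seven channels at the 38000 average and one DDoS-inflated channel, the spike falls outside the fences and gets dropped, so the report comes out at zero instead of a wildly off value. One caveat of the IQR rule on only eight samples: with very low spread it can be aggressive, so you may want to compute the fences over a longer history of readings per channel rather than a single eight-value snapshot.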