My question is part math, part computers, but I need more of a math answer than a computer answer. I am going to be grabbing snapshots of a moment in time of how a computer is performing. Because of the way a computer works, I can have data that differs largely from one snapshot to the next. For example, I can take a snapshot at one second of 100% computer usage and the following snapshot is 20%. I am planning on taking these snapshots every 1-2 seconds. Now, if I take an average of these numbers I can lose some information: if the system is at 100% for 5 intervals but is then at 5% for the remaining 55 intervals, I would have an average of about 12.9%, which does not really show that for 5 intervals I was at 100%. I am not sure if there is a way that I can represent something like an average but also express this, and be able to plot it on a graph.
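Spelling out the arithmetic in that example, the average is just a weighted mean of the two levels over the 60 intervals:
$$\frac{5\cdot 100\% + 55\cdot 5\%}{60} = \frac{775}{60} \approx 12.9\%,$$
so the five intervals at 100% barely move the number.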
Generating statistics from data samples that are snapshots
Tags: statistics, graphing-functions, average, sampling
1 Answer
You could do a histogram or a box plot. The histogram gives more information (and you can, for example, include a line for the mean), but the box plot is more compact and communicates a few pieces of information more clearly. If you're plotting a moving average over time, you can do a sort of moving box plot by plotting the mean/median, the first and third quartiles, and the max and min (say over the last 60 seconds) over time.
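To make that concrete, here is a minimal sketch of the "moving box plot" idea, assuming Python with NumPy and Matplotlib and using synthetic usage data to stand in for your snapshots: a rolling median, interquartile range, and min/max envelope over the last 60 samples.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical example data: 600 usage snapshots (one per second),
# mostly idle around 5% with occasional bursts near 100%.
rng = np.random.default_rng(0)
usage = np.clip(rng.normal(5, 2, 600), 0, 100)
burst = rng.random(600) < 0.05                 # ~5% of samples are bursts
usage[burst] = rng.uniform(80, 100, burst.sum())

window = 60  # look back over the last 60 snapshots
t = np.arange(window - 1, len(usage))

# Rolling summary statistics over the trailing window.
def rolling(stat):
    return np.array([stat(usage[i - window + 1:i + 1])
                     for i in range(window - 1, len(usage))])

med = rolling(np.median)
q1 = rolling(lambda w: np.percentile(w, 25))
q3 = rolling(lambda w: np.percentile(w, 75))
lo = rolling(np.min)
hi = rolling(np.max)

# "Moving box plot": shaded min/max envelope, shaded interquartile
# range, and a median line, so short 100% bursts stay visible.
plt.fill_between(t, lo, hi, alpha=0.2, label="min-max")
plt.fill_between(t, q1, q3, alpha=0.4, label="interquartile range")
plt.plot(t, med, label="median")
plt.xlabel("snapshot index (seconds)")
plt.ylabel("CPU usage (%)")
plt.legend()
plt.show()
```

The min/max band is what keeps the short bursts at 100% visible even while the median stays near 5%, which is exactly the information a plain average throws away.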