First of all, I'm sorry, I'm a complete statistics illiterate. I promise I'm going to get into it after the summer (I actually need it for my job). Because of this, I'm not sure of the name of what I'm asking about; I'll explain, and you can tell me what I need to research, study, and learn. Although it may look like I'm asking something about programming, I'm not, so please read to the end.
I have an application that stores, in a table, the amount of time it takes a remote server to reply to a "hello" sent every minute. So, basically, I have the date and time of each request and its delay. I can load this data and show a nice chart using Flot.
The problem is that when I try to show all the measurements from the last month in the chart, it takes a long time because there are thousands of points. I think what I should do is compute a kind of "average", so that from those thousands of points I get about 100 that represent what happened during the month (something like the sketch below). I know I'll lose precision, but that doesn't matter because the user will be able to zoom in to a specific range.
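Just to make the idea concrete, here is a minimal sketch of what I have in mind, assuming the data is an array of `[timestamp, delay]` pairs like Flot expects; the function name, the bucket count of 100, and the TypeScript setup are only illustrative assumptions on my part, not code I actually have:

```typescript
// One point as Flot expects it: [timestamp in ms, delay in ms].
type Point = [number, number];

// Reduce thousands of points to roughly `buckets` points by splitting the
// series into equal-sized chunks and averaging each chunk (bucket mean).
function downsampleByMean(points: Point[], buckets: number = 100): Point[] {
  if (points.length <= buckets) return points;
  const size = Math.ceil(points.length / buckets);
  const result: Point[] = [];
  for (let i = 0; i < points.length; i += size) {
    const chunk = points.slice(i, i + size);
    // Average both the timestamps and the delays within the bucket.
    const avgTime = chunk.reduce((sum, p) => sum + p[0], 0) / chunk.length;
    const avgDelay = chunk.reduce((sum, p) => sum + p[1], 0) / chunk.length;
    result.push([avgTime, avgDelay]);
  }
  return result;
}
```

Is this kind of bucket averaging the right idea, or is there a more standard technique for it?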
So, what mathematical function/technique should I use to make that reduction?
Thanks a million.