I'm having an issue with a large amount of time-stamped data for work. Each data point has a value and a timestamp.
We are going to be displaying this data on a graph, but we need a way to reduce the number of points in the series because of the restriction on how many points the graph can display.
E.g. a week's time range returns approximately 5000 individual points, but the graphing component can only show up to 450 points in a series at a time (1 point per pixel).
So we need to thin out the series. We don't want to average points; we want to keep the raw values and just remove enough of them to get under the 450-point maximum.
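For reference, the naive version of what I mean (keeping every nth raw point) might look like the sketch below; the function name and the 450-point budget are just from our setup, not any particular library. The obvious drawback is that uniform thinning can skip right over the sharp peaks of a sawtooth:

```python
def decimate(points, max_points=450):
    """Thin a series by keeping every nth raw point.

    points: list of (timestamp, value) tuples, in time order.
    Raw values are preserved; intermediate samples are simply dropped.
    Caveat: this can miss the sharp peaks of a sawtooth waveform,
    which is why a shape-aware algorithm may be preferable.
    """
    if len(points) <= max_points:
        return list(points)
    step = -(-len(points) // max_points)  # ceiling division
    kept = points[::step]
    if kept[-1] != points[-1]:
        kept.append(points[-1])  # always keep the final sample
    return kept
```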
These series tend to be sawtooth waveforms.
We investigated the Douglas-Peucker algorithm, but I wasn't sure how to apply it here.
Are there any other algorithms or suggestions on how this can be accomplished?
EDIT: This is an example of a typical sawtooth waveform.