I know how to calculate a line of best fit with a set of data.
I want to be able to exponentially weight the data that is more recent so that the more recent data has a greater effect on the line.
How can I do this?
Hmm, you keep it quite general by just saying "exponential", so here is a correspondingly general answer:
Define $d_i = \text{now} - \text{time}_i$, the time difference of the $i$-th data point from "now". If $d_i$ can be zero, add one: $d_i = 1 + \text{now} - \text{time}_i$.
Then apply the concept of "weighting" to each data point: assign a weight $w_i = \exp(1/d_i)$, which is treated as the multiplicity of that data point. Unweighted, each data point enters the correlation/regression formulas with multiplicity $1$; for instance, the number of cases is just the sum $N = \sum_{i=1}^n 1$. Weighting means replacing $N$ by $W = \sum_{i=1}^n w_i$, and analogously including the weight in place of the "1" in the formulas for the mean, variance, and covariance.
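A minimal sketch of this scheme in NumPy (the function name `weighted_best_fit` and the choice of time units are my own, not from the answer above): the weighted sums translate directly into a slope/intercept computation.

```python
import numpy as np

def weighted_best_fit(times, values, now):
    """Fit a line values ~ intercept + slope*times, weighting each point
    by w_i = exp(1/d_i) with d_i = 1 + now - time_i, so recent points
    (small d_i) get larger weights."""
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)

    d = 1.0 + now - times           # time differences, shifted so d > 0
    w = np.exp(1.0 / d)             # weight = "multiplicity" of each point

    W = w.sum()                     # replaces N = sum of 1's
    mean_t = (w * times).sum() / W  # weighted means
    mean_v = (w * values).sum() / W

    # weighted covariance and variance: the "1" replaced by w_i
    cov_tv = (w * (times - mean_t) * (values - mean_v)).sum() / W
    var_t = (w * (times - mean_t) ** 2).sum() / W

    slope = cov_tv / var_t
    intercept = mean_v - slope * mean_t
    return slope, intercept

# usage: five observations, most recent at time 4
times = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
values = np.array([1.0, 2.1, 2.9, 4.2, 4.8])
print(weighted_best_fit(times, values, now=4.0))
```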
(While I was writing this, an answer from Ross crossed mine, so this may be redundant by now...)
Most linear least-squares algorithms let you set a measurement error for each point; the residual of point $i$ is then weighted by $\frac{1}{\sigma_i}$ (equivalently, its squared residual by $\frac{1}{\sigma_i^2}$). So assign smaller measurement errors to more recent points. One such algorithm is available for free in the obsolete edition of Numerical Recipes, chapter 15.
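As a sketch of this approach, here NumPy's `polyfit` stands in for the Numerical Recipes routine (its `w` argument expects $1/\sigma_i$); the exponential growth of $\sigma$ with age is an illustrative assumption, not something prescribed above.

```python
import numpy as np

# toy data: x is time, the most recent point last
rng = np.random.default_rng(0)
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# assign smaller "measurement errors" to more recent points,
# e.g. sigma growing exponentially with age (illustrative choice)
age = x.max() - x
sigma = 0.5 * np.exp(0.3 * age)   # older points -> larger sigma

# polyfit weights points by w = 1/sigma, so recent points dominate the fit
slope, intercept = np.polyfit(x, y, deg=1, w=1.0 / sigma)
print(slope, intercept)
```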