
We observe a sequence of i.i.d. random variables $X_1, X_2, \ldots$, all with the same mean $M$.

If we define $A_n$ as the average of the first $n$ observations, then, by the law of large numbers, $A_n$ converges to $M$ as $n \to \infty$.

But what if we instead keep a moving average (correct me if this is not the correct term), defined by the following recursion:

$A_1 = X_1$

$A_{n+1} = p A_n + (1-p) X_{n+1}$

Where:

$0 < p < 1$

Intuitively, it seems that $A_n$ will also converge to $M$, but I couldn't prove or disprove this.
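
If it helps to see the behaviour numerically, here is a minimal simulation sketch in Python. The choices here are purely illustrative assumptions: the $X_i$ are drawn as standard normals (so $M = 0$) and $p = 0.9$. The ordinary sample mean settles near $M$, while $A_n$ keeps fluctuating.

```python
import random

# Compare the ordinary sample mean with the recursively defined A_n
# for i.i.d. draws (standard normal here, so the true mean M is 0).
random.seed(0)
p = 0.9
n_steps = 100_000

A = None           # recursive average A_n (A_1 = X_1)
running_sum = 0.0  # accumulator for the ordinary sample mean

for n in range(1, n_steps + 1):
    x = random.gauss(0.0, 1.0)  # X_n with mean M = 0, variance 1
    running_sum += x
    A = x if A is None else p * A + (1 - p) * x
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(f"n={n:>6}  sample mean={running_sum / n:+.4f}  A_n={A:+.4f}")
```

In a typical run the sample mean column shrinks toward $0$ as $n$ grows, while the $A_n$ column keeps moving on a roughly constant scale.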

3 Answers

5

No, it's not true. $A_n = (1-p) X_n + p(1-p) X_{n-1} + \ldots + p^{n-2}(1-p) X_2 + p^{n-1} X_1$. A change of $1$ in $X_n$ will always produce a change of $1-p$ in $A_n$. So $A_n$ will not approach a constant unless all the $X_n$ are constant almost surely.
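
The stated expansion follows by unrolling the recursion (a direct induction on $n$):

$$
\begin{aligned}
A_{n} &= p A_{n-1} + (1-p) X_{n} \\
      &= p^{2} A_{n-2} + p(1-p) X_{n-1} + (1-p) X_{n} \\
      &\;\;\vdots \\
      &= p^{\,n-1} X_{1} + \sum_{k=2}^{n} p^{\,n-k}(1-p)\, X_{k},
\end{aligned}
$$

so the most recent observation always carries weight $1-p$, no matter how large $n$ is.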

  • Well... my intuition was wrong with probability 1.
2

No, this will almost surely not converge to $M$. To converge, the sequence would have to be Cauchy; but for any sufficiently small $\epsilon > 0$ and every $n$ there is a probability bounded away from zero that $|A_{n+1}-A_n| > \epsilon$, so with probability $1$ this inequality holds for infinitely many $n$, and the sequence is almost surely not Cauchy.
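
To make the size of the increments concrete, the recursion gives

$$A_{n+1} - A_n = (1-p)\,(X_{n+1} - A_n),$$

and $X_{n+1}$ is independent of $A_n$, so unless the $X_i$ are almost surely constant this increment exceeds a small $\epsilon$ with probability bounded away from zero, uniformly in $n$.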

1

If the variance of $X_i$ is $\sigma^2$, then the variance of the mean of the first $n$ terms is $\sigma^2/n$, which decreases to $0$ as $n$ increases.

For your new expression the variance of $A_n$ does not shrink to $0$: it tends to $\sigma^2(1-p)/(1+p)$, which is strictly positive whenever $\sigma^2 > 0$, so $A_n$ cannot converge to the constant $M$.
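
The limiting value follows from the variance recursion. Writing $V_n = \operatorname{Var}(A_n)$ and using the independence of $X_{n+1}$ from $A_n$,

$$
V_{n+1} = p^{2} V_n + (1-p)^{2}\sigma^{2}
\;\longrightarrow\;
\frac{(1-p)^{2}\sigma^{2}}{1-p^{2}} = \sigma^{2}\,\frac{1-p}{1+p},
$$

starting from $V_1 = \sigma^2$ and decreasing geometrically to this fixed point.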