
I would like to know whether certain identities for the arithmetic mean also hold for the exponential moving average (EMA). I can verify them for the mean, but not for the EMA. Can somebody tell me if the same basic identities hold?

$$\begin{align*} \mathrm{mean}(A_i + B_i) &= \mathrm{mean}(A_i) + \mathrm{mean}(B_i)\\ \mathrm{mean}(R \cdot A_i) &= R \cdot \mathrm{mean}(A_i) \end{align*}$$

Now, for an EMA we have the basic recurrence $$\mathrm{EMA} = S_i = S_{i-1} + \alpha \,(A_i - S_{i-1}),$$ where $\alpha$ is a weighting factor $(0 < \alpha < 1)$.

So the question is: are the following also true? $$\begin{align*} \mathrm{EMA}(A_i + B_i) &= \mathrm{EMA}(A_i) + \mathrm{EMA}(B_i)\\ \mathrm{EMA}(R \cdot A_i) &= R \cdot \mathrm{EMA}(A_i) \end{align*}$$ The second case seems to hold based on a quick analysis, since $R$ can be factored out of the recurrence completely. But for the first case I'm not so good at factoring the recursive relationship, so I don't know.
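For what it's worth, both identities can be checked numerically. Here is a quick Python sketch (the `ema` helper, the seed, and the constants are my own choices, not from the question; the average is seeded with the first element):

```python
import random

def ema(xs, alpha):
    """Exponential moving average: s_1 = x_1, then s_i = s_{i-1} + alpha*(x_i - s_{i-1})."""
    s = xs[0]
    for x in xs[1:]:
        s += alpha * (x - s)
    return s

random.seed(1)
alpha, R = 0.2, 3.7
A = [random.random() for _ in range(100)]
scaled = [R * a for a in A]

# Scaling identity: EMA(R*A) vs R*EMA(A); should agree up to floating-point rounding.
print(abs(ema(scaled, alpha) - R * ema(A, alpha)))
```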


2 Answers


Yes, you start with $ema(A_1+B_1)=ema(A_1)+ema(B_1)$. Then consider what happens when you add another term. Nothing disturbs the relation, so it continues. This can be formalized by induction.

Added: to show the induction, assume the identity is true for $n$ terms. I will use $ema(A_n)$ to mean the exponential moving average of the first $n$ terms of $A$. So the induction hypothesis is $ema((A+B)_n)=ema(A_n)+ema(B_n).$

$ema((A+B)_{n+1})=ema((A+B)_n)+\alpha((A_{n+1}+B_{n+1})-ema((A+B)_n))$

$=ema(A_n)+ema(B_n)+\alpha (A_{n+1}-ema(A_n))+\alpha (B_{n+1}-ema(B_n))=ema(A_{n+1})+ema(B_{n+1})$

where the first equality is the definition of $ema$, the second uses the induction hypothesis and the distributive law, and the third reuses the definition of $ema$.
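The induction argument can also be confirmed numerically. A minimal Python sketch (the `ema` helper, seed, and sequence lengths are my own assumptions; the average is seeded with the first element, for which the base case holds trivially):

```python
import random

def ema(xs, alpha):
    """Exponential moving average: s_1 = x_1, then s_i = s_{i-1} + alpha*(x_i - s_{i-1})."""
    s = xs[0]
    for x in xs[1:]:
        s += alpha * (x - s)
    return s

random.seed(0)
alpha = 0.3
A = [random.random() for _ in range(50)]
B = [random.random() for _ in range(50)]
AB = [a + b for a, b in zip(A, B)]

# Additivity: EMA(A+B) vs EMA(A)+EMA(B); should agree up to floating-point rounding.
print(abs(ema(AB, alpha) - (ema(A, alpha) + ema(B, alpha))))
```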


To add to the accepted answer (nailed by Ross): notice that the exponential moving average (also called a recursive average, AR(1) filter, one-pole smoother, etc.) can also be expressed as

$\displaystyle s_i = \beta \; s_{i-1} + \alpha \; a_i \;\;\; $ with $\beta = 1 - \alpha $

which is computationally equivalent (except perhaps for numerical stability), but conceptually more enlightening: it shows that the output is a weighted average of the current input and the previous output, with weights $\alpha$ and $1-\alpha$. In this form, linearity in the inputs $a_i$ is also immediate by induction.
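The equivalence of the two forms is easy to see by expanding $s_{i-1} + \alpha(a_i - s_{i-1}) = (1-\alpha)s_{i-1} + \alpha a_i$, and a short Python sketch (the function names and sample values are my own) confirms that both recurrences produce the same output:

```python
def ema_delta(xs, alpha):
    # Form from the question: s_i = s_{i-1} + alpha * (x_i - s_{i-1})
    s = xs[0]
    for x in xs[1:]:
        s = s + alpha * (x - s)
    return s

def ema_blend(xs, alpha):
    # Equivalent blended form: s_i = (1 - alpha) * s_{i-1} + alpha * x_i
    s = xs[0]
    beta = 1 - alpha
    for x in xs[1:]:
        s = beta * s + alpha * x
    return s

xs = [2.0, 4.0, 1.0, 3.0]
alpha = 0.25
print(ema_delta(xs, alpha), ema_blend(xs, alpha))
```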