I tried searching around for this but it was difficult to boil down the search terms. Plus nothing seemed to be showing up anyway.
What's an easy way to show that the average of the percentage increases of n numbers will not, in general, equal the total percentage increase of the summed before/after values? Yes, one could work out an example, but that alone doesn't make the reason apparent.
I worked out a small example to illustrate my intuition.
Initial | %   | Final
--------|-----|------
10      | 1   | 10.1
12      | 2.5 | 12.3
11      | 2   | 11.22

Initial Sum = 33
Final Sum = 33.62

Average % = (% Sum) / (# of %'s) = (1 + 2.5 + 2) / 3 = 1.833

Total % Increase = (Final Sum - Initial Sum) / (Initial Sum) * 100 = (33.62 - 33) / 33 * 100 = 1.87879
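For anyone who wants to reproduce the arithmetic, here is a minimal Python sketch (my own, not part of the original example) that recomputes both figures:

```python
# Values from the worked example above.
initial = [10, 12, 11]
percents = [1, 2.5, 2]

# Each final value is the initial value grown by its percentage.
final = [a * (1 + p / 100) for a, p in zip(initial, percents)]

# Plain average of the percentages.
simple_average = sum(percents) / len(percents)

# Total percentage increase of the summed before/after values.
total_increase = (sum(final) - sum(initial)) / sum(initial) * 100

print(simple_average)   # 1.8333...
print(total_increase)   # 1.8787...
```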
These percentages are close but not the same. My intuition tells me this is correct, but I would like to develop a proof showing it is actually true.
Any help/guidance would be greatly appreciated.
-- Edit 1 --
My question contains an incorrect premise. I shouldn't be using the average of the percentages. Rather, I should be using the weighted average.
Using Excel and this example, I was able to reproduce the total percent increase from all the individual percentages.
A proof would still be helpful in understanding the problem.
-- Edit 2 --
Given initial values $a_k$ and percentages $p_k$, the weighted average of the percentages (weighted by the initial values) would be: $ \frac{\sum_{k=1}^{n} a_k\, p_k}{\sum_{k=1}^{n} a_k} $
Hopefully that's the correct notation. As I stated in Edit 1, that was pulled from How to calculate weighted averages in Excel.
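For completeness, here is one way to see it, using the notation above (my own sketch, not from the linked Excel article). Each final value is $a_k\left(1 + \frac{p_k}{100}\right)$, so

$$\text{Total \% Increase} = \frac{\sum_{k=1}^{n} a_k\left(1 + \frac{p_k}{100}\right) - \sum_{k=1}^{n} a_k}{\sum_{k=1}^{n} a_k} \times 100 = \frac{\sum_{k=1}^{n} a_k\, p_k}{\sum_{k=1}^{n} a_k},$$

which is exactly the weighted average of the $p_k$ with weights $a_k$. The plain average $\frac{1}{n}\sum_{k=1}^{n} p_k$ uses equal weights instead, so the two agree only in special cases (e.g. all $a_k$ equal, or all $p_k$ equal). In the example, $\frac{10 \cdot 1 + 12 \cdot 2.5 + 11 \cdot 2}{33} = \frac{62}{33} \approx 1.8788$, matching the total percentage increase.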