5
$\begingroup$

From general measure theory, I think it's possible to create a measure space $(C, \mathcal{M}_\phi, m_\phi)$ where $C$ is the middle-thirds Cantor set. The measure is defined by $m_\phi(E) = m(\phi^{-1}(E))$, where $\phi$ is the bijection from the closed unit interval (via infinite binary sequences) to $C$, $m$ is the usual Lebesgue measure, and $E \subseteq C$. Also, $\mathcal{M}_\phi$ is the $\sigma$-algebra $\{E \subseteq C: \phi^{-1}(E) \in \mathcal{M}\}$, where $\mathcal{M}$ is the $\sigma$-algebra of Lebesgue measurable sets.

My question is how would you compute an integral in the following form: $\int_C x\;dm_\phi$? I already know that $m_\phi(C)=1$, but how could I go about computing the integral? Would approximation by simple functions work, and if so, what kind of functions should I work with? How would this generalize for $\int_C x^n\;dm_\phi$? Any input would be highly appreciated!

4 Answers 4

6

Definitely the probabilistic way! To wit:

The measure $m_\phi$ is the distribution of the random variable $X=\sum\limits_{n=1}^{+\infty}3^{-n}\xi_n$ where $(\xi_n)$ is i.i.d. and $\xi_n=0$ or $2$ with equal probability. Thus the fact that $E(\xi_n)=1$ implies that $ \int x\,\text{d}m_\phi(x)=\sum\limits_{n=1}^{+\infty}3^{-n}E(\xi_n)=\sum\limits_{n=1}^{+\infty}3^{-n}=\frac12. $ The same formula without infinite series: a defining property of the distribution of $X$ is the relation $3X=\xi+X'$, where $\xi=0$ or $2$ with equal probability, $X'$ is distributed like $X$, and $\xi$ and $X'$ are independent. In particular, $3E(X)=E(\xi)+E(X)$, and you are done.

Higher moments can be approached similarly. For every $n\ge1$, $3^nE(X^n)=E((\xi+X')^n)=\sum\limits_{k=0}^n{n\choose k}E(\xi^k)E(X^{n-k})$. Furthermore, $E(\xi^k)=2^{k-1}$ for every $k\ge1$, hence $ (3^n-1)E(X^n)=\sum\limits_{k=1}^{n}{n\choose k}2^{k-1}E(X^{n-k}), $ which yields the moments of $X$ recursively.
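The recursion above is easy to run with exact rational arithmetic. Here is a minimal Python sketch (the function name `cantor_moments` is my own label, not from the answer):

```python
from fractions import Fraction
from math import comb

def cantor_moments(N):
    """Moments E(X^n) of the Cantor distribution, computed via the recursion
    (3^n - 1) E(X^n) = sum_{k=1}^n C(n,k) 2^(k-1) E(X^(n-k)), with E(X^0) = 1."""
    E = [Fraction(1)]  # E(X^0) = 1
    for n in range(1, N + 1):
        s = sum(comb(n, k) * Fraction(2) ** (k - 1) * E[n - k]
                for k in range(1, n + 1))
        E.append(s / (3 ** n - 1))
    return E

E = cantor_moments(3)
# E[1] = 1/2, E[2] = 3/8, E[3] = 5/16
```

The first moment reproduces $E(X)=\frac12$ from above, and the next two agree with the values derived in the other answers.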

Or, one can center everything at the onset, using $3\bar X=\bar\xi+\bar X'$, with $\bar X=X-E(X)=X-\frac12$ and $\bar\xi=\xi-E(\xi)=\xi-1$. Some computations become quite simple because $\bar\xi$ is symmetric, hence $\bar X$ is symmetric as well and all its odd moments are zero. Furthermore, $\bar\xi=\pm1$ almost surely, hence $E(\bar\xi^{2k})=1$ and $E(\bar\xi^{2k+1})=0$. For example, the recursion for the moments of $X$ yields as a recursion for the moments of $\bar X$ the equations $ (3^n-1)E(\bar X^n)=\sum\limits_{k\ge1}{n\choose 2k}E(\bar X^{n-2k})\,[2k\le n]. $ For the first even values of $n$, one gets that $(3^2-1)E(\bar X^2)=1$, $(3^4-1)E(\bar X^4)=6E(\bar X^2)+1$ and $(3^6-1)E(\bar X^6)=15E(\bar X^4)+15E(\bar X^2)+1$, hence $ E(\bar X^2)=1/8,\quad E(\bar X^4)=7/320,\quad E(\bar X^6)=205/46592. $
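The centered recursion can be checked the same way; a short Python sketch (the function name `centered_moments` is my own label):

```python
from fractions import Fraction
from math import comb

def centered_moments(N):
    """Central moments E(Xbar^n) of the Cantor distribution via
    (3^n - 1) E(Xbar^n) = sum_{k>=1, 2k<=n} C(n,2k) E(Xbar^(n-2k))."""
    M = [Fraction(1)]  # E(Xbar^0) = 1
    for n in range(1, N + 1):
        if n % 2:
            M.append(Fraction(0))  # odd central moments vanish by symmetry
        else:
            s = sum(comb(n, 2 * k) * M[n - 2 * k] for k in range(1, n // 2 + 1))
            M.append(s / (3 ** n - 1))
    return M

M = centered_moments(6)
# M[2] = 1/8, M[4] = 7/320, M[6] = 205/46592
```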

4

If I understand you correctly, the measure of the lower third is $1/2$ and that of the upper third is $1/2$; the measure of the lower third of the lower third is half that of the lower third, and so on.

That is a probability measure, and the integral is the expected value---just an average value. By symmetry, the average is $1/2$. I don't think it would be hard to make the symmetry argument logically precise (e.g., starting with the substitution $u = 1-x$).

See this section on moments of the Cantor distribution: http://en.wikipedia.org/wiki/Cantor_distribution#Moments

To generalize to higher moments, I think this might be relevant: http://en.wikipedia.org/wiki/Law_of_total_cumulance This appeared in:

David Brillinger, "The calculation of cumulants via conditioning", Annals of the Institute of Statistical Mathematics, Vol. 21 (1969), pp. 215–218.

It is a natural generalization of the law of total expectation and the law of total variance.

This article may be enlightening if you don't already know this stuff: http://en.wikipedia.org/wiki/Cumulant

It's easier to work with cumulants and then find moments than vice versa, for reasons that article may make clear.

Later note: Let's be a bit more "probabilistic" in the way the substitution suggested above is viewed. Use capital $X$ for the random variable whose probability distribution is the measure proposed here. Let $U=1-X$. Then $E(U) = 1 - E(X)$, but since $X$ and $U$ have the same distribution, we also have $E(U) = E(X)$; hence $E(X) = \frac12$.
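This can also be checked numerically by sampling $X=\sum_n 3^{-n}\xi_n$ directly; a Monte Carlo sketch in Python (the function name, sample count, and truncation depth are my own choices):

```python
import random

def sample_cantor(depth=40):
    """Draw X = sum_{n=1}^depth 3^(-n) xi_n with xi_n in {0, 2} equally likely,
    i.e. a sample (truncated at `depth` ternary digits) from the Cantor distribution."""
    return sum(random.choice((0, 2)) * 3.0 ** (-n) for n in range(1, depth + 1))

random.seed(0)
xs = [sample_cantor() for _ in range(100_000)]
mean = sum(xs) / len(xs)  # should be close to E(X) = 1/2
```

With $10^5$ samples the standard error is about $\sqrt{1/8}/\sqrt{10^5}\approx 0.0011$, so the empirical mean lands very close to $1/2$.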

3

Getting more concrete: It's proved in one of the cited Wikipedia articles that the variance of the Cantor distribution is $1/8$. Recall that the expected value of the square is the sum of two terms: the square of the expected value, and the variance. The square of the expected value is $(1/2)^2 = 1/4$. The variance is $1/8$. Hence $\int_C x^2\;dm_\phi = 1/4 + 1/8 = 3/8.$

By symmetry, the third central moment is 0, so we have $ \begin{align} 0 & = \int_C \left(x - \frac12\right)^3\;dm_\phi = \int_C x^3 - \frac32 x^2 + \frac34x - \frac18 \;dm_\phi \\ & = \int_C x^3\;dm_\phi -\frac32\cdot\frac38 + \frac34\cdot\frac12 - \frac18 = \int_C x^3\;dm_\phi -\frac{5}{16}. \end{align} $
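The arithmetic above can be replayed with exact fractions; a tiny Python check (variable names are mine):

```python
from fractions import Fraction as F

# Known low moments of the Cantor distribution.
EX, EX2 = F(1, 2), F(3, 8)

# Vanishing third central moment gives:
# E(X^3) = (3/2) E(X^2) - (3/4) E(X) + 1/8
EX3 = F(3, 2) * EX2 - F(3, 4) * EX + F(1, 8)
# EX3 = 5/16
```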

To apply the law of total cumulance to find the 4th moment $\int_C x^4\;dm_\phi$ might be a lot more work than that. One would need to list all 15 partitions of a set of four members, and that's barely the first step.

3

And finally, I found a published paper on this very question:

http://www.sciencedirect.com/science/article/pii/0167715292900398

  • 1
    Too bad I can't download it for free! (2011-08-16)