Can you please help verify whether what I have done is correct for the question below? I applied Chebyshev's theorem in the first step, but I am worried that there may be mathematical errors or misapplied theorems in my solution. Thanks for the help.
Convergence in probability of $S_n/\sigma$
probability
statistics
- $\ni^2$ is definitely a new one for me... – 2011-10-17
- All of the $\ni$'s should probably have been $\varepsilon$. – 2011-10-17
- I don't know why you have $-1$ in the numerator after $\operatorname{var}(S_n)$. All you really need to know about $\operatorname{var}(S_n)$ is that it is finite. That can be deduced from the fact that $E(X^4)<\infty$. You seem to have done everything else. – 2011-10-18
- This application of Chebyshev doesn't quite work, since the mean of $S_n/\sigma$ is not equal to 1. You should follow Mike's lead below and consider $S_n^2/\sigma^2$ instead. Then you can use the result in http://math.stackexchange.com/questions/72975/variance-of-sample-variance/73080#73080 to conclude. – 2011-10-18
- Is $\operatorname{var}(S_n) = \sigma^2/\sqrt{n}$? Otherwise I don't know how to make the limit go to 0. – 2011-10-18
- @Byron Why does the mean of $S_n/\sigma$ not equal 1? Isn't $E(S_n) = \sigma$? – 2011-10-18
- $S_n^2$ is an unbiased estimator for $\sigma^2$; $S_n$ is *not* an unbiased estimator for $\sigma$. By the way, you can make the variance go to zero using http://math.stackexchange.com/questions/72975/variance-of-sample-variance/73080#73080 – 2011-10-18
1 Answer
One way to do this is to show that $\frac{S_n^2}{\sigma^2}$ converges in probability to $1$ using the same method you tried (invoking Chebyshev's Inequality), and then to note (or prove, if you didn't already know it) that if $X_1, X_2, \ldots$ converge in probability to $X$ and $g$ is a continuous function, then $g(X_1), g(X_2), \ldots$ converge in probability to $g(X)$.
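For concreteness, here is a minimal sketch of that two-step argument, under the assumption (implicit in the comments above) that the $X_i$ are i.i.d. with $E(X_1^4) < \infty$, so that $\operatorname{var}(S_n^2)$ is finite and tends to $0$ by the linked result on the variance of the sample variance. Since $E(S_n^2) = \sigma^2$, the ratio $S_n^2/\sigma^2$ has mean $1$, and Chebyshev's Inequality gives, for any $\varepsilon > 0$,

$$P\left(\left|\frac{S_n^2}{\sigma^2} - 1\right| \ge \varepsilon\right) \le \frac{\operatorname{var}(S_n^2)}{\sigma^4 \varepsilon^2} \longrightarrow 0 \quad \text{as } n \to \infty,$$

so $S_n^2/\sigma^2 \to 1$ in probability. Applying the continuity step with $g(x) = \sqrt{x}$, which is continuous on $[0,\infty)$, then gives $S_n/\sigma = g\left(S_n^2/\sigma^2\right) \to g(1) = 1$ in probability, which is the desired conclusion.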