The title says it. I know that if the limiting variable $Y$ is constant a.s. (so that $\mathbb{P}(Y=c)=1$), then convergence in probability is equivalent to convergence in law, i.e. $Y_n\overset{\mathbb{P}}{\longrightarrow}c \iff Y_n\overset{\mathcal{D}}{\longrightarrow}c$, and Slutsky's theorem then asserts that $X_n\cdot Y_n\overset{\mathcal{D}}{\longrightarrow}X\cdot c$. But what about the case when $Y$ is not constant? Does $X_n\overset{\mathcal{D}}{\longrightarrow}X$, $Y_n\overset{\mathbb{P}}{\longrightarrow}Y$ imply $X_n\cdot Y_n\overset{\mathcal{D}}{\longrightarrow}X\cdot Y$?
I would appreciate any hints.
$X_n\overset{\mathcal{D}}{\rightarrow}X$, $Y_n\overset{\mathbb{P}}{\rightarrow}Y \implies X_n\cdot Y_n\overset{\mathcal{D}}{\rightarrow}X\cdot Y\ ?$
probability-theory
convergence-divergence
- @dtldarek How can I see this? – 2012-10-21
1 Answer
Let $Y$ represent a fair coin with sides valued $0$ (zero) and $1$ (one). Set $Y_n = Y$, $X = Y$, and $X_n = 1-Y$. The premises are fulfilled: $Y_n\overset{\mathbb{P}}{\to}Y$ trivially, and $X_n\overset{\mathcal{D}}{\to}X$ because $1-Y$ has the same distribution as $Y$. However, $X_n\cdot Y_n = (1-Y)\,Y = 0$ almost surely, while $X\cdot Y = Y^2 = Y$, so $X_n\cdot Y_n\overset{\mathcal{D}}{\nrightarrow}X\cdot Y$.
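A quick numerical sanity check of this counterexample (a minimal sketch, not part of the original answer; it assumes NumPy, and the variable names simply mirror the notation above):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Y is a fair 0/1 coin; for every n we take Y_n = Y, X = Y, X_n = 1 - Y.
Y = rng.integers(0, 2, size=n_samples)
Y_n = Y          # Y_n -> Y in probability (trivially, Y_n = Y)
X = Y
X_n = 1 - Y      # same distribution as X, so X_n -> X in law

# The marginal laws of X_n and X agree: both are Bernoulli(1/2).
print("P(X_n = 1) approx", X_n.mean(), "   P(X = 1) approx", X.mean())

# But the laws of the products differ:
print("P(X_n * Y_n = 1) approx", (X_n * Y_n).mean())  # exactly 0, since (1 - Y) * Y = 0
print("P(X * Y = 1)     approx", (X * Y).mean())      # approx 0.5, since Y * Y = Y
```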
- +1 for the economical example. Using $\pm1$ Bernoulli random variables, instead of $0$–$1$ ones, might be even more spectacular. – 2012-10-21
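One way to realize the $\pm1$ variant suggested in that comment (spelled out here for concreteness, not taken from the thread): let $Y$ take the values $\pm1$ with probability $1/2$ each, and set $Y_n = Y$, $X = Y$, $X_n = -Y$. Then $X_n$ has the same distribution as $X$, so the premises hold, yet $X_n\cdot Y_n = -Y^2 = -1$ while $X\cdot Y = Y^2 = 1$, i.e. the product sequence is the constant $-1$ and its purported limit is the constant $+1$.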