Prove the statement: if $X_n \to X$ in distribution and $Y_n \to c$ in probability, then $X_n Y_n \to cX$ in distribution.
Notice that $X_n Y_n = (Y_n - c)X_n + cX_n$. Now define $W_n = (Y_n - c)X_n$ and $Z_n = cX_n$. We then have $W_n + Z_n \to W + Z$, where $W = 0$ and $Z = cX$.
We need to prove that $W_n \to 0$ in probability (equivalently, in distribution, since the limit is a constant).
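(For comparison, a standard direct route that avoids characteristic functions uses tightness of $(X_n)$, which follows from $X_n \to X$ in distribution: for every $\delta > 0$ there is an $M$ with $\sup_n P(|X_n| > M) < \delta$. A sketch:

```latex
\begin{align*}
P(|(Y_n - c)X_n| > \varepsilon)
  &\le P(|X_n| > M) + P\big(|Y_n - c| > \varepsilon / M\big)\\
  &\le \delta + P\big(|Y_n - c| > \varepsilon / M\big)
   \xrightarrow[n\to\infty]{} \delta,
\end{align*}
```

and since $\delta > 0$ was arbitrary, $P(|W_n| > \varepsilon) \to 0$. I mention this only as a sanity check on what the characteristic-function argument below should deliver.)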
Using characteristic functions ($f_{W_n}$), we need to prove that $f_{W_n} \to f_W$, where $f_W(t) = E[e^{it \cdot 0}] = 1$.
By the law of total expectation, $E[e^{it(Y_n - c)X_n}] = E\big[E[e^{it(Y_n - c)X_n}\mid X_n]\big]$. Since for each fixed value $x_n$ the function $g(j) = e^{it(j-c)x_n}$ is continuous and bounded (assuming $|X_n| < \infty$), we get \begin{align} E[g(Y_n)\mid X_n] &\to E[g(Y)\mid X] = 1\\ E\big[E[e^{it(Y_n - c)X_n}\mid X_n]\big] &\to E[1] = 1 \end{align} where the conditioning is with respect to the filtration $\sigma(X) = \sigma(X_1,\dots,X_n,\dots)$.
And the result follows.
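(As an aside, the claimed limit is easy to sanity-check numerically. Below is a minimal Monte Carlo sketch, under the hypothetical concrete choice $X_n = X \sim N(0,1)$ and $Y_n = c + N(0, 1/n)$, so $Y_n \to c$ in probability; it compares the empirical CDF of $X_n Y_n$ with that of $cX$ at a few points.)

```python
import random

# Empirical sanity check of Slutsky's theorem: X_n -> X in distribution
# and Y_n -> c in probability should give X_n * Y_n -> c * X in distribution.
# Hypothetical concrete choice: X ~ N(0,1) exactly, Y_n = c + N(0, 1/n).
random.seed(0)
c = 2.0
n = 10_000          # index of the sequence (large, so Y_n is close to c)
samples = 50_000    # Monte Carlo sample size

prod, ref = [], []
for _ in range(samples):
    x = random.gauss(0.0, 1.0)
    y = c + random.gauss(0.0, 1.0 / n ** 0.5)   # sd = 1/sqrt(n)
    prod.append(x * y)   # sample of X_n * Y_n
    ref.append(c * x)    # sample of c * X (coupled to the same X)

def ecdf(data, t):
    """Empirical CDF of `data` evaluated at t."""
    return sum(1 for v in data if v <= t) / len(data)

# The two empirical CDFs should nearly coincide at every test point.
for t in (-2.0, 0.0, 2.0):
    print(t, round(ecdf(prod, t), 3), round(ecdf(ref, t), 3))
```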
I'm not sure if I can use the law of total expectation that way. I have the feeling that the convergence $E[g(Y_n)\mid X_n]\to E[g(Y)\mid X]=1$ is "pointwise" for $x_n = X_n(\alpha)$, and I don't know if I can just treat that value as known so that I can use $g$ to prove convergence. Did I do something wrong? And if I did, any suggestions?