If $(X_i)$ is a sequence of $\mathbb{R}^m$-valued random variables that converges either in probability or almost surely to $X$, and if $f$ is some measurable function from $\mathbb{R}^m$ into $\mathbb{R}$, does it follow that $f(X_i)$ converges to $f(X)$ in the same mode?
I know about the continuous mapping theorem, which requires $f$ to be continuous (at least almost everywhere with respect to the distribution of $X$). Is there an analogous result for arbitrary measurable functions?
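To make the question concrete, here is a minimal sketch (my own example, with a deterministic sequence and an indicator function of my choosing) showing why mere measurability seems too weak: a measurable $f$ that is discontinuous at the limit point can fail to preserve convergence.

```python
# Deterministic sanity check: X_i = 1/i converges to X = 0 almost surely
# (and hence in probability), but a measurable f discontinuous at the
# limit point 0 breaks the convergence of f(X_i) to f(X).

def f(x):
    # indicator of (0, inf): measurable, but discontinuous at 0
    return 1.0 if x > 0 else 0.0

seq = [1.0 / i for i in range(1, 11)]  # X_i = 1/i -> 0
x_limit = 0.0

print([f(x) for x in seq])  # every term equals 1.0
print(f(x_limit))           # 0.0, so f(X_i) does not converge to f(X)
```

So any positive result presumably needs some continuity condition on $f$ tied to where the limit $X$ lives.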
New Question: As suggested in the answer below, let $\mathcal{C}$ be the set of functions $f$ such that $f(X_i)$ converges in probability to $f(X)$ whenever $X_i$ converges in probability to $X$. What are the defining properties of this set $\mathcal{C}$?