
Suppose $X_n$ and $Y_n$ are sequences of random variables, defined on a common probability space, such that $X_n-Y_n$ converges in distribution to the constant random variable $C\equiv c$ as $n \to \infty$ (and also $Y_n \to Y$). Does it then hold, without any further assumptions on the random variables, that $X_n \to C+Y$ in distribution?

If not, what are the minimum requirements on $X_n$ and $Y_n$ for this to be true?

What can be said about the case where $C$ is replaced by an arbitrary random variable?

  • Do we have $Y_n\to Y$ _in distribution_? (2012-11-11)

2 Answers

4

Lemma 1. If $\{A_n\}$ is a sequence of random variables that converges in distribution to a constant $c$, then it also converges in probability to $c$.

Fix $\varepsilon>0$ and let $f$ be the piecewise linear function with $f(x)=1$ if $|x-c|\leqslant\varepsilon$, $f(x)=0$ if $|x-c|\geqslant 2\varepsilon$, and linear in between. It is a bounded continuous function, so $$\int f(A_n)\,dP\to f(c)=1.$$ Since $0\leqslant f\leqslant 1$ and $f$ vanishes outside $\{|x-c|<2\varepsilon\}$, we have $f(A_n)\leqslant\mathbf 1_{\{|A_n-c|<2\varepsilon\}}$, hence $$P(|A_n-c|\geqslant 2\varepsilon)\leqslant 1-\int f(A_n)\,dP\to 0.$$ As $\varepsilon>0$ was arbitrary, this proves convergence in probability.
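For readers who like to see this numerically, here is a minimal Monte Carlo sketch (my own illustration, not part of the lemma; the choice $A_n = c + Z/\sqrt{n}$ with $Z$ standard normal is hypothetical): the estimated $P(|A_n - c| > \varepsilon)$ shrinks as $n$ grows.

```python
import numpy as np

# Hypothetical example: A_n = c + Z/sqrt(n) converges in distribution to c.
# Lemma 1 predicts P(|A_n - c| > eps) -> 0, estimated here by Monte Carlo.
rng = np.random.default_rng(0)
c, eps, trials = 2.0, 0.1, 100_000

for n in [10, 100, 1_000, 10_000]:
    A_n = c + rng.standard_normal(trials) / np.sqrt(n)
    print(n, np.mean(np.abs(A_n - c) > eps))  # estimated P(|A_n - c| > eps)
```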

Lemma 2. If $\{X_n\}$ converges in distribution to $X$ and $\{Y_n\}$ converges in probability to a constant $c$, then $\{X_n+Y_n\}$ converges in distribution to $X+c$.

Indeed, by the portmanteau theorem it is enough to check that $\int f(X_n+Y_n)\,dP\to \int f(X+c)\,dP$ for every bounded, uniformly continuous $f$. Fix $\varepsilon>0$ and let $\delta>0$ be as in the definition of uniform continuity of $f$. Splitting on the event $\{|Y_n-c|\geqslant\delta\}$ gives $$\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant 2\sup |f|\cdot P(|Y_n-c|\geqslant \delta)+\varepsilon+\left|\int (f(X_n+c)-f(X+c))\,dP\right|.$$ As $f(c+\cdot)$ is continuous and bounded, we get $$\limsup_{n\to +\infty}\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant \varepsilon,$$ proving convergence in law of $\{X_n+Y_n\}$ to $X+c$.
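Again as an illustration only (the distributions below are my own hypothetical choices, not part of the proof): take $X_n \sim N(0, 1+1/n) \to X \sim N(0,1)$ in distribution and $Y_n = c + U_n/n \to c$ in probability; the Kolmogorov–Smirnov distance between the empirical law of $X_n + Y_n$ and the limiting $N(c,1)$ law should shrink with $n$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
c, trials = 3.0, 100_000

for n in [10, 100, 1_000]:
    X_n = rng.normal(0.0, np.sqrt(1 + 1 / n), trials)  # -> N(0, 1) in distribution
    Y_n = c + rng.uniform(-1, 1, trials) / n           # -> c in probability
    # KS distance between the sample of X_n + Y_n and the limit N(c, 1)
    ks = stats.kstest(X_n + Y_n, "norm", args=(c, 1.0)).statistic
    print(n, ks)
```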

  • Just to put everything together: this solves the question in the constant case, because we have $Z_n := X_n-Y_n \to c$ in distribution and hence, by Lemma 1, also in probability. Furthermore, $Y_n \to Y$ in distribution, so we can apply Lemma 2 to conclude that $Z_n+Y_n = X_n \to Y + c$ in distribution. (2012-11-11)
  • Why are you assuming $f$ to be uniformly continuous in Lemma 2? The portmanteau theorem only speaks of continuous and bounded functions. (2016-10-27)
  • Continuous and bounded is the definition; in almost all versions of the portmanteau theorem, one reduces the convergence to uniformly continuous and bounded functions. (2016-10-27)
  • Lemma 2 is often credited to [Slutsky](https://en.wikipedia.org/wiki/Slutsky%27s_theorem). (2016-10-27)
3

EDIT: My previous claim that this is true when $C$ is constant wasn't completely thought out.

If $C$ is not constant, this is false. Basically, we can replace $C$ by any random variable with the same distribution without affecting the statement $X_n - Y_n \to C$ in distribution, but this may change the distribution of $C+Y$. For example, let $Y,Z$ be i.i.d. with any nonconstant distribution (coin flips will work), and set $X_n = 0$, $Y_n = Y$, $C = -Z$. Then $X_n - Y_n = -Y \to C$ in distribution (since $-Y$ and $-Z$ have the same law), but $C+Y = Y-Z$ has a nondegenerate distribution, so $X_n \to C+Y$ fails.
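A quick simulation of this counterexample (coin flips encoded as Bernoulli(1/2), one concrete choice): $-Y$ and $C=-Z$ have the same law, yet $C+Y = Y-Z$ puts mass on $-1, 0, 1$, so the constant sequence $X_n = 0$ cannot converge to it in distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 100_000
Y = rng.integers(0, 2, trials)  # i.i.d. coin flips
Z = rng.integers(0, 2, trials)

# X_n - Y_n = -Y and C = -Z have (up to sampling error) the same distribution:
print(np.bincount(Y) / trials, np.bincount(Z) / trials)

# ...but C + Y = Y - Z is nondegenerate, spreading mass over {-1, 0, 1}:
vals, counts = np.unique(Y - Z, return_counts=True)
print(dict(zip(vals.tolist(), (counts / trials).tolist())))
```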

  • Can you detail the first part? Maybe the OP didn't manage to see it. (2012-11-11)
  • @DavideGiraudo: In fact, what I had in mind doesn't work. But your deleted answer can be fixed. Lemma 1 is correct, and Lemma 2 works when $Y_n \to C$ in probability. You can bound $\int |f(X_n + Y_n) - f(X_n + C)|\,dP$ by the same argument you used, and $\int f(X_n + C)\,dP \to \int f(X+C)\,dP$ by the weak convergence of $X_n$. Then apply it with $X_n = Y_n$ and $Y_n = X_n - Y_n$ (you might want to adjust the notation). (2012-11-11)
  • I understand the argument above for why the statement fails when $C$ is non-constant. For the constant case, I guess the answer is on the way. @DavideGiraudo Indeed, the assumption is that $Y_n\to Y$ in distribution. (2012-11-11)