Lemma 1. If $\{A_n\}$ is a sequence of random variables that converges in distribution to a constant $c$, then it also converges in probability to $c$.
Fix $\varepsilon>0$ and let $f$ be the piecewise linear function with $f(x)=1$ if $|x-c|\leqslant\varepsilon$, $f(x)=0$ if $|x-c|\geqslant 2\varepsilon$, and $f$ linear in between. Since $f$ is bounded and continuous and $A_n\to c$ in distribution, $\int f(A_n)\,dP\to f(c)=1$. As $f\leqslant\mathbf{1}_{\{|x-c|<2\varepsilon\}}$, we have $P(|A_n-c|<2\varepsilon)\geqslant\int f(A_n)\,dP$, hence $P(|A_n-c|\geqslant 2\varepsilon)\leqslant 1-\int f(A_n)\,dP\to 0.$ Since $\varepsilon>0$ was arbitrary, this proves convergence in probability.
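As a quick numerical sanity check of Lemma 1 (a sketch, with an assumed example sequence $A_n=c+Z/\sqrt{n}$, $Z$ standard normal, which converges in distribution to $c$), one can estimate $P(|A_n-c|>\varepsilon)$ by simulation and watch it shrink:

```python
import random

random.seed(0)

def prob_deviation(n, c=2.0, eps=0.1, trials=10_000):
    """Empirical P(|A_n - c| > eps) for A_n = c + Z/sqrt(n), Z ~ N(0,1).

    A_n converges in distribution to the constant c, so Lemma 1 predicts
    this probability tends to 0 as n grows."""
    count = 0
    for _ in range(trials):
        a_n = c + random.gauss(0.0, 1.0) / n ** 0.5
        if abs(a_n - c) > eps:
            count += 1
    return count / trials

# The estimates should decrease toward 0 as n increases.
print([prob_deviation(n) for n in (1, 10, 100, 1000)])
```

Here the choice of $A_n$ is only illustrative; any sequence converging in distribution to a constant would do.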
Lemma 2. If $\{X_n\}$ converges in distribution to $X$ and $\{Y_n\}$ converges in probability to a constant $c$, then $\{X_n+Y_n\}$ converges in distribution to $X+c$.
Indeed, by the portmanteau theorem it is enough to check that $\int f(X_n+Y_n)\,dP\to \int f(X+c)\,dP$ for all $f$ uniformly continuous and bounded. Given $\varepsilon>0$, pick $\delta>0$ as in the definition of uniform continuity; splitting on the event $\{|Y_n-c|\geqslant\delta\}$ gives $\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant 2\sup |f|\cdot P(|Y_n-c|\geqslant \delta)+\varepsilon+\left|\int \bigl(f(X_n+c)-f(X+c)\bigr)\,dP\right|.$ The first term vanishes because $Y_n\to c$ in probability, and the last term vanishes because $f(\cdot+c)$ is continuous and bounded and $X_n\to X$ in distribution. Hence $\limsup_{n\to +\infty}\left|\int f(X_n+Y_n)\,dP-\int f(X+c)\,dP\right|\leqslant \varepsilon,$ and since $\varepsilon$ is arbitrary this proves convergence in law of $\{X_n+Y_n\}$ to $X+c$.
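Lemma 2 can likewise be checked empirically (a sketch under assumed choices: $X_n$ a standardized mean of $n$ uniforms, so $X_n\to N(0,1)$ by the CLT, and $Y_n=c+U/n$ with $U$ uniform, so $Y_n\to c$ in probability). The empirical CDF of $X_n+Y_n$ should then be close to that of $N(c,1)$:

```python
import math
import random

random.seed(1)

def sample_sums(n, c=3.0, trials=20_000):
    """Draw samples of X_n + Y_n, where X_n is a standardized mean of n
    uniforms (CLT: X_n -> N(0,1) in distribution) and Y_n = c + U/n
    (Y_n -> c in probability). Lemma 2 predicts X_n + Y_n -> N(c, 1)."""
    out = []
    for _ in range(trials):
        u = [random.random() for _ in range(n)]
        x_n = (sum(u) - n * 0.5) / math.sqrt(n / 12.0)
        y_n = c + random.random() / n
        out.append(x_n + y_n)
    return out

def ecdf_at(xs, t):
    """Empirical CDF of the sample xs evaluated at t."""
    return sum(1 for x in xs if x <= t) / len(xs)

samples = sample_sums(200)
# The empirical CDF at c should be near Phi(0) = 0.5,
# and at c + 1.96 near Phi(1.96) ~ 0.975.
print(ecdf_at(samples, 3.0), ecdf_at(samples, 3.0 + 1.96))
```

The particular sequences are assumptions made for the illustration; the lemma applies to any $X_n$, $Y_n$ with the stated modes of convergence.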