For the evaluation of $\text{Var}(Y_n)$, refer to my comment under the question or to the other answer.
I write this answer to prove that $Y_n \overset{p}{\longrightarrow} E(Y_n)$ under the sole hypothesis that $E(X_i^2)$ is finite. The standard proof via Chebyshev's inequality uses the fact that $\text{Var}(X_1^2) < \infty$, which need not hold in general and is not guaranteed by the given hypothesis. So the following is a proof using characteristic functions.
By Taylor's theorem for complex-valued functions, the characteristic function of any random variable $V$ with finite mean $\mu$ can be written as:
$\varphi_V(t) = 1 + it\mu + o(t), \quad t \rightarrow 0.$
Since the $X_i$ are i.i.d., the variables $X_1^2, X_2^2, \cdots$ all have the same characteristic function, which we denote simply by $\varphi_V$.
Among the basic properties of characteristic functions are:
$\varphi_{\frac 1 n X}(t)= \varphi_X(\tfrac t n) \quad \text{and} \quad
\varphi_{X+Y}(t)=\varphi_X(t)\, \varphi_Y(t) \quad $ if $X$ and $Y$ are independent.
These rules can be used to compute the characteristic function of $Y_n = \frac{1}{n}\sum_{i=1}^n X_i^2$ in terms of $\varphi_V$:
$\varphi_{Y_n}(t)= \left[\varphi_V\left({t \over n}\right)\right]^n = \left[1 + i\mu{t \over n} + o\left({t \over n}\right)\right]^n \, \rightarrow \, e^{it\mu}, \quad \text{as} \quad n \rightarrow \infty.$
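As a quick numerical sanity check of this limit (not part of the proof), one can evaluate $\left[\varphi_V(t/n)\right]^n$ for a concrete distribution. The choice $V \sim \text{Exp}(1)$, with $\varphi_V(t) = 1/(1-it)$ and mean $\mu = 1$, is an assumption made purely for illustration:

```python
import cmath

def phi_V(t):
    # Characteristic function of the Exp(1) distribution (illustrative choice)
    return 1 / (1 - 1j * t)

t, mu = 2.0, 1.0                  # fixed t; mu = E(V) = 1 for Exp(1)
limit = cmath.exp(1j * t * mu)    # the claimed limit e^{i t mu}

errs = []
for n in [10, 1_000, 100_000]:
    val = phi_V(t / n) ** n       # [phi_V(t/n)]^n
    errs.append(abs(val - limit))
    print(n, errs[-1])
```

The printed errors shrink toward 0 as $n$ grows, consistent with $\left[\varphi_V(t/n)\right]^n \to e^{it\mu}$.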
The limit $e^{it\mu}$ is the characteristic function of the constant random variable $\mu$, and hence, by Lévy's continuity theorem, $Y_n$ converges in distribution to $\mu$.
Since $\mu$ is a constant, convergence in distribution to $\mu$ and convergence in probability to $\mu$ are equivalent.
Therefore, $Y_n \overset{p}{\longrightarrow} E(Y_n)$ whenever $E(X_i^2)$ is finite.
And $E(Y_n) = E(X_1^2) = \text{Var}(X_1) + \left(E(X_1)\right)^2.$
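To illustrate the result numerically (a sketch, not part of the argument), one can take the $X_i$ i.i.d. Student-$t$ with 3 degrees of freedom: then $E(X_1^2) = 3$ is finite but $E(X_1^4) = \infty$, so $\text{Var}(X_1^2) = \infty$ and Chebyshev's inequality is unavailable, yet $Y_n$ should still concentrate around 3. The distribution, sample sizes, and threshold $0.5$ are assumptions chosen for this demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 3.0   # E(X_1^2) = 3/(3-2) for the t-distribution with 3 degrees of freedom

fracs = []
for n in [100, 10_000, 100_000]:
    # Empirical estimate of P(|Y_n - mu| > 0.5) over 300 replications of Y_n
    Y = np.array([np.mean(rng.standard_t(df=3, size=n) ** 2)
                  for _ in range(300)])
    fracs.append(np.mean(np.abs(Y - mu) > 0.5))
    print(n, fracs[-1])
```

The estimated exceedance probability decreases as $n$ grows, in line with $Y_n \overset{p}{\longrightarrow} 3$.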