By the tower property, $E[X_{2}] = E[\,E[X_{2} | X_{1}]\,]$, so I would start by computing $E[X_{2} | X_{1} = x_{1}]$: $E[X_{2}|X_{1} = x_{1}] = \sum_{x_{2}=0}^{x_{1}}x_{2}\cdot{}P(X_{2}=x_{2}|X_{1}=x_{1}) = \sum_{x_{2}=0}^{x_{1}}x_{2}\cdot{}\frac{P(X_{2}=x_{2} \cap X_{1}=x_{1})}{P(X_{1}=x_{1})} $
$ = \sum_{x_{2}=0}^{x_{1}}x_{2}\cdot{}\frac{\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr)}{\displaystyle\sum_{x_{2}=0}^{x_{1}}\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr)}.$
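Incidentally, this ratio simplifies: the $\frac{x_{1}}{15}$ factor cancels between numerator and denominator, and the remaining binomial probabilities sum to one over $x_{2}$, so the conditional PMF is just the Binomial$\bigl(x_{1}, \frac{1}{2}\bigr)$ PMF, $ P(X_{2}=x_{2}|X_{1}=x_{1}) = \binom{x_{1}}{x_{2}}\bigl(\frac{1}{2}\bigr)^{x_{1}}, \quad x_{2}=0,1,\dots,x_{1}, $ and hence $E[X_{2}|X_{1}=x_{1}] = \frac{x_{1}}{2}$.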
Next, write down the marginal probability mass function of $X_{1}$, obtained by summing the joint PMF (the numerator above) over $x_{2}$:
$ P(X_{1} = x_{1}) = \sum_{x_{2}=0}^{x_{1}}\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr).$
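By the same sum-to-one observation, this marginal collapses to $ P(X_{1}=x_{1}) = \frac{x_{1}}{15}\sum_{x_{2}=0}^{x_{1}}\binom{x_{1}}{x_{2}}\bigl(\frac{1}{2}\bigr)^{x_{1}} = \frac{x_{1}}{15}, \quad x_{1}=1,\dots,5. $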
Now you can compute $E[X_{2}]$ by averaging $E[X_{2}|X_{1}=x_{1}]$ over the PMF of $X_{1}$: $ E[X_{2}] = \sum_{x_{1}=1}^{5}\Biggl[\sum_{x_{2}=0}^{x_{1}}x_{2}\cdot{}\frac{\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr)}{\displaystyle\sum_{x_{2}=0}^{x_{1}}\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr)}\cdot{}\sum_{x_{2}=0}^{x_{1}}\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr)\Biggr] $
$ = \sum_{x_{1}=1}^{5}\sum_{x_{2}=0}^{x_{1}}x_{2}\cdot{}\binom{x_{1}}{x_{2}} \bigl(\frac{1}{2}\bigr)^{x_{1}}\bigl( \frac{x_{1}}{15}\bigr),$
which is just the usual double-sum formula for $E[X_{2}]$ taken directly over the joint PMF, as expected.
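If you do want a closed form by hand, the two simplifications above make this a one-line computation: with $E[X_{2}|X_{1}=x_{1}] = \frac{x_{1}}{2}$ and $P(X_{1}=x_{1}) = \frac{x_{1}}{15}$, $ E[X_{2}] = \sum_{x_{1}=1}^{5}\frac{x_{1}}{2}\cdot{}\frac{x_{1}}{15} = \frac{1}{30}\sum_{x_{1}=1}^{5}x_{1}^{2} = \frac{55}{30} = \frac{11}{6}. $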
If you would rather not simplify by hand, or just want to double-check the sums, I would use SymPy, Maple, or Mathematica for a closed form, or NumPy or MATLAB for a numerical answer.
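For example, here is a minimal SymPy sketch of the double sum, using the joint PMF and the ranges ($x_{1}\in\{1,\dots,5\}$, $0 \le x_{2} \le x_{1}$) from the formulas above:

```python
from sympy import Rational, binomial

# Joint PMF from the problem:
#   P(X1 = x1, X2 = x2) = C(x1, x2) * (1/2)^x1 * (x1/15)
#   for x1 in {1, ..., 5} and x2 in {0, ..., x1}.
def joint_pmf(x1, x2):
    return binomial(x1, x2) * Rational(1, 2) ** x1 * Rational(x1, 15)

# E[X2] as the double sum over the joint PMF, kept exact with Rational.
E_X2 = sum(
    x2 * joint_pmf(x1, x2)
    for x1 in range(1, 6)
    for x2 in range(0, x1 + 1)
)

print(E_X2, float(E_X2))  # exact fraction and its decimal value
```

Keeping everything in `Rational` gives the exact fraction; the same double loop with plain floats gives the numerical value directly.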