Let's first compute the moment generating function from the p.m.f. (the second equality substitutes $y \mapsto y + x$ in the inner sum): $ \begin{eqnarray} M(t_1, t_2) &=& \mathbb{E}( \exp( t_1 X + t_2 Y) ) = \sum_{x=0}^\infty \sum_{y=x}^\infty \mathrm{e}^{-2} \frac{\exp(t_1 x)}{x!} \frac{\exp(t_2 y)}{(y-x)!} \\ &=& \sum_{x=0}^\infty \sum_{y=0}^\infty \mathrm{e}^{-2} \frac{\exp(t_1 x)}{x!} \frac{\exp(t_2 (y+x))}{y!} = M_{\mathrm{Po}(1)}(t_1+t_2) \, M_{\mathrm{Po}(1)}(t_2) \end{eqnarray} $ where $M_{\mathrm{Po}(1)}(t) = \exp\left( \mathrm{e}^t - 1 \right)$ is the m.g.f. of a Poisson random variable with unit mean. The factorization of $M(t_1, t_2)$ means that the random vector in question has the same distribution as $(X, Y) = (Z_1, Z_1+Z_2)$, where $Z_1$ and $Z_2$ are i.i.d. Poisson random variables with unit mean.
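Before using this representation, here is a quick Monte Carlo sanity check in Python (a sketch, not part of the derivation; the sample size and the test point $(t_1, t_2)$ are arbitrary choices) comparing the empirical m.g.f. of $(Z_1, Z_1+Z_2)$ with the factorized closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# (X, Y) = (Z1, Z1 + Z2) with Z1, Z2 i.i.d. Poisson(1)
z1 = rng.poisson(1.0, n)
z2 = rng.poisson(1.0, n)
x, y = z1, z1 + z2

# Empirical joint m.g.f. vs. the factorized form M_Po(t1+t2) * M_Po(t2)
t1, t2 = 0.3, -0.2
M_po = lambda t: np.exp(np.exp(t) - 1.0)
print(np.exp(t1 * x + t2 * y).mean())   # empirical value
print(M_po(t1 + t2) * M_po(t2))         # closed form; the two should agree closely
```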
Thus $\mathbb{E}(X Y) = \mathbb{E}\left( Z_1(Z_1+Z_2) \right) = \mathbb{E}\left( Z_1^2 \right) + \mathbb{E}\left( Z_1 \right) \mathbb{E}\left( Z_2 \right) = (1^2 + 1) + 1 \times 1 = 3$, using $\mathbb{E}(Z_1^2) = \operatorname{Var}(Z_1) + \left(\mathbb{E} Z_1\right)^2 = 1 + 1$. The same result follows from the moment generating function:
$ \begin{eqnarray} \mathbb{E}(X Y) &=& \left. \frac{\partial^2}{\partial t_1 \partial t_2} M(t_1, t_2) \right\vert_{t_1=0, t_2=0} = \left. \left( M(t_1,t_2)\, \mathrm{e}^{t_1+t_2} \left( 1 + \mathrm{e}^{t_2} + \mathrm{e}^{t_1+t_2} \right)\right)\right\vert_{t_1=0, t_2=0} \\&=& 1 \times 1 \times (1+1+1 ) = 3 \end{eqnarray} $
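For the skeptical reader, the mixed partial derivative can also be checked symbolically, e.g. with sympy (a verification sketch, not part of the argument):

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')

# Factorized joint m.g.f.: M_Po(t1 + t2) * M_Po(t2), with M_Po(t) = exp(e^t - 1)
M = sp.exp(sp.exp(t1 + t2) - 1) * sp.exp(sp.exp(t2) - 1)

# E(XY) is the mixed second partial derivative evaluated at the origin
EXY = sp.diff(M, t1, t2).subs({t1: 0, t2: 0})
print(sp.simplify(EXY))  # prints 3
```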
Now, let's turn to $\mathbb{E}\left( X \vert Y=y \right)$. Computing directly (the terms with $x > y$ vanish, since the p.m.f. is zero there): $ \mathbb{E}(X \vert Y=y) = \frac{ \sum_{x=0}^\infty x \frac{1}{\mathrm{e}^2} \frac{1}{x!} \frac{1}{(y-x)!}}{\sum_{x=0}^\infty \frac{1}{\mathrm{e}^2} \frac{1}{x!} \frac{1}{(y-x)!}} = \frac{ \sum_{x=0}^y x \frac{1}{\mathrm{e}^2} \frac{1}{x!} \frac{1}{(y-x)!}}{\sum_{x=0}^y \frac{1}{\mathrm{e}^2} \frac{1}{x!} \frac{1}{(y-x)!}} $ Now, changing variables $x \mapsto y-x$ in the numerator gives $ \mathbb{E}(X \vert Y=y) = y - \mathbb{E}(X \vert Y=y) \qquad \implies \qquad \mathbb{E}(X \vert Y=y) = \frac{y}{2} $ To get the same result using the m.g.f., note that for a nonnegative integer-valued random variable, $M(\log t)$ is its probability generating function; for the Poisson distribution with unit mean, indeed: $ M_{\mathrm{Po}(1)}(\log t) = \exp( t -1 ) = \sum_{x=0}^\infty \frac{\mathrm{e}^{-1}}{x!} t^x $ It then follows that, writing $[t_2^y]$ for the coefficient of $t_2^y$, $ \begin{eqnarray} \mathbb{E}(X \vert Y=y) &=& \frac{ [t_2^y] \left. \frac{\partial}{\partial t_1} M\left( t_1, \log t_2 \right) \right\vert_{t_1=0}}{ [t_2^y]\, M\left(0, \log t_2 \right) } = \frac{[t_2^y] \left. \frac{\partial}{\partial t_1} \exp\left( \mathrm{e}^{t_1} t_2 + t_2 - 2 \right) \right\vert_{t_1=0}}{ [t_2^y]\, M\left(0, \log t_2 \right) } \\ &=& \frac{[t_2^y] \left. t_2 \exp\left( t_2 \mathrm{e}^{t_1} + t_2 + t_1 - 2 \right) \right\vert_{t_1=0}}{ [t_2^y] \exp(2(t_2-1)) } = \frac{[t_2^y] \left( t_2 \exp\left( 2 (t_2 - 1) \right) \right)}{ [t_2^y] \exp(2(t_2-1)) } \\ &=& \frac{ 2^{y-1} \mathrm{e}^{-2} /(y-1)! }{ 2^{y} \mathrm{e}^{-2} /y!} = \frac{y}{2} \end{eqnarray} $
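The coefficient extraction in the last display can likewise be reproduced with sympy, expanding both generating functions as Taylor series in $t_2$ for a few small values of $y$ (a sketch; the truncation order is chosen just large enough to reach the $t_2^y$ term):

```python
import sympy as sp

t2 = sp.symbols('t2')
num = t2 * sp.exp(2 * (t2 - 1))   # generating function in the numerator
den = sp.exp(2 * (t2 - 1))        # denominator, i.e. M(0, log t2), the p.g.f. of Y

for y in range(1, 6):
    c_num = num.series(t2, 0, y + 1).removeO().coeff(t2, y)
    c_den = den.series(t2, 0, y + 1).removeO().coeff(t2, y)
    print(y, sp.simplify(c_num / c_den))  # prints y/2 each time
```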
Added: As suggested by Didier Piau, the conditional expectation also follows by symmetry, using the representation of $(X, Y)$ in terms of $Z_1$ and $Z_2$. Indeed:
$ \mathbb{E}(X \vert Y=y) = \mathbb{E}(Z_1 \vert Y=y) \stackrel{\text{symmetry}}{=} \frac{1}{2} \left( \mathbb{E}(Z_1 \vert Y=y) + \mathbb{E}(Z_2 \vert Y=y) \right) = \frac{1}{2} \mathbb{E}(Z_1 +Z_2 \vert Y=y) = \frac{y}{2} $ where the symmetry refers to the fact that, $Z_1$ and $Z_2$ being i.i.d., the following holds: $\mathbb{E}(Z_1 \vert Y=y) = \mathbb{E}(Z_1 \vert Z_1+Z_2 = y) = \mathbb{E}(Z_2 \vert Z_1+Z_2 = y) = \mathbb{E}(Z_2 \vert Y=y),$ while the last step uses $\mathbb{E}(Z_1+Z_2 \vert Y=y) = \mathbb{E}(Y \vert Y=y) = y$.
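A minimal simulation also illustrates this symmetry directly: conditionally on $Z_1 + Z_2 = y$, the empirical means of $Z_1$ and $Z_2$ coincide, both close to $y/2$ (a sketch; the sample size and the range of $y$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
z1 = rng.poisson(1.0, 1_000_000)
z2 = rng.poisson(1.0, 1_000_000)
s = z1 + z2  # this is Y = Z1 + Z2

for y in range(1, 6):
    m = (s == y)
    # both conditional means should be close to y/2
    print(y, z1[m].mean(), z2[m].mean())
```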