As the last sentence in André's answer suggests, it is easier to find the joint
distribution of $X$ and $Y$.
Suppose that we are given that $Z = n$. Then the conditional distribution
of $X$ is given to be $\text{Binomial}(n,p)$, while the conditional distribution
of $Y = Z-X = n-X$ can be deduced to be $\text{Binomial}(n,1-p)$. Think of
$X$ as the number of successes and $Y$ as the number of failures
in $n$ independent trials, where the probability of success is $p$. Notice
also that $X$ and $Y$ are not conditionally independent random variables
given that $Z = n$; they are very much dependent random variables since
$X+Y=n$. But we do have that for $k, \ell \geq 0$,
$$p_{X,Y\mid Z=n}(k,\ell\mid Z=n) = P\{X=k, Y=\ell\mid Z=n\}
= \begin{cases} \binom{n}{k}p^k(1-p)^{n-k}, & 0\leq k\leq n,\ \ell = n-k,\\
0, & \text{otherwise.}\end{cases}$$
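As a quick sanity check (illustrative only, not part of the original answer), we can tabulate this conditional joint pmf for a small $n$ and confirm that all of the mass sits on the diagonal $\ell = n-k$ and sums to $1$; the function name `cond_joint_pmf` and the values $n=5$, $p=0.3$ are arbitrary choices for the sketch.

```python
from math import comb

def cond_joint_pmf(k, l, n, p):
    """Conditional joint pmf of (X, Y) given Z = n from the display above."""
    if 0 <= k <= n and l == n - k:
        return comb(n, k) * p**k * (1 - p)**(n - k)
    return 0.0  # zero off the diagonal l = n - k

n, p = 5, 0.3
# Sum over the whole grid; only the diagonal l = n - k contributes.
total = sum(cond_joint_pmf(k, l, n, p)
            for k in range(n + 1) for l in range(n + 1))
print(total)
```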
Consequently, the unconditional joint probability mass function can be found
via the law of total probability: for $k, \ell \geq 0$,
$$\begin{align*}
p_{X,Y}(k,\ell)
&= \sum_{n=0}^\infty p_{X,Y\mid Z=n}(k,\ell\mid Z=n)P\{Z=n\}\\
&= \binom{k+\ell}{k}p^k(1-p)^{\ell}\cdot \exp(-\lambda)\frac{\lambda^{k+\ell}}{(k+\ell)!}\\
&= \frac{(k+\ell)!}{k!\,\ell!}(\lambda p)^k(\lambda(1-p))^{\ell}\cdot \exp(-\lambda p)\exp(-\lambda(1-p))\cdot\frac{1}{(k+\ell)!}\\
&= \exp(-\lambda p)\frac{(\lambda p)^k}{k!}
\cdot\exp(-\lambda(1-p))\frac{(\lambda(1-p))^{\ell}}{\ell!},
\end{align*}$$
where in the second line only the term $n = k+\ell$ survives, since the conditional pmf vanishes unless $\ell = n-k$.
Since the joint pmf factors into a function of $k$ alone times a function of
$\ell$ alone, $X$ and $Y$ are independent random variables; in fact, they are
independent Poisson random variables with parameters $\lambda p$ and
$\lambda(1-p)$, respectively.
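The splitting result can also be checked empirically (a sketch, not from the original answer): sample $Z \sim \text{Poisson}(\lambda)$, thin it into $X \sim \text{Binomial}(Z,p)$ and $Y = Z - X$, and compare the empirical means and covariance with $\lambda p$, $\lambda(1-p)$, and $0$. The helper names `sample_poisson` and `split_sample`, and the values $\lambda = 10$, $p = 0.3$, are arbitrary choices for the illustration; the Poisson sampler is Knuth's multiplicative method, used here only to stay within the standard library.

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method: count uniform draws until their product drops below e^{-lam}."""
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

def split_sample(lam, p, rng):
    """Draw Z ~ Poisson(lam), then X ~ Binomial(Z, p) by direct Bernoulli trials."""
    z = sample_poisson(lam, rng)
    x = sum(1 for _ in range(z) if rng.random() < p)
    return x, z - x  # (X, Y)

rng = random.Random(0)
lam, p, trials = 10.0, 0.3, 50_000
xs, ys = zip(*(split_sample(lam, p, rng) for _ in range(trials)))

mean_x = sum(xs) / trials  # should be near lam * p = 3.0
mean_y = sum(ys) / trials  # should be near lam * (1 - p) = 7.0
cov_xy = sum(x * y for x, y in zip(xs, ys)) / trials - mean_x * mean_y  # near 0
print(mean_x, mean_y, cov_xy)
```

The near-zero sample covariance is consistent with (though of course weaker than) the full independence established by the factorization above.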