Assuming that $X$ and $Y$ are independent, you can use the following standard result:
Let $X_1$ and $X_2$ be discrete, independent random variables with ${\rm dist}(X_1)={\rm B}(n,p)$ and ${\rm dist}(X_2)={\rm B}(m,p)$, where ${\rm B}(n,p)$ denotes the Binomial distribution with $n$ trials and success probability $p$.
By independence, the total number of successes in the $n$ trials of $X_1$ and the $m$ trials of $X_2$ should, intuitively, be a binomial variable with parameters $n+m$ and $p$. We now show that this is indeed the case.
Let $Y=X_1+X_2$. We will find the probability mass function of $Y$. Since $Y$ is the total number of successes in $n$ trials of $X_1$ and $m$ trials of $X_2$, the random variable $Y$ takes the values $0$, $1$, $\ldots\,$, $n+m$. Using the Convolution Theorem, for $0\le k\le n+m$, we have:
$ \eqalign{ p_Y(k)&=\sum_{i=0}^kP[X_1=i,X_2=k-i]\cr &=\sum_{i=0}^kP[X_1=i]\cdot P[X_2=k-i]\cr &=\sum_{i=0}^k{n\choose i}p^i(1-p)^{n-i}\cdot{m\choose k-i}p^{k-i}(1-p)^{m-(k-i)}\cr &=p^{k}(1-p)^{m+n-k}\sum_{i=0}^k{n\choose i}{m\choose k-i}\cr &={m+n\choose k}p^{k}(1-p)^{m+n-k}. } $ Here we use the convention that ${n\choose i}=0$ for $i>n$, so the terms with $i>n$ or $k-i>m$ vanish. Thus, ${\rm dist}(X_1+X_2)={\rm B}(n+m,p)$.
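As a sanity check (not part of the argument above), the convolution can be verified numerically for some concrete parameters; the helper name `binom_pmf` is my own:

```python
from math import comb

def binom_pmf(n, p, k):
    """P[X = k] for X ~ B(n, p); comb(n, k) is 0 when k > n."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 5, 7, 0.3

# Convolution: p_Y(k) = sum_{i=0}^{k} P[X1 = i] * P[X2 = k - i]
conv = [sum(binom_pmf(n, p, i) * binom_pmf(m, p, k - i) for i in range(k + 1))
        for k in range(n + m + 1)]

# Direct B(n + m, p) pmf for comparison
direct = [binom_pmf(n + m, p, k) for k in range(n + m + 1)]

assert all(abs(a - b) < 1e-12 for a, b in zip(conv, direct))
```

Since `math.comb` returns $0$ when the lower index exceeds the upper, the out-of-range terms drop out automatically, mirroring the convention in the derivation.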
In the above, we used the following:
Lemma (Vandermonde's identity)
For any positive integers $n$, $m$ and any integer $0\le k\le n+m$: $ \sum_{i=0}^{k} {n\choose i}{m\choose k-i} = {n+m\choose k}. $
Proof: Apply the Binomial Theorem to the equality
$ (1+x)^n(1+x)^m=(1+x)^{n+m} $ to obtain $\tag{1} \sum_{i=0}^n{n\choose i}x^{n-i}\cdot\sum_{j=0}^m{m\choose j}x^{m-j}=\sum_{k=0}^{n+m}{n+m\choose k}x^{n+m-k}. $ But $ \eqalign{ \sum_{i=0}^n{n\choose i}x^{n-i}\cdot\sum_{j=0}^m{m\choose j}x^{m-j} &=\sum_{i=0}^n{n\choose i}\Bigl[\sum_{j=0}^m{m\choose j}x^{m-j}\Bigr]x^{n-i}\cr &=\sum_{i=0}^n\sum_{j=0}^m{n\choose i}{m\choose j}x^{n+m-(i+j)}.\cr } $ Now, a term of the form $x^{n+m-k}$ in this double sum arises exactly when $i+j=k$, that is, when $j=k-i$ with $0\le i\le k$ (terms with $i>n$ or $k-i>m$ vanish, since their binomial coefficients are zero). Thus, the coefficient of $x^{n+m-k}$ on the left hand side of equation $(1)$ is $ \sum_{i=0}^k{n\choose i}{m\choose k-i}. $ Since the coefficient of $x^{n+m-k}$ on the right hand side of equation $(1)$ is $ {n+m\choose k}, $ we have $ \sum_{i=0}^k{n\choose i}{m\choose k-i}={n+m\choose k}, $ as desired.
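The identity is also easy to confirm by brute force over small parameters; this snippet (mine, not part of the proof) does so with `math.comb`:

```python
from math import comb

# Brute-force check of the lemma for small n, m. math.comb returns 0 when
# the lower index exceeds the upper, matching the convention in the proof.
for n in range(1, 8):
    for m in range(1, 8):
        for k in range(n + m + 1):
            lhs = sum(comb(n, i) * comb(m, k - i) for i in range(k + 1))
            assert lhs == comb(n + m, k)
```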
Convolution Theorem:
The probability mass function of the sum of two independent discrete variables is the convolution of their probability mass functions:
Let $X_1$ and $X_2$ be independent, discrete random variables that take nonnegative integer values, with respective probability mass functions $p_{X_1}$ and $p_{X_2}$. Let $Y=X_1+X_2$. Then for each integer $k\ge 0$: $ p_Y(k)=\sum_{i=0}^{k}p_{X_1}(i)\,p_{X_2}(k-i). $
The sum appearing on the right hand side of the above equality is called the convolution of $p_{X_1}$ and $p_{X_2}$.
Proof: Exercise.
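To illustrate the theorem with a distribution other than the binomial (the dice example is mine, not from the text above), here is the convolution for the sum of two independent fair dice, computed with exact fractions:

```python
from fractions import Fraction

# pmf of a fair six-sided die: value -> probability
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(p1, p2):
    """pmf of X1 + X2 for independent X1 ~ p1, X2 ~ p2 (dicts value -> prob)."""
    out = {}
    for i, a in p1.items():
        for j, b in p2.items():
            out[i + j] = out.get(i + j, Fraction(0)) + a * b
    return out

total = convolve(die, die)
assert total[7] == Fraction(6, 36)   # six of the 36 equally likely pairs sum to 7
assert sum(total.values()) == 1      # the convolution is again a pmf
```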