
I ran into the sum $$\sum_{j=0}^{r} \frac{1}{j!\,(r-j)!} = \frac{2^r}{r!}$$ while calculating the marginal pmf of a joint distribution. I wasn't sure how to evaluate it, so I plugged it into Wolfram expecting something ugly. Fortunately, there is this nice closed form, but I don't know how to arrive at the result analytically.

Edit: I just realized that I could prove this by induction. But that's rather unsatisfying, since I wouldn't have arrived at the idea to prove this without Wolfram.

  • Could you give the original bivariate distribution that has given you this marginal? (2017-02-20)
  • That is just the binomial theorem in disguise: $$\sum_{j=0}^{r}\binom{r}{j}=2^r$$ (2017-02-20)
  • @JeanMarie: Others have already solved this. But in case you're curious, the original joint mass function was $f_{X,Y}(j,k) = \frac{c\,(j+k)\,a^{j+k}}{j!\,k!}, \quad j,k \geq 0,$ where $c = \frac{e^{-2a}}{2a}$ and $a$ is a constant. (Source: Grimmett, *Probability and Random Processes*, Section 3.6, part of problem 8.) (2017-02-20)
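As a quick numerical sanity check (not part of the original discussion), the stated joint mass function does sum to $1$; here is a sketch with the arbitrary choice $a = 1.0$ and a truncated double sum:

```python
from math import exp, factorial

a = 1.0                      # hypothetical choice; any a > 0 works
c = exp(-2 * a) / (2 * a)

# Truncate the double sum over j, k >= 0; the terms decay factorially,
# so 50 terms in each index is far more than enough for a = 1.
total = sum(
    c * (j + k) * a ** (j + k) / (factorial(j) * factorial(k))
    for j in range(50)
    for k in range(50)
)
print(total)  # ~ 1.0
```

This matches the analytic computation: grouping terms by $n = j + k$ and using the identity in the answer below is exactly how the sum $\sum_n n\,(2a)^n/n! = 2a\,e^{2a}$ appears, which $c$ then cancels.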

1 Answer


$$\sum_{j=0}^r \frac{r!}{j!(r-j)!} = \sum_{j=0}^r \binom{r}{j} 1^j\, 1^{r-j} = (1+1)^r = 2^r$$ by the binomial theorem. Now divide both sides by $r!$ to get $\sum_{j=0}^r \frac{1}{j!(r-j)!} = \frac{2^r}{r!}$.
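For a quick sanity check (not a substitute for the proof), a few lines of Python confirm the identity numerically for small $r$:

```python
from math import factorial

def lhs(r):
    # sum_{j=0}^{r} 1 / (j! (r-j)!)
    return sum(1 / (factorial(j) * factorial(r - j)) for j in range(r + 1))

def rhs(r):
    # closed form 2^r / r!
    return 2 ** r / factorial(r)

for r in range(10):
    assert abs(lhs(r) - rhs(r)) < 1e-12
print("identity verified for r = 0..9")
```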