I am trying to follow equation (1.13) in MacKay's *Information Theory, Inference, and Learning Algorithms* (http://www.inference.phy.cam.ac.uk/itprnn/book.pdf). It is:
$$ \ln \binom{N}{r} = \ln \frac{N!}{(N-r)! r!} \approx (N-r) \ln \frac{N}{N-r} + r \ln \frac{N}{r} $$
I am using the approximation in equation (1.12):
$$ \ln x! \approx x \ln x - x + \frac12 \ln 2 \pi x$$
Thus, if I expand the middle expression of equation (1.13) using (1.12), I should get nine terms:
$$ \ln \frac{N!}{(N-r)! r!} = \ln N! - \ln (N-r)! - \ln r! $$ $$ \approx N \ln N - N + \frac12 \ln 2 \pi N - (N-r) \ln (N-r) + (N-r) - \frac12 \ln 2\pi (N-r) - r \ln r + r - \frac12 \ln 2 \pi r$$
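Collecting these, the three linear terms cancel and the three $\frac12 \ln$ terms combine into one:
$$ -N + (N-r) + r = 0, \qquad \frac12 \ln 2\pi N - \frac12 \ln 2\pi (N-r) - \frac12 \ln 2\pi r = \frac12 \ln \frac{N}{2\pi r (N-r)}, $$
so the nine terms collapse to
$$ \ln \binom{N}{r} \approx N \ln N - (N-r) \ln (N-r) - r \ln r + \frac12 \ln \frac{N}{2\pi r (N-r)}. $$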
Now I am stuck. Dropping the $\frac12 \ln 2\pi x$ terms seems to be valid for large factorials (cf. equation (37) at http://mathworld.wolfram.com/BinomialDistribution.html), but that still leaves me stuck. The exact problem is manipulating $N \ln N - (N-r) \ln (N-r) - r \ln r$ into the form MacKay has in equation (1.13).
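As a numerical sanity check (my own script, not from the book), I compared the exact $\ln \binom{N}{r}$ against the right-hand side of (1.13); for large $N$ the relative error is small, which at least confirms the approximation itself:

```python
from math import comb, log

# Compare exact ln C(N, r) with MacKay's approximation (1.13):
#   (N - r) * ln(N / (N - r)) + r * ln(N / r)
N, r = 1000, 300
exact = log(comb(N, r))                               # exact log-binomial
approx = (N - r) * log(N / (N - r)) + r * log(N / r)  # equation (1.13)
print(f"exact  = {exact:.3f}")
print(f"approx = {approx:.3f}")  # slightly overshoots: the 1/2 ln(2*pi*x) terms were dropped
```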
Thanks.