
I am trying to follow equation (1.13) in MacKay's Information Theory textbook (http://www.inference.phy.cam.ac.uk/itprnn/book.pdf). It is:

$ \ln \binom{N}{r} = \ln \frac{N!}{(N-r)! r!} \approx (N-r) \ln \frac{N}{N-r} + r \ln \frac{N}{r} $

I am using the approximation in equation (1.12):

$ \ln x! \approx x \ln x - x + \frac12 \ln 2 \pi x$
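As a quick sanity check of (1.12), here is a minimal Python sketch (the values of $x$ are arbitrary illustrative choices, not from MacKay's text):

```python
# Sanity check of the approximation ln(x!) ~ x ln x - x + (1/2) ln(2*pi*x).
# The test values of x are arbitrary.
import math

for x in (10, 100, 1000):
    exact = math.lgamma(x + 1)  # ln(x!) via the log-gamma function
    approx = x * math.log(x) - x + 0.5 * math.log(2 * math.pi * x)
    print(f"x={x}: exact={exact:.4f}, approx={approx:.4f}, diff={exact - approx:.2e}")
```

The difference shrinks as $x$ grows, which is why dropping the remaining higher-order corrections is harmless here.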

Thus, if I expand the expression in the middle of equation (1.13), I should get 9 terms:

$ \ln \frac{N!}{(N-r)! r!} = \ln N! - \ln (N-r)! - \ln r! $
$ \approx N \ln N - N + \frac12 \ln 2 \pi N - (N-r) \ln (N-r) + (N-r) - \frac12 \ln 2\pi (N-r) - r \ln r + r - \frac12 \ln 2 \pi r $

Now I am stuck. If I drop the $\frac12 \ln 2\pi(\cdot)$ terms (which seems to be valid for large factorials; see http://mathworld.wolfram.com/BinomialDistribution.html, equation (37)), I am still stuck. The exact problem is manipulating $N \ln N - (N-r) \ln (N-r) - r \ln r$ into the form MacKay has in equation (1.13).
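For what it's worth, a quick Python check with arbitrarily chosen $N$ and $r$ confirms that the two sides of (1.13) really are close, so the remaining difficulty should be purely algebraic:

```python
# Compare ln C(N, r) computed exactly against MacKay's approximation (1.13).
# N and r are arbitrary illustrative values.
import math

N, r = 1000, 100
exact = math.lgamma(N + 1) - math.lgamma(N - r + 1) - math.lgamma(r + 1)  # ln of the binomial coefficient
mackay = (N - r) * math.log(N / (N - r)) + r * math.log(N / r)            # right-hand side of (1.13)
print(exact, mackay)  # roughly 321.9 vs 325.1 for these values
```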

Thanks.

1 Answer


Your target approximation can be expanded:

$(N-r) \ln \frac{N}{N-r} + r \ln \frac{N}{r} = (N-r) \ln N - (N-r) \ln (N-r) + r \ln N - r \ln r$
$= N\ln N - r\ln N + r \ln N - (N-r) \ln (N-r) - r \ln r$
$= N \ln N - (N-r) \ln (N-r) - r \ln r,$

while your long approximation can be rewritten as $N \ln N - (N-r) \ln (N-r) - r \ln r + \frac12 \ln \left(\frac{N}{2 \pi (N-r)r} \right)$ (the $-N$, $+(N-r)$, and $+r$ terms cancel, and the three $\frac12 \ln 2\pi(\cdot)$ terms combine into the single logarithm),

and the last term of this is much smaller than the others if $N$ is not small and $r$ is not close to $0$ or $N$, so the two are close to each other.
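A quick numerical illustration (in Python, with $N$ and $r$ chosen arbitrarily) of how the leftover term compares with the leading ones:

```python
# Compare the leading terms N ln N - (N-r) ln(N-r) - r ln r with the leftover
# correction (1/2) ln( N / (2*pi*(N-r)*r) ).  N and r are arbitrary.
import math

N, r = 1000, 100
leading = N * math.log(N) - (N - r) * math.log(N - r) - r * math.log(r)
leftover = 0.5 * math.log(N / (2 * math.pi * (N - r) * r))
print(leading, leftover)  # about 325.1 versus -3.2 for these values
```

For these values the correction is around 1% of the leading terms, and its relative size only shrinks as $N$ grows with $r/N$ fixed away from $0$ and $1$.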