I'm reading The Elements of Statistical Learning. I have a question about the curse of dimensionality.
In Section 2.5, p. 22, the authors write:
Consider $N$ data points uniformly distributed in a $p$-dimensional unit ball centered at the origin. Suppose we consider a nearest-neighbor estimate at the origin. The median distance from the origin to the closest data point is given by the expression:
$$d(p,N) = \left(1-\frac{1}{2^{1/N}} \right)^{1/p}.$$
For $N=500$, $p=10$, $d(p,N)\approx0.52$, more than halfway to the boundary. Hence most data points are closer to the boundary of the sample space than to any other data point.
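For context, here is how I convinced myself of the equation itself (a sketch, assuming the distance $R$ of a single uniform point from the origin has CDF $P(R \le r) = r^p$ on $[0,1]$). For the minimum over $N$ independent points,
$$P\!\left(\min_i R_i > r\right) = \prod_{i=1}^{N} P(R_i > r) = (1 - r^p)^N,$$
and setting this equal to $\tfrac12$ gives the median nearest-neighbor distance:
$$(1 - d^p)^N = \tfrac12 \;\Longrightarrow\; d(p,N) = \left(1 - \left(\tfrac12\right)^{1/N}\right)^{1/p}.$$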
I accept the equation. My question is: how do we deduce this conclusion?
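For what it's worth, a quick Monte Carlo check (a minimal sketch; the helper names `median_formula` and `simulate_median` are just ones I made up) agrees with the formula and the $\approx 0.52$ value:

```python
import numpy as np

def median_formula(p, N):
    """Closed-form median distance from the origin to the nearest of N
    points uniformly distributed in the unit p-ball (the book's d(p, N))."""
    return (1.0 - 0.5 ** (1.0 / N)) ** (1.0 / p)

def simulate_median(p, N, trials=2000, seed=0):
    """Monte Carlo estimate of the same median nearest-neighbor distance."""
    rng = np.random.default_rng(seed)
    nearest = np.empty(trials)
    for t in range(trials):
        # Uniform sampling in the unit p-ball: random direction times U^(1/p) radius.
        g = rng.standard_normal((N, p))
        directions = g / np.linalg.norm(g, axis=1, keepdims=True)
        radii = rng.uniform(size=(N, 1)) ** (1.0 / p)
        points = directions * radii
        nearest[t] = np.linalg.norm(points, axis=1).min()
    return np.median(nearest)

print(median_formula(10, 500))   # ~0.518
print(simulate_median(10, 500))  # close to the closed-form value
```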