Note: A similar question was previously asked in Prove that there exists a nearest point in a closed set $A \subset \mathbf{R}^n$ to a point outside of $A$, where it was answered with a wonderfully simple proof.
This question is to clarify a similar but alternate proof of the same theorem in "The Mathematics of Nonlinear Programming" by Peressini, Sullivan, and Uhl.
Theorem: If $C$ is a closed (convex or not) subset of $\mathbb{R}^n$ and if $y \in \mathbb{R}^n$ does not belong to $C$, then there is a vector $x^* \in C$ that is closest to $y$; that is, $\forall x \in C$: $$\left\Vert y - x^* \right\Vert \leq \left\Vert y - x\right\Vert$$

The first half of the proof is stated as follows:
Let $\alpha$ be the largest number such that $\forall x \in C, \ \alpha \leq \left\Vert y -x \right\Vert$. Then there is a sequence $\{x^{(k)}\}$ of elements of $C$ such that: $$\alpha = \lim_k \left\Vert y - x^{(k)} \right\Vert$$
Comment: $\alpha$ is just the infimum of the distance from $y$ to $C$, i.e. $\alpha = \inf_{x \in C} \left\Vert y - x \right\Vert$.
Question: How do we get the existence of the sequence $\{x^{(k)}\}$?
I understand that $C$ being closed means the limit of any convergent sequence in $C$ is also in $C$, but I don't see how that is relevant to producing this sequence. I also know that the distance function $x \mapsto \left\Vert y - x \right\Vert$ for fixed $y$ is continuous, and I suspect that is the key, but the book never mentions this property; it is aimed at undergraduates with little experience in analysis.
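My current attempt at justifying this step (my own reasoning, not from the book): by the definition of the infimum, for every $\varepsilon > 0$ there must be some $x \in C$ with $\left\Vert y - x \right\Vert < \alpha + \varepsilon$, since otherwise $\alpha + \varepsilon$ would be a lower bound larger than $\alpha$. Taking $\varepsilon = 1/k$ for $k = 1, 2, 3, \dots$ picks out an $x^{(k)} \in C$ with $$\alpha \leq \left\Vert y - x^{(k)} \right\Vert < \alpha + \frac{1}{k},$$ and letting $k \to \infty$, the squeeze theorem gives $$\alpha = \lim_k \left\Vert y - x^{(k)} \right\Vert.$$ Is this the intended argument, i.e. does this step use only the definition of the infimum, with closedness and continuity needed only later?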