Here is a simple explanation that does not rely purely on linear algebra. We have
$$\|A\|_2=\max_{\|x\|=1}\|Ax\|,$$
where $\|\cdot\|$ is the ordinary Euclidean norm. Since $A$ is symmetric, $\|Ax\|^2=x'A'Ax=x'A^2x$, so this is a constrained optimisation problem with Lagrange function
$$L(x,\lambda)=\|Ax\|^2-\lambda(\|x\|^2-1)=x'A^2x-\lambda(x'x-1).$$
Here I squared the objective and the constraint, which changes nothing but makes the next step easier. Taking the derivative with respect to $x$ and setting it to zero (the common factor of $2$ cancels) gives
$$A^2x-\lambda x=0,$$
so any solution is an eigenvector of $A^2$ with eigenvalue $\lambda$. Since $A^2$ is symmetric, all its eigenvalues are real (and non-negative). At such a stationary point the objective equals $x'A^2x=\lambda x'x=\lambda$, so $x'A^2x$ attains its maximum on the set $\|x\|^2=1$ at the eigenvector corresponding to the largest eigenvalue of $A^2$, and the maximum is that eigenvalue. Now, since $A$ is symmetric, it admits the representation
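This intermediate claim, that the constrained maximum of $x'A^2x$ equals the largest eigenvalue of $A^2$, is easy to check numerically. Below is a small sanity check in Python/NumPy; the matrix, seed, and tolerance are arbitrary choices for illustration, not part of the argument.

```python
import numpy as np

rng = np.random.default_rng(0)            # arbitrary seed
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                         # a symmetric test matrix

A2 = A @ A                                # A^2 is symmetric as well
eigvals, eigvecs = np.linalg.eigh(A2)     # eigh sorts eigenvalues ascending
x_star = eigvecs[:, -1]                   # eigenvector of the largest eigenvalue

top = x_star @ A2 @ x_star                # quadratic form at that eigenvector
assert np.isclose(top, eigvals[-1])       # it equals the largest eigenvalue

# No random unit vector does better:
for _ in range(1000):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)                # project onto the unit sphere
    assert x @ A2 @ x <= top + 1e-10
```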
$$A=Q\Lambda Q'$$
with $Q$ an orthogonal matrix and $\Lambda$ diagonal with the eigenvalues of $A$ on the diagonal. For $A^2$ we get
$$A^2=Q\Lambda^2 Q',$$
so the eigenvalues of $A^2$ are the squares of the eigenvalues of $A$. The norm $\|A\|_2$ is the square root of the maximum of $x'A^2x$ over $x'x=1$, i.e. the square root of the largest eigenvalue of $A^2$, which is the largest absolute value of an eigenvalue of $A$.
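As a final check, here is a short NumPy snippet (again with an arbitrary random symmetric matrix) comparing $\|A\|_2$, computed directly, with the largest absolute eigenvalue of $A$.

```python
import numpy as np

rng = np.random.default_rng(1)             # arbitrary seed
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                          # symmetric, so the argument applies

spectral_norm = np.linalg.norm(A, 2)       # ||A||_2, the largest singular value
max_abs_eig = np.max(np.abs(np.linalg.eigvalsh(A)))

print(spectral_norm, max_abs_eig)          # the two values coincide
assert np.isclose(spectral_norm, max_abs_eig)
```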