Let $A \in \mathbb R^{n \times n}$. The definition of eigenvalues I was given is: $\lambda \in \mathbb R$ is an eigenvalue of $A$ if $A\vec x = \lambda \vec x$ for some non-zero vector $\vec x$. Furthermore, we commonly refer to a set $S$ as *the* eigenvalues of $A$, which suggests to me that $S$ is always the same for a fixed matrix $A$. However, if $\vec x$ is a vector such that $A\vec x = \lambda \vec x$, then for all $c \in \mathbb R$, isn't $c\vec x$ also a vector that satisfies this? Since $$A(c\vec x) = c(A\vec x) = (c\lambda) \vec x,$$ it seems to me that $c\lambda$ is also an eigenvalue, and therefore also in $S$. Thus it looks like every element of $\mathbb R$ is an eigenvalue, so long as the matrix has at least one eigenvalue with a non-zero eigenvector. I also understand that solving $|A - \lambda I| = 0$ yields exactly $n$ (possibly non-distinct) eigenvalues.
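To make the scaling computation above concrete, here is a small numerical check (an illustrative sketch using numpy; the matrix and eigenpair are just an example I made up):

```python
import numpy as np

# Made-up example: A has eigenvalue 3 with eigenvector (1, 0)
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])
lam = 3.0
x = np.array([1.0, 0.0])

for c in [2.0, -4.0, 0.5]:
    # A(c x) = c(A x) = (c*lam) x, exactly the computation above
    assert np.allclose(A @ (c * x), (c * lam) * x)
print("scaling identity holds for every c tried")
```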
My question is: what is the canonical set of eigenvalues? Can two different sets $S_1$ and $S_2$ both be "the set of eigenvalues of matrix $A$"? If so, how can something like the spectral radius be well defined?
The spectral radius is defined as $$\rho(A) = \max \{|\lambda_1|, \dots, |\lambda_n|\},$$ where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $A$.
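For reference, this is how I would compute $\rho(A)$ numerically (an illustrative sketch using numpy, on a matrix I made up):

```python
import numpy as np

# Made-up example matrix; its eigenvalues are the roots of det(A - lambda*I) = 0,
# i.e. of lambda^2 + 3*lambda + 2 = 0, namely -1 and -2
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)
rho = np.max(np.abs(eigenvalues))  # largest eigenvalue magnitude
print(rho)  # close to 2.0
```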
By my previous claim that every element of $\mathbb R$ is an eigenvalue (with a suitably scaled eigenvector), it feels like I can always find an eigenvalue larger than $\rho(A)$.