
Let $A \in \mathbb R^{n \times n}$. The definition of eigenvalue I was given is: $\lambda \in \mathbb R$ is an eigenvalue of $A$ if $A\vec x = \lambda \vec x$ for some nonzero vector $\vec x$. Furthermore, we commonly refer to a set $S$ as *the* eigenvalues of $A$, which suggests to me that $S$ is always the same for a fixed matrix $A$. However, if $\vec x$ is a vector such that $A\vec x = \lambda \vec x$, then for all $c \in \mathbb R$, isn't $c\vec x$ also a vector that satisfies this? Since $$A(c\vec x) = c(A\vec x) = (c\lambda) \vec x,$$ it seems to me that $c\lambda$ is also an eigenvalue, and therefore also in $S$. Thus, it looks like every element of $\mathbb R$ is an eigenvalue, so long as the matrix has at least one nonzero eigenvector. I further understand that solving $|A - \lambda I| = 0$ yields exactly $n$ (possibly non-distinct) eigenvalues.

My question is: what is the canonical set of eigenvalues? Can two different sets $S_1$ and $S_2$ both be "the set of eigenvalues of matrix $A$"? If so, how can something like the spectral radius be well defined?

The spectral radius is defined as $$\rho(A) = \max \{|\lambda_1|, \dots, |\lambda_n|\}$$ where $\lambda_1, \dots, \lambda_n$ are the eigenvalues of $A$.

By my previous claim that every element of $\mathbb R$ is an eigenvalue (each with a corresponding eigenvector), it seems I can always find a larger eigenvalue than $\rho(A)$.

  • It is $A(c\vec x) = c(A\vec x) = (c\lambda) \vec x = \lambda(c\vec x)$, and so the eigenvalues don't change. (2017-02-02)

2 Answers


You use $A(c\mathbf{x}) = (c\lambda)\mathbf{x}$ to claim that $c \lambda$ is an eigenvalue of the matrix $A$. This is incorrect: you have $A(c\mathbf{x}) = c\lambda \mathbf{x} = \lambda( c \mathbf{x} )$, so all this equation demonstrates is that $\lambda$ is an eigenvalue for the eigenvector $c\mathbf{x}$ as well as for the vector $\mathbf{x}$. It does not imply that $c\lambda$ is an eigenvalue.
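A quick numerical check of this point (a sketch using NumPy; the matrix $A$ and eigenpair below are just an arbitrary example, not from the question):

```python
import numpy as np

# A has eigenvalues 3 and 1; the vector [1, 1] is an
# eigenvector with eigenvalue 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])
lam = 3.0

# Scaling the eigenvector by any nonzero c gives another
# eigenvector for the SAME eigenvalue lambda:
for c in [1.0, -2.0, 0.5, 10.0]:
    v = c * x
    assert np.allclose(A @ v, lam * v)   # A(cx) = lambda(cx)
```

Note that the assertion holds with the same `lam` for every `c`; it is $A(c\mathbf{x}) = (c\lambda)\mathbf{x}$ that would fail as an eigenvector equation, since $(c\lambda)$ is not paired with the vector being multiplied.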


The problem is that your first equation $$A(cx) = (c\lambda)x$$ is not in the form of an eigenvector equation. Let $$\begin{align*} v_1 &:= cx \\ \lambda^* &:= c\lambda \\ v_2 &:= x. \end{align*}$$ Then your equation reads $$Av_1 = \lambda^*v_2,$$ where $v_1 \neq v_2$, which is just some vector equation, not an eigenvector equation (even though it is true that $v_1$ is a scaling of $v_2$).

Instead, we reorganize the equation (through a simple change in multiplication association) as, $$A(cx) = \lambda(cx)$$ so that if we define $$v := cx$$ it reads $$Av = \lambda v$$ which is indeed the form of an eigenvector equation. Note that the same $\lambda$ is still sitting there, unchanged.

The takeaway is that eigenvectors are not unique, or rather, they are unique up to a change in scale. In that light one might call them "eigendirections." The associated eigenvalues, however, are uniquely determined by $A$.
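To see that the set of eigenvalues is fixed for a given $A$, here is a small NumPy sketch (the matrix is an arbitrary example I chose for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The eigenvalues of A form a fixed, finite set {1, 3};
# no choice of eigenvector scaling changes them.
eigvals = np.linalg.eigvals(A)

# The spectral radius is the largest eigenvalue in absolute value.
rho = max(abs(eigvals))
```

Here `np.linalg.eig` would also return one (unit-norm) eigenvector per eigenvalue, but any nonzero scalar multiple of each would serve equally well; the eigenvalues, and hence `rho`, are unaffected.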