Assume one knows the eigenvalues $(\lambda_i)$ of a real $n \times n$ matrix $M$. Let $b$ be a vector in $\mathbb{R}^n$ and construct the matrix $J$ by $$J_{ij} = M_{ij}b_j$$ (no summation over $j$). Can we deduce the eigenvalues of $J$ from the eigenvalues $\lambda_i$ and the vector $b$?
Eigenvalues problem

Tags: linear-algebra, matrices, eigenvalues-eigenvectors
Comments:

- $J$ is a vector... – 2011-10-10
- @percusse: No, I think $J$ is a matrix. – 2011-10-10
- Okay, so you're postmultiplying $M$ with the diagonal matrix $\mathrm{diag}(b)$... I would say there doesn't seem to be a nice relationship between the eigenvalues of that and the eigenvalues of your original matrix... – 2011-10-10
- To make it clearer what $J$ is, you could write $$J=M\,\mathrm{diag}(\vec{b}).$$ – 2011-10-10
- [A related question...](http://math.stackexchange.com/questions/58252) – 2011-10-10
- @mellow: It was a little puzzling for me, since $b$ is a vector. I see now that you are not summing over the index $j$. – 2011-10-10
1 Answer
No. For example, suppose $n=2$ and we know that $M$ has eigenvalues $0$ and $2$ and that $b=(1,0)$. Then we might have $M=\begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix}$, in which case $J=\begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix}$ has eigenvalues $0$ and $2$. On the other hand, we might have $M=\begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix}$, in which case $J=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$ has $0$ as its only eigenvalue.
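The counterexample can be checked numerically. A minimal sketch using NumPy (the choice of library is mine, not the answer's): both matrices below have eigenvalues $\{0, 2\}$, and $b = (1, 0)$, yet $J = M\,\mathrm{diag}(b)$ has different spectra in the two cases.

```python
import numpy as np

b = np.array([1.0, 0.0])

# Two matrices with the same eigenvalues {0, 2}.
M1 = np.array([[2.0, 0.0],
               [0.0, 0.0]])
M2 = np.array([[0.0, 0.0],
               [0.0, 2.0]])

# J_ij = M_ij * b_j is postmultiplication by diag(b).
J1 = M1 @ np.diag(b)   # equals M1, so eigenvalues {0, 2}
J2 = M2 @ np.diag(b)   # the zero matrix, so 0 is the only eigenvalue

print(sorted(np.linalg.eigvals(J1).real))  # [0.0, 2.0]
print(sorted(np.linalg.eigvals(J2).real))  # [0.0, 0.0]
```

Since the spectra of $J_1$ and $J_2$ differ while $M_1$, $M_2$, and $b$ share identical eigenvalue data, the eigenvalues of $J$ are not determined by $(\lambda_i)$ and $b$ alone.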