
I am studying Eigenvalues and I have the following question:

Given the normalised quadratic form

$$\frac{x^TAx}{x^Tx}$$ where $x \in\mathbb{R}^4,$

I am having difficulties showing that the eigenvectors of a given 4 × 4 matrix make this quantity stationary (saddle points, maxima, and minima). I also know that only the symmetric part of that 4 × 4 matrix is used.

I found that $\dfrac{x^TAx}{x^Tx}$ is called the Rayleigh quotient, but I am not sure whether that is correct, and I do not know how to start.

Can anyone help me on this?

Thanks

  • Yes, identifying this expression as a Rayleigh quotient is a very good step. Rayleigh quotients have many properties, in particular extremal properties linked to extremal eigenvalues/eigenvectors. (2017-02-16)

1 Answer

UPDATE: I'm afraid I misunderstood your question at first (for archiving reasons my old answer can be found below). I don't know a source for a proof of the statement "every eigenvector of $A$ is a critical point of the Rayleigh quotient", but it shouldn't be too difficult to prove. So let's sketch the basic points of a proof. We would like to avoid computing a derivative as messy as $DR_A$ directly. To study the critical points of $R_A(x):=\frac{x^TAx}{x^Tx}$, first convince yourself that $R_A(\alpha v)$ is constant for every direction $v\neq 0$ and every $\alpha>0$. For the directional derivative this means $D_vR_A(v)=0$. (Interesting point: $R_A$ is fully determined by its values on the unit sphere $S_{n-1}:=\{x:x^Tx-1=0\}$.)
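The ray-invariance of $R_A$ is easy to check numerically. A minimal sketch in Python (assuming NumPy, with a randomly generated symmetric $4\times 4$ matrix as a stand-in for your example):

```python
import numpy as np

def rayleigh(A, x):
    """Rayleigh quotient R_A(x) = x^T A x / x^T x."""
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                  # symmetric part of a random matrix
x = rng.standard_normal(4)

# R_A is constant along every ray through the origin:
print(rayleigh(A, x))
print(rayleigh(A, 3.7 * x))        # same value as above
```

Scaling $x$ cancels in numerator and denominator, which is why both printed values agree.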

Let $v$ be an arbitrary normalized eigenvector of $A$ with $Av=\lambda v$. Recall that we can orthogonally decompose the space $\mathbb{R}^n=\langle v\rangle \oplus\langle v\rangle^\bot$, where $\langle v\rangle=\mathbb{R}v$ is the one-dimensional vector space spanned by $v$ and $\langle v\rangle^\bot$ is its orthogonal complement. Furthermore, since the directional derivative is linear in the direction argument, it suffices to show that $D_wR_A(v)$ vanishes for $w\in\langle v\rangle^\bot$ (because every direction $u$ can be written as $u=u_1\oplus u_2\in\langle v\rangle \oplus\langle v\rangle^\bot$, and therefore $D_uR_A(v) = D_{u_1+u_2}R_A(v)= D_{u_1}R_A(v)+D_{u_2}R_A(v) = 0$). Now let $w\in\langle v\rangle^\bot$. We make the following observations:

  • $\langle v\rangle^\bot$ is $A$-invariant (that is, $A\langle v\rangle^\bot\subset\langle v\rangle^\bot$; this uses the symmetry of $A$), and thus $Aw\in\langle v\rangle^\bot$. It follows that $Aw\bot v$ and hence $v^TAw=\langle v, Aw\rangle=0$.
  • A simple calculation shows $(v+tw)^TA(v+tw) = v^TAv+2tv^TAw+t^2w^TAw$ and (analogously, or by taking $A$ to be the identity) $(v+tw)^T(v+tw) = v^Tv+2tv^Tw+t^2w^Tw$ for every $t\in\mathbb{R}$.
  • Since $\lambda$ is the eigenvalue corresponding to the normalized eigenvector $v$, we have $v^TAv=\lambda v^Tv=\lambda$.

Now we calculate the directional derivative of $R_A$ at $v$: \begin{align} D_wR_A(v) &= \lim_{t\rightarrow 0} \frac{1}{t}\left(R_A(v+tw)-R_A(v)\right) \\ &= \lim_{t\rightarrow 0} \frac{1}{t}\left(\frac{v^TAv+2tv^TAw+t^2w^TAw}{v^Tv+2tv^Tw+t^2w^Tw} - \frac{v^TAv}{v^Tv}\right)\\ &= \lim_{t\rightarrow 0} \frac{1}{t}\left(\frac{\lambda+0+t^2w^TAw}{1+0+t^2w^Tw} - \frac{\lambda}{1}\right)\\ &= \lim_{t\rightarrow 0} \frac{1}{t}\left(\frac{\lambda+t^2w^TAw}{1+t^2w^Tw} - \frac{(1+t^2w^Tw)\lambda}{1+t^2w^Tw}\right)\\ &= \lim_{t\rightarrow 0} \frac{1}{t}\left(\frac{-t^2w^Tw\lambda+t^2w^TAw}{1+t^2w^Tw}\right)\\ &= \lim_{t\rightarrow 0} \frac{1}{t}\left(\frac{t^2(w^TAw-w^Tw\lambda)}{1+t^2w^Tw}\right)\\ &= \lim_{t\rightarrow 0} \frac{t(w^TAw-w^Tw\lambda)}{1+t^2w^Tw}\\ &= \frac{0}{1+0} = 0 \end{align} Thus $D_wR_A(v)$ vanishes, and we have shown that the directional derivative vanishes in every direction. Hence $v$ is indeed a critical point of $R_A$. Since $v$ was an arbitrary eigenvector, every eigenvector is a critical point.
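If you want to sanity-check this computation numerically, a finite-difference approximation of the directional derivative at an eigenvector should come out close to zero. A sketch assuming NumPy (the matrix and the direction $w$ are random; `np.linalg.eigh` supplies a normalized eigenvector):

```python
import numpy as np

def rayleigh(A, x):
    """Rayleigh quotient R_A(x) = x^T A x / x^T x."""
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                  # symmetric matrix
eigvals, eigvecs = np.linalg.eigh(A)
v = eigvecs[:, 0]                  # a normalized eigenvector of A

w = rng.standard_normal(4)
w -= (w @ v) * v                   # project w into <v>^perp

t = 1e-6
d = (rayleigh(A, v + t * w) - rayleigh(A, v)) / t
print(abs(d))                      # tiny: finite-difference derivative ~ 0
```

As the derivation predicts, the difference quotient is of order $t\,(w^TAw-\lambda w^Tw)$, so it shrinks linearly as $t\to 0$.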

EDIT: Some notes about the opposite direction of the statement: I think one can show that each critical point of $R_A$ is also an eigenvector of (the symmetric part of) $A$. (You mentioned that in a comment but not in your original question.) Here is a sketch of my idea: Since $R_A$ is constant on straight lines through the origin, one can convince oneself that for each critical point $x$ of $R_A$ the normalized point $y:=\frac{x}{\|x\|_2}$ is also critical. In particular, $y$ is a critical point of the restriction $R_A|_{S_{n-1}}$ to the unit sphere, and thus we can use Lagrange multipliers: let $g(x):=x^Tx-1$ and $f(x):=x^TAx$ ($f$ equals $R_A$ on the zero set of $g$). By the Lagrange multiplier theorem there exists a $\lambda\in\mathbb{R}$ with $$\nabla f(y)=\lambda \nabla g(y).$$ With $\nabla f(x)=(A+A^T)x$ and $\nabla g(x)=2x$ this leads to $(A+A^T)y=2\lambda y$, i.e. $y$ is an eigenvector of the symmetric part $\frac{1}{2}(A+A^T)$; if $A$ is symmetric, this reads $Ay=\lambda y$ and $y$ is indeed an eigenvector of $A$. It is a rough sketch, but I think the idea is plausible.

Old answer: W.l.o.g. let $A$ be symmetric, so you can diagonalize $A=U D U^T$. Now substitute $Ux$ for $x$ in $\frac{x^TAx}{x^Tx}$ and you will get a polynomial without mixed terms, with the eigenvalues of $A$ as coefficients. Now you can see the desired properties.

EDIT: You can diagonalize a symmetric (real) matrix (https://en.wikipedia.org/wiki/Spectral_theorem#Normal_matrices) as $A=U D U^T$ with a diagonal matrix $D=\operatorname{diag}(\lambda_1,\ldots,\lambda_n)$. Now you have the eigenvalues on the diagonal, and the columns of $U$ are the eigenvectors. Let's assume that $x$ is normalized (meaning $x^Tx=\|x\|_2^2=1$). With $y=Ux$ it follows that $$y^TAy = (Ux)^TA(Ux) = x^T(U^TAU)x = x^TDx = \sum_{i=1}^n \lambda_i x_i^2.$$ Another observation may help you: we have $x^Tx=\|x\|_2^2=\|x\|_2\cdot \|x\|_2$ and therefore $$\frac{x^TAx}{x^Tx} = \frac{x^TAx}{\|x\|_2\|x\|_2} = \left(\frac{x}{\|x\|_2}\right)^T A \left(\frac{x}{\|x\|_2}\right).$$ (This justifies restricting attention to normalized vectors $x$.)
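Since $x$ is a unit vector, $\sum_i x_i^2=1$, so the sum above is a convex combination of the eigenvalues; this gives the extremal bounds $\lambda_{\min}\le R_A(x)\le\lambda_{\max}$. Both facts can be checked in a few lines of Python (a sketch assuming NumPy; `np.linalg.eigh` returns exactly the decomposition $A=UDU^T$ with ascending eigenvalues):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                  # symmetric matrix
eigvals, U = np.linalg.eigh(A)     # A = U diag(eigvals) U^T, ascending order

x = rng.standard_normal(4)
x /= np.linalg.norm(x)             # normalize x
y = U @ x

# y^T A y equals sum_i lambda_i x_i^2 (no mixed terms):
lhs = y @ A @ y
rhs = np.sum(eigvals * x**2)
print(np.isclose(lhs, rhs))        # True

# Hence the quotient is bounded by the extremal eigenvalues:
print(eigvals[0] <= lhs <= eigvals[-1])   # True
```

Rotating the coordinate system by $U$ is all that is needed to make the eigenvalue structure of $x^TAx$ visible.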

  • I am sorry, but you must improve the way you write. I understand what you say because I know the subject, but "now you can see the wanted properties" is definitely not good mathematical style. In such a case it is much better to provide the OP with a thorough reference like this one: https://en.wikipedia.org/wiki/Rayleigh_quotient (2017-02-16)
  • Hi, I was reading about the topic and I did not find anything that could help me show how the eigenvectors of that given matrix make it stationary. Can you provide more help? Thanks. (2017-02-17)
  • @JeanMarie: Well, my last remark was not really meant as a solution. The question was "how do I start", so I gave that hint to suggest an intuition and nothing more. ;) In my opinion, the key to quickly getting a feel for the expression $x^TAx$ for normalized $x$ is the spectral theorem, and the rest is straightforward. If the coordinate system is rotated by $U$, one can definitely see many properties of $x^TAx$. All the proofs in papers I have read so far argue this way. (2017-02-17)
  • @user290335 I've edited my answer. I hope it is helpful for you. (2017-02-17)
  • I appreciate all the effort you have put into your new write-up. (2017-02-17)
  • @JeanMarie Thank you :) (2017-02-17)
  • Hi, I am trying to show that all critical points of the Rayleigh quotient are eigenvectors of the symmetric part of the matrix $A$, but I cannot find any proof. I read what you wrote and it is very difficult for me to understand. Can you point me to a website where I can read more about this? Thanks. (2017-02-27)
  • @user290335: Sorry for misunderstanding your question, but I think my update above should be satisfactory for you. (2017-02-28)
  • Can this proof be used for all matrices? I mean 2 × 2, 4 × 4, etc.? (2017-02-28)
  • No problem, thank you very much for your help. (2017-02-28)
  • Yes, the proof makes no assumptions about the dimension of the matrix. By the way, I've added a sketch of a proof of the other direction of the statement. (I didn't know there was a mistake in your comment, but you said "I am trying to show that all critical points of the Rayleigh quotient are the eigenvectors[...]". ;) ) (2017-02-28)
  • I am sorry, I made a mistake in my question. What I needed to know was indeed what I mentioned in my previous comment. I was reading about this topic and I understood what you wrote. Thank you once again. (2017-03-01)