I am trying to solve a problem on the dot product, and I do some manipulations and come to the conclusion that

$\langle x, A^{T} Ax \rangle = \langle x,x \rangle$ for all $x$, where $x$ is a column vector with $n$ rows and $A$ is an $n \times n$ matrix.

I know that, in general, $a \cdot b = a \cdot c$ does not imply $b = c$. However, given the above equation involving the matrix $A$, can I conclude that the only way the left- and right-hand sides can be equal is if $A^{T}A = I$, the identity matrix? Otherwise, how can the two quantities be equal?
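(For example, in $\mathbb{R}^{2}$, with $a = (1,0)$, $b = (0,1)$, and $c = (0,2)$ we have $a \cdot b = a \cdot c = 0$ but $b \neq c$, so this kind of cancellation fails in general.)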

If the statement is not true, I would be glad if someone could provide a counterexample.

Thanks

1 Answer

$\langle x, A^{T} Ax \rangle = \langle x,x \rangle$ for all $x$ implies $\langle x, (A^{T} A-I)x \rangle = 0$ for all $x$. Write $T = A^{T}A - I$ and note that $T$ is symmetric. For a symmetric $T$, $\langle x, Tx \rangle = 0$ for all $x$ implies $T = 0$, because the polarization identity gives $4\langle x, Ty \rangle = \langle x+y, T(x+y) \rangle - \langle x-y, T(x-y) \rangle = 0$ for all $x, y$. Taking $x = Ty$ gives $4\|Ty\|^{2} = 0$, so $Ty = 0$ for every $y$, and hence $T = 0$.
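For completeness, here is a sketch of the expansion behind that identity; the symmetry of $T$ is what makes the cross terms combine, since $\langle y, Tx \rangle = \langle Tx, y \rangle = \langle x, Ty \rangle$:

$$\begin{aligned}
\langle x+y, T(x+y) \rangle - \langle x-y, T(x-y) \rangle
&= \bigl(\langle x,Tx\rangle + \langle x,Ty\rangle + \langle y,Tx\rangle + \langle y,Ty\rangle\bigr)\\
&\qquad - \bigl(\langle x,Tx\rangle - \langle x,Ty\rangle - \langle y,Tx\rangle + \langle y,Ty\rangle\bigr)\\
&= 2\langle x,Ty\rangle + 2\langle y,Tx\rangle = 4\langle x,Ty\rangle.
\end{aligned}$$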

There's a related question, which you can use once you realize that $\langle x, A^{T} Ay \rangle$ defines an inner product.
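As a quick sketch of why that bilinear form is an inner product under your hypothesis: $\langle x, A^{T}Ay \rangle = \langle Ax, Ay \rangle$, so symmetry and bilinearity are inherited from the standard inner product, and positive definiteness holds because $\langle x, A^{T}Ax \rangle = \|Ax\|^{2} = \|x\|^{2} > 0$ whenever $x \neq 0$.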

  • BTW, I strongly recommend Axler's [Linear Algebra Done Right](http://linear.axler.net/index.html). The [sample chapters](http://linear.axler.net/LADRSampleChapters.html) are especially relevant to your question. (2011-05-14)
  • Hi, I've seen the book and the chapters, and the relevant part starts by discussing self-adjoint operators and such. But what about this: if $\langle x , Tx \rangle = 0$ for all $x \in \mathbb{R}^n$, then the only way this can hold is if $T$ is the zero matrix. What do you think? (2011-05-14)
  • @D Lim, I meant Proposition 7.2 on page 129. (2011-05-14)