
$A$ is an $n\times n$ orthogonal matrix. Show that $(A\mathbf x)\cdot (A \mathbf y ) = \mathbf x\cdot\mathbf y$ for all $\mathbf x$ and $\mathbf y\in \mathbb{R}^n$.

I wasn't sure how to treat the $\mathbf x$ and $\mathbf y$ terms; are they just the column vectors $(x_1, x_2, \dots, x_n)$?

Also, I tried one approach, but it seemed too simple and I couldn't tell whether it was a correct way to show the equality. This is what I did:

$(A\mathbf x)^T(A\mathbf x)~\cdot(A\mathbf y) = (A\mathbf x)^T(\mathbf x\cdot\mathbf y)$

$I\cdot (A\mathbf y) = AI\mathbf y$

$A\mathbf y= A\mathbf y$

Is this the correct path? I don't feel like I understand everything that is going on in this problem. If this is the correct method, could someone please explain what is happening on a deeper level?

  • @stariz77 Added to my answer to address the "deeper level" (2012-06-05)

1 Answer


Just so the question is finally answered:

Use the following facts (they are combined in a short sketch after the list):

  • the identity that for column vectors $v$ and $w$, $v\cdot w=v^T w$.
  • the fact that $(XY)^T=Y^T X^T$ for any compatible matrices $X, Y$.
  • the fact that $AA^T=A^T A=I$ in this particular problem.
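
Putting those three facts together, a minimal sketch of the computation looks like this (writing $x$ and $y$ for column vectors, as in the bullet points above):

$$(Ax)\cdot(Ay) = (Ax)^T(Ay) = x^T A^T A\, y = x^T I\, y = x^T y = x\cdot y.$$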

I can also help out with the "deeper level" part of your question. As you may know, the dot product (roughly speaking) measures angles between vectors, and from that you can get lengths of vectors too. When you show that $Ax\cdot Ay=x\cdot y$, you are showing that the transformation $A$ preserves the angles (and hence the lengths) between vectors. It turns out that orthogonal matrices are precisely the matrices which preserve the normal dot product, this way.
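
As a concrete illustration (not part of the original problem), consider the $2\times 2$ rotation matrix

$$A = \begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix},$$

which satisfies $A^T A = I$ since $\cos^2\theta + \sin^2\theta = 1$ and the off-diagonal entries cancel. Rotating two vectors $x$ and $y$ by the same angle $\theta$ changes neither their lengths nor the angle between them, so $(Ax)\cdot(Ay) = x\cdot y$, exactly as the identity predicts.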

This is good intuition for real vector spaces; while it is not strictly true over the complex numbers and other fields, it is still a useful crutch.