
I understand that if $A$ is orthogonal, then $A^{-1} = A^T$. However, I cannot seem to understand the proof.

Can someone put some numbers into the proof? This site tries to explain it, but I cannot see how $A^TA$ becomes an identity matrix.

https://dyinglovegrape.wordpress.com/2010/11/30/the-inverse-of-an-orthogonal-matrix-is-its-transpose/

Thanks.

  • 0
    @Giulio Can I have some numbers inside the matrix to prove it? If all the numbers in the orthonormal vectors are positive, I cannot see how a zero would pop up. (2017-02-15)
  • 0
    An orthogonal matrix is a square matrix whose columns and rows are orthogonal unit vectors. So you write $A = [A_1, \ldots , A_n]$; then $A_i^T A_j = \delta_{ij}$, and hence $A^T A = A A^T = I_n$. (2017-02-15)
  • 0
    @aceminer But you can't make an orthogonal matrix with just positive numbers. (2017-02-15)
  • 0
    @Omnomnomnom Why so? (2017-02-15)
  • 0
    @aceminer Because two vectors with positive entries can't have a dot product of $0$. (2017-02-15)
  • 0
    @Omnomnomnom Yes, that's my question. But consider this matrix: $\begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$. The rows and columns are linearly independent. (2017-02-15)
  • 0
    *"Yes, that's my question":* What? Very confused here. Are you under the impression that any matrix with *linearly independent* columns is orthogonal? Are you under the impression that any matrix with orthogonal columns is orthogonal? Neither of these is true. In order for a matrix to be orthogonal, its columns must be orthogonal and of length $1$. (2017-02-15)
  • 0
    @Omnomnomnom A unit vector can have all positive entries, and two unit vectors can be orthogonal to each other and both have positive values. For example, $[1/\sqrt{2}, 1/\sqrt{2}]$ and $[2/\sqrt{13}, 3/\sqrt{13}]$ are both orthogonal vectors, each with length $1$. (2017-02-15)
  • 0
    @aceminer But their dot product is not zero. What makes you think that these vectors are orthogonal? (2017-02-15)
  • 0
    @aceminer Now I understand! You misunderstood the meaning of "orthogonal". We say that a pair of vectors is *orthogonal* if they are "perpendicular", which is to say that they have dot product zero. A list of vectors is orthogonal if each pair has dot product zero. A list is orthonormal if, in addition to being orthogonal, each vector has length $1$. A list of unit vectors is "normal", perhaps, but it is not orthogonal, and not orthonormal. (2017-02-15)
  • 0
    @Omnomnomnom Yes, thank you for the explanation. I have cleared up my misconception now. (2017-02-15)
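The point of the comment thread can be checked numerically. A minimal sketch (assuming NumPy is available) using the two all-positive unit vectors from the comments above, showing they have length $1$ but are *not* orthogonal:

```python
import numpy as np

# Two unit vectors with all-positive entries, taken from the comments above.
u = np.array([1/np.sqrt(2), 1/np.sqrt(2)])
v = np.array([2/np.sqrt(13), 3/np.sqrt(13)])

# Both have length 1 (up to floating-point rounding)...
print(np.linalg.norm(u))  # ≈ 1.0
print(np.linalg.norm(v))  # ≈ 1.0

# ...but they are not orthogonal: their dot product is far from zero.
print(np.dot(u, v))  # ≈ 0.98, not 0
```

This is exactly why a matrix built from these two vectors as columns is not an orthogonal matrix, even though each column is a unit vector.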

2 Answers

2

Suppose $A$ is an orthogonal matrix. Then we can write $A$ as $A = (a^1, a^2, \ldots, a^n)$, where the $a^i$ are the column vectors of $A$.

Then we know that $a^1, a^2, \ldots, a^n$ are pairwise orthonormal, i.e. $(a^i, a^j) = 0$ for $i \neq j$ and $(a^i, a^j) = 1$ for $i = j$.

Now if you carry out the matrix multiplication $A^TA$ or $AA^T$, the entry in position $(i,j)$ is exactly the inner product $(a^i, a^j)$, so only the diagonal positions $(i,i)$ become ones, and all others become zeros.
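Since the question asks for concrete numbers, here is a minimal numerical sketch (assuming NumPy) using a rotation matrix, which is a standard example of an orthogonal matrix:

```python
import numpy as np

# A concrete orthogonal matrix: rotation by 30 degrees.
# Its columns are unit vectors with dot product zero.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Entry (i, j) of A^T A is the inner product of columns i and j,
# so the diagonal entries are ones and the rest are zeros.
print(A.T @ A)  # identity matrix, up to floating-point rounding
print(A @ A.T)  # also the identity
```

Multiplying out one entry by hand shows the same thing: the $(1,1)$ entry is $\cos^2\theta + \sin^2\theta = 1$, and the $(1,2)$ entry is $-\cos\theta\sin\theta + \sin\theta\cos\theta = 0$.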

1

Let $A$ be an orthogonal matrix. Let us denote $A$ using columns: $$A = \begin{pmatrix} e_1 & e_2 & \ldots & e_n \end{pmatrix}$$ where the columns $e_1, e_2, \ldots, e_n$ are orthonormal. This means that $e_i \cdot e_j = \delta_{ij}$, where $\cdot$ denotes the inner product and $$\delta_{ij} = \begin{cases} 1 &\text{if } i = j\\ 0 &\text{otherwise} \end{cases}$$ Note that $A^{t}$, the transposed matrix, is equal to $$\begin{pmatrix} e_1^{t} \\ e_2^{t}\\ \vdots \\ e_n^{t} \end{pmatrix}$$ so each $e_i^t$ is a row vector. If we now compute the product $$A^tA = \begin{pmatrix} e_1^{t} \\ e_2^{t}\\ \vdots \\ e_n^{t} \end{pmatrix}\begin{pmatrix} e_1 & e_2 & \ldots & e_n \end{pmatrix}$$ we find $$\begin{pmatrix} e_1^t e_1 & e_1^te_2 & \ldots &e_1^te_n\\ e_2^te_1 & e_2^te_2 & \ldots &e_2^te_n\\ \vdots & \vdots & \ddots & \vdots\\ e_n^te_1 & e_n^te_2 & \ldots &e_n^te_n \end{pmatrix} = \begin{pmatrix} e_1 \cdot e_1 & e_1 \cdot e_2 & \ldots &e_1 \cdot e_n\\ e_2\cdot e_1 & e_2\cdot e_2 & \ldots &e_2\cdot e_n\\ \vdots & \vdots & \ddots & \vdots\\ e_n\cdot e_1 & e_n\cdot e_2 & \ldots & e_n\cdot e_n \end{pmatrix}$$

which is the identity matrix. Therefore we have, by uniqueness of the inverse matrix, that $A^t = A^{-1}$.

  • 0
    It is not just uniqueness of the inverse matrix that is needed, but also the existence of a right inverse whenever a left inverse exists (and this is only true in finite-dimensional vector spaces). (2017-02-15)
  • 0
    Same reasoning; should have made that clear. (2017-02-15)
  • 0
    It's not the same reasoning. Or what do you mean by that? (2017-02-15)
  • 0
    Both rows and columns form an orthonormal set, so you can do the same trick, denoting $A$ in row version, to show that $AA^t = I$. (2017-02-15)
  • 0
    Yes, both rows and columns form orthonormal sets. But that should be the result of this proof, not the premise. In the link of the OP it is defined that all column vectors are orthonormal; that's it. Of course it follows that the row vectors are orthonormal as well, but only through the reasoning indicated by my comments. (2017-02-16)
  • 0
    Oh, sorry, I thought both were included in the definition of orthogonal matrix. (2017-02-16)
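The two-sided-inverse point discussed in these comments can also be verified numerically. A sketch (assuming NumPy; the matrix here comes from a QR factorization of a random matrix, which is one standard way to produce a square matrix with orthonormal columns):

```python
import numpy as np

# Build a 4x4 matrix with orthonormal columns via QR factorization.
rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))

I = np.eye(4)
print(np.allclose(A.T @ A, I))             # True: A^T is a left inverse
print(np.allclose(A @ A.T, I))             # True: and also a right inverse
print(np.allclose(np.linalg.inv(A), A.T))  # True: hence A^{-1} = A^T
```

In finite dimensions the left inverse is automatically a right inverse, which is why all three checks agree.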