Could I please get some help with the following true/sometimes true/false questions?

1) If $A$ is $3\times 4$, then for every $\vec{z}\in\mathbb{R}^{3}$ there are infinitely many solutions $\vec{x}^{*}$ to $A^{T}A\vec{x}^{*}=A^{T}\vec{z}$.

This is related to least squares. I can see how it could be sometimes true, for example if $A$ is the zero matrix, but I don't know how to give a more general answer. Can you explain the reasoning here?

I am not sure if this is what I am looking for, but I think we said that $\mathbb{R}^{n}$ can always be decomposed into a subspace and its orthogonal complement. So could I argue that since I am decomposing $\vec{z}$, and it's in $\mathbb{R}^{n}$, it will always be a combination of basis vectors of $\operatorname{im} A$ and basis vectors of its orthogonal complement?

2) $A$ is $n\times n$ and $\|A\vec{u}\|=1$ for all unit vectors $\vec{u}$. Then $A$ is orthogonal.

From what is given to me I know that $A\vec{u}\cdot A\vec{u}=1$, so $\vec{u}^{T}A^{T}A\vec{u}=1$. Not sure how to go on from here.

3) If $S$ is $2\times 2$ and invertible, and $T\colon\mathbb{R}^{2\times 2}\to\mathbb{R}^{2\times 2}$ is defined by $T(M)=S^{-1}MS$, then $T$ is an isomorphism.

We can prove that $M=0$ if $T(M)=0$ by multiplying $S^{-1}MS=0$ by $S$ from the left and by $S^{-1}$ from the right to get $M=0$. That's also how any matrix $N$ can be reached from some $M$: take $M=SNS^{-1}$.

4) The product of a matrix and its transpose is always symmetric.

Figured this one out -- always true.

2 Answers


So, (1) is at least "sometimes true", which you can establish by simply producing an example (which you have). The only question is whether it is always true.

Now, from studying least squares you probably saw that $\mathrm{rank}(A^TA) = \mathrm{rank}(A)$. Since $A$ is $3\times 4$, its rank is at most $3$, while $A^TA$ is $4\times 4$; so $A^TA$ is not invertible. That means that if you can find a solution to $A^TA\mathbf{x} = A^T\mathbf{z}$ for some specific $\mathbf{z}$, then that equation has infinitely many solutions (take any specific solution $\mathbf{s}$; for every solution $\mathbf{n}$ to $A^TA\mathbf{x}=\mathbf{0}$, of which there are infinitely many because $A^TA$ is not invertible, $\mathbf{s}+\mathbf{n}$ is also a solution). So it really comes down to asking whether $A^TA\mathbf{x} = A^T\mathbf{z}$ always has at least one solution for every $\mathbf{z}$, or whether there is at least one $\mathbf{z}$ for which there are no solutions.
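
For the parenthetical step, the check that $\mathbf{s}+\mathbf{n}$ is again a solution is one line:

$$A^TA(\mathbf{s}+\mathbf{n}) = A^TA\mathbf{s} + A^TA\mathbf{n} = A^T\mathbf{z} + \mathbf{0} = A^T\mathbf{z}.$$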

At this point, thinking about least squares, you should either figure out how to argue that there has to be at least one solution (and therefore infinitely many) for every $\mathbf{z}$, or else find an example where there is no solution at all. If you can find such an example, then the example where it fails and the example where it works together show that the statement is "sometimes true"; if you can show it always has solutions, then you have your proof that it is "true."

Added. There are two parts to this argument as I sketched it: first you need to show that for every choice of $\vec{z}$, the system $A^TA\vec{x}=A^T\vec{z}$ has at least one solution. For that, what you know about least squares solutions should be useful, so you are on the right track when you realized that this is related to least squares.

Remember: if $\vec{z}$ is any vector, will $A\vec{x}=\vec{z}$ always have a least squares solution? (Yes; you already explained why: $A\vec{x}$ just needs to be the vector in the range of $A$ that is closest to $\vec{z}$.) And will a least squares solution always be a solution to $A^TA\vec{x}=A^T\vec{z}$? (You probably proved or saw a theorem about this when you were figuring out how to do least squares; if so, you can just invoke that result.)
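
In case it helps to see that theorem sketched (the names $\hat{z}$ and $\vec{w}$ are notation introduced here, not anything from your course): decompose $\vec{z}$ against the image of $A$, exactly as you suggested in your question,

$$\vec{z} = \hat{z} + \vec{w}, \qquad \hat{z}\in\operatorname{im}A, \quad \vec{w}\perp\operatorname{im}A,$$

so that $A^T\vec{w}=\vec{0}$. Then any $\vec{x}^{*}$ with $A\vec{x}^{*}=\hat{z}$ satisfies

$$A^TA\vec{x}^{*} = A^T\hat{z} = A^T(\hat{z}+\vec{w}) = A^T\vec{z}.$$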

After proving that for every $\vec{z}$ the system $A^TA\vec{x}=A^T\vec{z}$ always has at least one solution, you need to show that it always has infinitely many solutions. This is equivalent to asking whether $A^TA$ is invertible or not (you should explain why this is so; that is, why the system has more than one solution if and only if $A^TA$ is not invertible). Then explain why $A^TA$ cannot be invertible in this case.
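
For that last point, the rank count does the work:

$$\operatorname{rank}(A^TA) = \operatorname{rank}(A) \le \min(3,4) = 3 < 4,$$

so the $4\times 4$ matrix $A^TA$ has rank less than $4$ and therefore a nontrivial null space.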

(2) Again: you know that this is at least "sometimes true" (for example, $\mathbf{A}=\mathbf{I}$). The question is whether it is always true or not.

What characterizations of orthogonal matrices do you know? I assume it's that $A$ is orthogonal if and only if $A^TA = AA^T = I$?

Added. You are on the right track. You know that $A\vec{u}\cdot A\vec{u}=1$, and therefore that $\vec{u}\cdot(A^TA\vec{u}) = 1$ for all unit vectors $\vec{u}$, and therefore that $\vec{u} \cdot (\vec{u}-A^TA\vec{u}) = 0$ for all unit vectors $\vec{u}$. Now notice that $I - A^TA$ is symmetric; so you can find an orthonormal basis of eigenvectors of $I - A^TA$. If $\vec{u}$ is a unit eigenvector of $I-A^TA$, what does $\vec{u}\cdot(I- A^TA)\vec{u} = 0$ tell you?
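
Spelled out (with $\lambda$ denoting the eigenvalue, notation introduced here): if $(I-A^TA)\vec{u}=\lambda\vec{u}$ for a unit vector $\vec{u}$, then

$$0 = \vec{u}\cdot(I-A^TA)\vec{u} = \vec{u}\cdot\lambda\vec{u} = \lambda\|\vec{u}\|^{2} = \lambda,$$

so every eigenvalue of the symmetric matrix $I-A^TA$ is $0$, which forces $I-A^TA=0$, i.e., $A^TA=I$.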

(3): Again, you know this is at least "sometimes true". So, you need to determine: Is $T$ linear? Is $T$ one-to-one? Is it onto?

Linearity I'll leave to you. For one-to-one: suppose $T(M)=0$; does it follow that $M=0$? Start playing with $S^{-1}MS=0$ and see what you can conclude.

For surjectivity: given a matrix $N$, can you find an $M$ such that $T(M)=N$?
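
As a sketch (using the definition $T(M)=S^{-1}MS$ from the question): the natural candidate is $M=SNS^{-1}$, and indeed

$$T(SNS^{-1}) = S^{-1}(SNS^{-1})S = (S^{-1}S)N(S^{-1}S) = N.$$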

Added: You've solved (3); it's always true.

  • @Jared: No. For one thing, $\left(\begin{array}{cc}0&1\\1&0\end{array}\right)$ is not the identity, is orthogonal, and is symmetric. And so is $-I$. Among others. They are pretty restricted, but not quite *that* restricted. (2011-03-12)

For question 4, write down the matrix product in summation form (see http://en.wikipedia.org/wiki/Matrix_multiplication), then set one matrix equal to the transpose of the other (see http://en.wikipedia.org/wiki/Transpose). You will notice in the summation that $ab = ba$, so if $B = AA^T$ then $B_{ij} = B_{ji}$ for all $i,j$, i.e. $B = B^T$.
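
Written out, the computation is:

$$(AA^T)_{ij} = \sum_{k} A_{ik}(A^T)_{kj} = \sum_{k} A_{ik}A_{jk} = \sum_{k} A_{jk}A_{ik} = (AA^T)_{ji}.$$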

  • Thanks, I have figured out that $(AA^{T})^{T}=(A^{T})^{T}A^{T}=AA^{T}$, so they are symmetric. (2011-03-12)