So, (1) is at least "sometimes true", which you can establish by simply producing an example (which you have). The only question is whether it is always true.
Now, from studying least squares you probably saw that $\mathrm{rank}(A^TA) = \mathrm{rank}(A)$. Since $A$ is $3\times 4$, we have $\mathrm{rank}(A)\leq 3$; so $A^TA$, which is $4\times 4$, has rank at most $3$ and is therefore not invertible. That means that if you can find a solution to $A^TA\mathbf{x} = A^T\mathbf{z}$ for some specific $\mathbf{z}$, then that equation has infinitely many solutions (take any specific solution $\mathbf{s}$; for every solution $\mathbf{n}$ to $A^TA\mathbf{x}=\mathbf{0}$, of which there are infinitely many because $A^TA$ is not invertible, you have that $\mathbf{s}+\mathbf{n}$ is also a solution). So it really comes down to asking whether $A^TA\mathbf{x} = A^T\mathbf{z}$ always has at least one solution for all $\mathbf{z}$, or whether there is at least one $\mathbf{z}$ for which there are no solutions.
At this point, thinking about least squares, you should either figure out how to argue that there has to be at least one solution (and therefore infinitely many) for all $\mathbf{z}$; or else find an example where it does not have at least one solution. If you can find such an example, then the example where it fails and the example where it works show that it is "sometimes true"; if you can show it always has solutions, then you have your proof that it is "true."
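If you want to sanity-check the "infinitely many solutions" picture numerically before writing the proof, here is a quick NumPy sketch. The specific $3\times 4$ matrix $A$ and vector $\mathbf{z}$ below are just illustrative choices, not part of the problem:

```python
import numpy as np

# An illustrative 3x4 matrix A (any rank-3 example works) and a vector z.
A = np.array([[1., 0., 2., 1.],
              [0., 1., 1., 3.],
              [1., 1., 0., 0.]])
z = np.array([1., 2., 3.])

G = A.T @ A                      # the 4x4 matrix A^T A
print(np.linalg.matrix_rank(A))  # rank(A) = 3
print(np.linalg.matrix_rank(G))  # rank(A^T A) = rank(A) = 3 < 4, so G is singular

# One particular solution s of the normal equations A^T A x = A^T z:
s, *_ = np.linalg.lstsq(A, z, rcond=None)

# A nonzero vector n with A^T A n = 0: the last right-singular vector of A
# spans the null space here, since A has rank 3 and 4 columns.
_, _, Vt = np.linalg.svd(A)
n = Vt[-1]

# Both s and s + n satisfy the normal equations:
print(np.allclose(G @ s, A.T @ z))        # True
print(np.allclose(G @ (s + n), A.T @ z))  # True
```

Of course, a numerical check of one example is not a proof; it just confirms which of "true" / "sometimes true" you should be aiming for.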
Added. There are two parts to this argument as I sketched it: first you need to show that for every choice of $\vec{z}$, the system $A^TA\vec{x}=A^T\vec{z}$ has at least one solution. For that, what you know about least squares solutions should be useful, so you are on the right track when you realized that this is related to least squares.
Remember: if $\vec{z}$ is any vector, will $A\vec{x}=\vec{z}$ always have a least squares solution? (Yes; you already explained why: it corresponds to the vector in the range of $A$ that is closest to $\vec{z}$.) And will a least squares solution always be a solution to $A^TA\vec{x}=A^T\vec{z}$? (You probably proved or saw a theorem about this when you were figuring out how to do least squares; if so, you can just invoke that result.)
After proving that for every $\vec{z}$ the system $A^TA\vec{x}=A^T\vec{z}$ always has at least one solution, you need to show that it always has infinitely many solutions. This is equivalent to asking whether $A^TA$ is invertible or not (you should explain why this is so; that is, that if the system has more than one solution then $A^TA$ is not invertible, and conversely, that if $A^TA$ is not invertible then the system has more than one solution). Then explain why $A^TA$ cannot be invertible in this case.
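Again, you can convince yourself numerically that the normal equations are consistent for *every* $\vec{z}$ before proving it. This NumPy sketch uses a random $3\times 4$ matrix (almost surely rank $3$) and checks many random right-hand sides; it is evidence, not the argument itself:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # a random 3x4 matrix (rank 3 almost surely)
G = A.T @ A                       # 4x4, with rank(G) = rank(A) = 3, hence singular

ok = True
for _ in range(100):
    z = rng.standard_normal(3)
    x, *_ = np.linalg.lstsq(A, z, rcond=None)  # a least squares solution of Ax = z
    ok = ok and np.allclose(G @ x, A.T @ z)    # ...which solves the normal equations

print(ok)                          # every trial produced a solution
print(np.linalg.matrix_rank(G))    # 3 < 4, so each solution is one of infinitely many
```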
(2) Again: you know that this is at least "sometimes true" (for example, $\mathbf{A}=\mathbf{I}$). The question is whether it is always true or not.
What characterizations of orthogonal matrices do you know? I assume it's that $A$ is orthogonal if and only if $A^TA = AA^T = I$?
Added. You are on the right track. You know that $A\vec{u}\cdot A\vec{u}=1$, and therefore that $\vec{u}\cdot(A^TA\vec{u}) = 1$ for all unit vectors $\vec{u}$, and therefore that $\vec{u} \cdot (\vec{u}-A^TA\vec{u}) = 0$ for all unit vectors $\vec{u}$. Now notice that $I - A^TA$ is symmetric; so you can find an orthonormal basis of eigenvectors of $I - A^TA$. If $\vec{u}$ is a unit eigenvector of $I-A^TA$, what does $\vec{u}\cdot(I- A^TA)\vec{u} = 0$ tell you?
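To see the eigenvalue hint in action numerically (with matrices chosen purely for illustration): for an orthogonal $Q$, every eigenvalue of the symmetric matrix $I - Q^TQ$ is $0$, which forces $Q^TQ = I$; for a matrix that stretches some unit vector, they are not all $0$.

```python
import numpy as np

rng = np.random.default_rng(1)

# A genuinely orthogonal matrix, obtained from the QR factorization of a random matrix:
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# I - Q^T Q is symmetric, so it has real eigenvalues and an orthonormal
# eigenbasis; here every eigenvalue is (numerically) zero:
print(np.linalg.eigvalsh(np.eye(4) - Q.T @ Q))

# For a non-orthogonal matrix (this one doubles the first coordinate),
# I - B^T B has a nonzero eigenvalue, matching the fact that ||B u|| != 1
# for some unit vector u:
B = np.diag([2., 1., 1., 1.])
print(np.linalg.eigvalsh(np.eye(4) - B.T @ B))
```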
(3): Again, you know this is at least "sometimes true". So, you need to determine: Is $T$ linear? Is $T$ one-to-one? Is it onto?
Linearity I'll leave to you. For one-to-one: suppose $T(M)=0$; does it follow that $M=0$? Start playing with $SMS^{-1}=0$ and see what you can conclude.
For surjectivity: given a matrix $N$, can you find an $M$ such that $T(M)=N$?
Added: You've solved (3); it's always true.
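Since (3) is settled, here is a NumPy sketch of all three properties of $T(M) = SMS^{-1}$ for a randomly chosen invertible $S$ (random matrices are invertible almost surely; the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.standard_normal((3, 3))       # invertible almost surely
Sinv = np.linalg.inv(S)

def T(M):
    return S @ M @ Sinv

M1 = rng.standard_normal((3, 3))
M2 = rng.standard_normal((3, 3))
N = rng.standard_normal((3, 3))

# Linearity: T(2*M1 + 3*M2) = 2*T(M1) + 3*T(M2)
print(np.allclose(T(2*M1 + 3*M2), 2*T(M1) + 3*T(M2)))

# Injectivity: S M S^{-1} = 0 forces M = S^{-1} 0 S = 0.
# Surjectivity: the preimage of N is S^{-1} N S:
print(np.allclose(T(Sinv @ N @ S), N))
```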