Today we were discussing how, for an $n\times n$ orthogonal projection matrix $A$ from $\mathbb{R}^{n}$ onto a subspace $W$, we have $\operatorname{Ker}(A)=(\operatorname{Im}A)^{\perp}=W^{\perp}$, and that $\operatorname{Ker}(A^{T})$ is also $W^{\perp}$. This prompted the question: what conditions must a matrix satisfy so that the kernels of it and its transpose are equal? It always seems to work when the matrix is an orthogonal projection, but we struggled to find an example of a square matrix for which the identities above fail, and consequently for which $\operatorname{Ker}(A)\neq\operatorname{Ker}(A^{T})$. It looks like we need to find a transformation whose kernel does not consist only of vectors perpendicular to its image, but we were wondering whether that is even possible, or whether our reasoning was correct at all. Could you please clarify this issue (I know it's convoluted), point out where we were right and wrong, and say when those identities hold?
A confusion about Ker($A$) and Ker($A^{T}$)
-
0By "*the* projection", you clearly mean the *orthogonal* projection... You might want to put it explicitly, though. – 2011-03-11
-
0@Arturo: We actually got this example from the book, where projection onto $W$ was used to prove that $\dim W+\dim W^{\perp}=n$, but I don't think it mentioned orthogonal projection, though I could be wrong (maybe we are just assumed not to do any other projections at our level, or maybe it was assumed to be a perpendicular projection, which I guess is the same thing). – 2011-03-11
-
0@LinAlgStudent: You are definitely talking about the orthogonal projection. In general, given a vector space $\mathbf{V}$, if you write $\mathbf{V}=\mathbf{W}\oplus\mathbf{U}$, you get a projection onto $\mathbf{W}$ (which depends on $\mathbf{U}$): given any $\mathbf{v}\in\mathbf{V}$ there is a unique $\mathbf{w}\in\mathbf{W}$ and a unique $\mathbf{u}\in\mathbf{U}$ such that $\mathbf{v}=\mathbf{w}+\mathbf{u}$. Map $\mathbf{v}$ to the corresponding $\mathbf{w}$. When $\mathbf{U}=\mathbf{W}^{\perp}$, you get the *orthogonal* projection onto $\mathbf{W}$. – 2011-03-11
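A small numerical sketch of the orthogonal-projection case (the matrix and vectors here are my own illustration, not from the thread): the orthogonal projection of $\mathbb{R}^2$ onto the line $W=\operatorname{span}\{(1,1)\}$ has matrix $P=\frac{1}{2}\left[\begin{smallmatrix}1&1\\1&1\end{smallmatrix}\right]$, which is symmetric, so $\ker(P)$ and $\ker(P^{T})$ coincide and both equal $W^{\perp}=\operatorname{span}\{(1,-1)\}$.

```python
# Orthogonal projection of R^2 onto W = span{(1, 1)}; illustrative example.
# P is symmetric, so Ker(P) = Ker(P^T) = W-perp = span{(1, -1)}.

def mat_vec(M, v):
    """Multiply a 2x2 matrix (list of rows) by a length-2 vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def transpose(M):
    """Transpose of a 2x2 matrix."""
    return [[M[0][0], M[1][0]], [M[0][1], M[1][1]]]

P = [[0.5, 0.5], [0.5, 0.5]]   # orthogonal projection onto span{(1, 1)}
w_perp = [1.0, -1.0]           # a basis vector of W-perp

print(mat_vec(P, w_perp))             # [0.0, 0.0] -> w_perp lies in Ker(P)
print(mat_vec(transpose(P), w_perp))  # [0.0, 0.0] -> and in Ker(P^T)
print(transpose(P) == P)              # True: P is symmetric
```

Since $P=P^{T}$ here, the equality of the two kernels is automatic; the interesting question in the thread is what happens when this symmetry is absent.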
3 Answers
Things always work nicely for orthogonal projections because, unlike for general matrices, we have $\text{Range}\oplus\text{Kernel}=\text{whole space}$, with the two summands orthogonal to each other. Now let $V=\mathbb{R}^2$ and consider $$A=\left[\begin{array}{cc} 0 & 1\\ 0 & 0\end{array}\right].$$
Then $$\text{ker}(A)=\left\{ \left[\begin{array}{c} a\\ 0\end{array}\right]\ :\ a\in\mathbb{R}\right\},$$ but $$A^{t}=\left[\begin{array}{cc} 0 & 0\\ 1 & 0\end{array}\right]$$ so that $$\text{ker}(A^{t})=\left\{ \left[\begin{array}{c} 0\\ a\end{array}\right]\ :\ a\in\mathbb{R}\right\}.$$ These are definitely different. (Note that $\ker(A)\oplus\ker(A^{t})$ happens to be the whole space $V$; but in contrast to a projection, $\text{Range}(A)\oplus\ker(A)$ is not, since here $\text{Range}(A)=\ker(A)$.)
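The answer's counterexample can be checked numerically; this is a sketch of my own, not part of the post. For $A=\left[\begin{smallmatrix}0&1\\0&0\end{smallmatrix}\right]$, the vector $(1,0)$ spans $\ker(A)$ while $(0,1)$ spans $\ker(A^{t})$, so the two kernels are genuinely different lines in $\mathbb{R}^2$.

```python
# Verify ker(A) != ker(A^T) for A = [[0, 1], [0, 0]].

def mat_vec(M, v):
    """Multiply a 2x2 matrix (list of rows) by a length-2 vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

A  = [[0, 1], [0, 0]]
At = [[0, 0], [1, 0]]  # transpose of A

e1, e2 = [1, 0], [0, 1]

print(mat_vec(A, e1))   # [0, 0] -> e1 is in Ker(A)
print(mat_vec(A, e2))   # [1, 0] -> e2 is NOT in Ker(A)
print(mat_vec(At, e2))  # [0, 0] -> e2 is in Ker(A^T)
print(mat_vec(At, e1))  # [0, 1] -> e1 is NOT in Ker(A^T)
```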
-
0Ok, so right off the bat, $\ker(A)$ is not equal to the perp of the image, so there's an example of a non-projection matrix for which the identity fails (for some reason I didn't see it). Thanks a lot. – 2011-03-11
$A$ and $A^{T}$ are similar, so the answer should be the same. You can try the standard Jordan decomposition and see how the Jordan blocks work. Working out a few $2\times 2$ cases would be very helpful.
-
0So $\ker(A)$ is always equal to $\ker(A^{T})$, do I understand you correctly? I'm not sure what a Jordan block is; we are just beginners. – 2011-03-11
-
0Yes - I am also a beginner. I would be glad to know if I were wrong and someone could point it out. For the Jordan decomposition you can see here: – 2011-03-11
-
0http://en.wikipedia.org/wiki/Jordan_normal_form – 2011-03-11
-
0I will look into that, but it seems like a bold claim -- it would be a very important property of any square matrix if its kernel were equal to the kernel of its transpose. – 2011-03-11
-
0Yeah, it seems overly bold. I forgot that under a similarity $A^{T}=P^{-1}AP$ the kernel gets moved: $\ker(A^{T})=P^{-1}\ker(A)$, which need not equal $\ker(A)$ itself. Still, I feel the difference is small - the two kernels should be isomorphic. – 2011-03-11
-
1@LinAlgStudent: It is not true in general. Take $$A=\left(\begin{array}{cc}0&1\\0&0\end{array}\right), \quad A^T=\left(\begin{array}{cc}0&0\\1&0\end{array}\right).$$ The kernel of $A$ is $\{(x,0)\mid x\in\mathbb{R}\}$, the kernel of $A^T$ is $\{(0,y)\mid y\in\mathbb{R}\}$. – 2011-03-11
-
0@user7887: Yes, they are "isomorphic", but that tells you precious little: it just tells you they have the same dimension. – 2011-03-11
I think the simplest condition guaranteeing that the kernel of a matrix equals the kernel of its transpose is that the matrix be symmetric: then $A=A^{T}$, so the two kernels coincide trivially.
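A quick sketch supporting this sufficiency claim (the specific matrix is my own example): if $A$ is symmetric then $A=A^{T}$ literally, so $\ker(A)=\ker(A^{T})$ with nothing to check. Below, the symmetric singular matrix $A=\left[\begin{smallmatrix}1&1\\1&1\end{smallmatrix}\right]$ kills $(1,-1)$, and so does its transpose.

```python
# Symmetric case: A = A^T forces Ker(A) = Ker(A^T) trivially.

def mat_vec(M, v):
    """Multiply a 2x2 matrix (list of rows) by a length-2 vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def transpose(M):
    """Transpose of a 2x2 matrix."""
    return [[M[0][0], M[1][0]], [M[0][1], M[1][1]]]

A = [[1, 1], [1, 1]]  # symmetric and singular
v = [1, -1]           # spans Ker(A)

print(transpose(A) == A)         # True: A is symmetric
print(mat_vec(A, v))             # [0, 0] -> v is in Ker(A)
print(mat_vec(transpose(A), v))  # [0, 0] -> v is in Ker(A^T)
```

Note that symmetry is sufficient but need not be the only condition under which the kernels agree; the thread's examples show that it can fail for non-symmetric matrices, but it does not settle the general characterization.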