
Let $A:E\rightarrow F$ be a linear transformation between finite-dimensional vector spaces, with $\mathrm{Rank}(A)=r$, $\dim E=n$ and $\dim F=m$. Prove that there are bases of $E$ and $F$ in which the matrix of $A$ has entries $a_{11}=\cdots=a_{rr}=1$ and $a_{ij}=0$ everywhere else.

I thought of the change of basis $AP=QA'$, where $A'$ would be the matrix we want, but since I have no information about $A$, $P$ or $Q$, this is definitely not a way out. Then, since the rank is the maximum number of independent columns and rows, I thought I could just erase the ones that are linearly dependent, but this doesn't guarantee that the transformation with those dependent columns and rows deleted is still the same transformation.

A hint would be appreciated. Thanks in advance.

  • 0
    If you want to kill a fly with a sledgehammer, you could apply the Singular Value Decomposition.2011-06-25
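
For the curious, here is roughly what that sledgehammer looks like numerically: a minimal `numpy` sketch (the concrete matrix and the rescaling of the right singular vectors are my own choices for illustration) that uses the SVD to build change-of-basis matrices $P$ and $Q$ with $Q^{-1}AP$ equal to an identity block padded with zeros.

```python
import numpy as np

# Hypothetical rank-1 example: a 2x3 matrix, so m = 2, n = 3, r = 1.
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])
m, n = A.shape

U, s, Vt = np.linalg.svd(A)        # A = U @ diag(s) @ Vt, with U and Vt orthogonal
r = int(np.sum(s > 1e-10))         # numerical rank

# New basis of E: the columns of V, the first r of them rescaled by 1/sigma_i,
# so that A maps them exactly onto the first r columns of U.
P = Vt.T.copy()
P[:, :r] /= s[:r]
# New basis of F: the columns of U.
Q = U

print(np.round(np.linalg.inv(Q) @ A @ P, 10))
# [[1. 0. 0.]
#  [0. 0. 0.]]
```

In the notation of the question, these $P$ and $Q$ are exactly the change-of-basis matrices in $AP=QA'$.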

3 Answers

0

This is a mix of Javier's and Agustí's answers. I can take any basis of $E$, say $\{v_{1},\dots, v_{n}\}$, look at the images $\{Av_{1},\dots, Av_{n}\}$, and extract from them a basis of the image, which after reordering I may write as $\{Av_{1},\dots, Av_{r}\}$. Obviously the vectors $v_{1},\dots, v_{r}$ don't belong to $\ker(A)$. Since $\dim\ker(A)=\dim E-\dim(\mathrm{Im}(A))=n-r$, I can take a basis of the kernel, say $\{v'_{r+1},\dots, v'_{n}\}$. Then $\{v_{1},\dots, v_{r},v'_{r+1},\dots, v'_{n}\}$ is the basis asked for in $E$: if $\sum_{i\leq r}c_i v_i+\sum_{j>r}d_j v'_j=0$, applying $A$ gives $\sum_{i\leq r}c_i Av_i=0$, so every $c_i=0$ because the $Av_i$ are independent, and then every $d_j=0$ as well. For $F$ the basis is $\{Av_{1},\dots, Av_{r},w_{r+1},\dots, w_{m}\}$, since I can always complete the basis of the image to a basis of the whole space $F$.
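
As a sanity check, here is a numerical sketch of this construction (the example matrix, the tolerance, and the greedy way of picking independent vectors are my own choices; `numpy` is assumed):

```python
import numpy as np

def canonical_bases(A, tol=1e-10):
    """Sketch of the construction above: return P (new basis of E, as columns)
    and Q (new basis of F, as columns) so that inv(Q) @ A @ P = [[I_r, 0], [0, 0]].
    A is the matrix of the map in the standard bases."""
    m, n = A.shape
    r = np.linalg.matrix_rank(A, tol)

    # 1. Among the images A v_1, ..., A v_n of the standard basis of E,
    #    greedily pick r that are linearly independent: a basis of im A.
    keep, cols = [], []
    for j in range(n):
        trial = cols + [A[:, j]]
        if np.linalg.matrix_rank(np.column_stack(trial), tol) == len(trial):
            keep.append(j)
            cols.append(A[:, j])
        if len(keep) == r:
            break

    # 2. A basis of ker A: the last n - r right singular vectors.
    _, _, Vt = np.linalg.svd(A)
    kernel = Vt[r:].T                                  # shape (n, n - r)

    # 3. Basis of E: the chosen v_j first, then the kernel basis.
    P = np.hstack([np.eye(n)[:, keep], kernel])

    # 4. Basis of F: the images A v_j, completed greedily with standard vectors.
    F_basis = [A[:, j] for j in keep]
    for i in range(m):
        trial = F_basis + [np.eye(m)[:, i]]
        if np.linalg.matrix_rank(np.column_stack(trial), tol) == len(trial):
            F_basis.append(np.eye(m)[:, i])
    Q = np.column_stack(F_basis)

    return P, Q

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])                           # rank-1 example
P, Q = canonical_bases(A)
print(np.round(np.linalg.inv(Q) @ A @ P, 10))
# [[1. 0. 0.]
#  [0. 0. 0.]]
```

This is only a finite-precision illustration, of course; the proof itself is the basis argument above.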

Thanks to both (Javier and Agustí).

  • 0
    indeed, find a basis of the image by expressing $A\cdot v$ as a linear combination of independent vectors of $F$. Then complete it to a basis of $F$. Now, solve the system $A\cdot u=0$ to get a basis of $\ker A$; complete it to a basis of $E$. Then in those bases $A$ has the required matrix form.2011-06-26
4

Let $\left\{ u_1, \dots , u_n\right\}$ be a basis of $E$. Then, $\left\{ Au_1, \dots , Au_n \right\}$ is a system of generators of the subspace $\mathrm{im} A \subset F$. Whenever you have a system of generators of a vector (sub)space, you can delete some of them in order to obtain a basis. Since $r= \mathrm{rank} A = \mathrm{dim}\ \mathrm{im}A$, you can take $r$ of those $Au_i$ to form a basis of $\mathrm{im}A$. Reordering the original basis $u_1, \dots , u_n$ if necessary, we can assume that these are the first ones. So $Au_1, \dots , Au_r$ are a basis of $\mathrm{im}A$.

Now, you have $r\leq m$ linearly independent vectors $Au_1, \dots , Au_r$ in $F$. You can always complete a set of linearly independent vectors in order to obtain a basis of your vector space. So, choose $m-r$ vectors $v_{r+1}, \dots , v_m \in F$ such that $Au_1, \dots , Au_r, v_{r+1}, \dots , v_m$ is a basis of $F$.

And you're done: $\left\{ u_1, \dots , u_n\right\} \subset E$ and $\left\{ Au_1, \dots , Au_r, v_{r+1}, \dots, v_m\right\} \subset F$ are the bases you were looking for.

EDIT. I'm afraid my answer is wrong. If you perform the steps in it, the matrix you'll obtain looks like

$ \begin{pmatrix} 1 & 0 & \dots & 0 & a^1_{r+1} & \dots & a^1_n \\ 0 & 1 & \dots & 0 & a^2_{r+1} & \dots & a^2_n \\ \vdots & \vdots & \ddots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 1 & a^r_{r+1} & \dots & a^r_n \\ 0 & 0 & \dots & 0 & 0 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 0 & 0 & \dots & 0 \end{pmatrix} $

But you have no control over the remaining $a^i_j$.
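
To make this concrete, here is a small (hypothetical) rank-1 example in `numpy`: keeping the original basis $u_1,u_2$ of $E$ and completing $Au_1$ to a basis of $F$ as described above leaves a nonzero entry $a^1_2$; only adapting the basis of $E$ as well (putting kernel vectors last) removes it.

```python
import numpy as np

# Hypothetical rank-1 map on R^2 with A u1 = A u2 = (1, 1).
A = np.array([[1., 1.],
              [1., 1.]])

# Keep the standard basis u1, u2 of E; take A u1 = (1, 1) as the basis of im A
# and complete it with v2 = (0, 1) to a basis of F (the columns of Q).
Q = np.array([[1., 0.],
              [1., 1.]])

print(np.linalg.inv(Q) @ A)
# [[1. 1.]
#  [0. 0.]]   <- the (1, 2) entry is 1, not 0
```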

  • 0
    Yes, but I should rewrite my whole answer. And you already did it. :-)2011-06-28
3

The columns of the matrix of $A$ are the components of the images of the basis vectors of $E$, i.e. $A(u_i)=A\cdot u_i$ for some basis $\{u_1,\dots,u_n\}$ of $E$. Now the null space $\ker A=\{v\in E\,|\, A(v)=0\in F\}$ is a vector subspace of $E$, and the image space $\text{im}\, A=\{w\in F\,|\,\exists v\in E,\ w=A(v)\}$ is a subspace of $F$; they satisfy the rank-nullity theorem $\text{rank}(A)+\text{nullity}(A)=n$, that is $ \dim(\text{im}\,A)+\dim(\ker A) = \dim E. $

This means $E$ decomposes into the space of vectors which map to $0$ in $F$ through $A$, that is $\ker A$, and a complementary subspace $E-\ker A=:\widetilde{\text{im}}\, A < E$, the space of vectors which map onto the nonzero vectors of the image of $A$; i.e. $E=\widetilde{\text{im}}\, A\oplus \ker A$. In fact this is just another way of stating the first isomorphism theorem for vector spaces (where $<$ means vector subspace): $ E>\widetilde{\text{im}}\, A\cong \frac{E}{\ker A}\cong \text{im}\,A. $

With this decomposition of $E$, you choose a basis of $E$ formed by the $r$ basis vectors $\{u_1,\dots,u_r\}$ of $\widetilde{\text{im}}\, A$ and the $n-r$ basis vectors $\{u_{r+1},\dots,u_n\}$ of $\ker A$. In this basis the matrix of $A$, whose columns are the components of the vectors $A(u_i)$, has zeros in the last $n-r$ columns, since those are images of vectors in the null space. The first $r$ columns are $A(u_1),\dots,A(u_r)$; since $A$ has rank $r$ these must be linearly independent and thus a basis of $\text{im}\, A$. Choose $\{A(u_1),\dots,A(u_r)\}$ as the new basis of the image, and complete it to a basis of $F$ by adding some $\{v_1,\dots, v_{m-r}\}\subset F$.

Therefore you have arrived at bases of $E$ and $F$ in which the images of the $E$-basis vectors decompose in $F$ as $ \begin{aligned} & A(u_i)=\sum_{j=1}^r \delta_{ij}A(u_j)+\sum_{j=1}^{m-r} 0\cdot v_j\;\;\text{for } u_i\in\widetilde{\text{im}}\, A \\ & A(u_k)=\sum_{j=1}^r 0\cdot A(u_j)+\sum_{j=1}^{m-r} 0\cdot v_j\;\;\text{for } u_k\in\ker A \end{aligned} $ which gives you the matrix of $A$ as $ A=\begin{pmatrix} 1 & 0 & \cdots & 0 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 & 0 & \cdots & 0 \\ 0 & 0 & \cdots & 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & 0 & \cdots & 0 \\ \end{pmatrix}= \begin{pmatrix} \mathbf{I}_{r\times r} & \mathbf{0}_{r\times (n-r)} \\ \mathbf{0}_{(m-r)\times r} & \mathbf{0}_{(m-r)\times (n-r)} \\ \end{pmatrix}. $

  • 0
    @missing314: the example you mention has $\ker A=\langle(1,1)\rangle=\text{im}\, A$, but what I say is that $\widetilde{\text{im}}\,A=\mathbb{R}^2-\langle(1,1)\rangle=\langle(-1,1)\rangle$, i.e. the complement of $\ker A$ satisfies $\widetilde{\text{im}}\,A\cong\text{im}\,A\cong E/\ker A\Rightarrow E=\widetilde{\text{im}}\,A\oplus\ker A$, which is still true for your example by the isomorphism $\widetilde{\text{im}}\,A\ni (-x,x)\mapsto (x,x)\in\ker A$. This is because of the special case $\text{im}\,A\subseteq\ker A$ for endomorphisms, but in the GENERIC CASE $\text{im}\,A\cap\ker A=\{0\}$ and thus $\text{im}\,A=\widetilde{\text{im}}\,A$.2011-06-25