
I'm trying some old exam questions to prepare for my linear algebra exam and there is the following question, which I can't figure out.

Given the following linear transformation $$ L: \mathbb{R}^4 \rightarrow \mathbb{R}^4 : (x,y,z,w) \mapsto (2x,2y-w,-2x+4y-2w,5z+5w)$$ Find a basis $\alpha$ and a basis $\beta$, both in $\mathbb{R}^4$, such that $$L_\alpha^\beta =\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ \end{pmatrix} $$ where $L_\alpha^\beta$ is the matrix representation of L with respect to bases $\alpha$ and $\beta$.

I'm not really sure how to begin with this problem. If we take an arbitrary vector $(a,b,c,d)$ in $\mathbb{R}^4$ and multiply it by $L_\alpha^\beta$, we get the following: $$L_\alpha^\beta \cdot \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} = \begin{pmatrix} a \\ 0 \\ c \\ d \end{pmatrix},$$

which means that all vectors in the basis $\beta$ will be of that form. Is that even possible? How can I express a vector $(0,1,0,0) \in \mathbb{R}^4$ with respect to that basis? Am I completely misunderstanding something?

Can someone point me in the right direction?

Thank you.

EDIT: I think I've found $v_2 = (0,\frac{1}{2},-1,1)$, but I'm still stuck. I hope this edit bumps the post and someone will be able to help me.
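A candidate kernel vector like this can be sanity-checked numerically by applying the standard matrix of $L$ to it. A minimal sketch using NumPy (the matrix $A$ below is just $L$ written out row by row; the names are my own):

```python
import numpy as np

# Standard matrix of L: (x, y, z, w) -> (2x, 2y - w, -2x + 4y - 2w, 5z + 5w)
A = np.array([[ 2, 0, 0,  0],
              [ 0, 2, 0, -1],
              [-2, 4, 0, -2],
              [ 0, 0, 5,  5]], dtype=float)

v2 = np.array([0, 0.5, -1, 1])
print(A @ v2)  # the zero vector, so v2 lies in the kernel of L
```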

2 Answers


The $i$-th column of the matrix $L_\alpha^\beta$ contains the coordinates of $L(\alpha_i)$ with respect to the basis $\beta$. Since the second column is the zero vector, $\alpha_2$ must be an element of the null space (or kernel) of $L$.

Based on your edit, I believe you were already looking in this direction. I'd take $(0,1,-2,2)$, but yours works too. Since you want its image in the second column, you take this vector as $\alpha_2$. Extend $\alpha$ to a complete basis of $\mathbb{R}^4$, e.g. by adding standard basis vectors: $$\alpha = \left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ -2 \\ 2 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix} \right\}$$ Now the trick behind the simple form of $L_\alpha^\beta$ is to choose the basis vectors of $\beta$ carefully: use the images of the $\alpha$'s! That is, pick $\beta_j = L(\alpha_j)$ for $j=1,3,4$, but not for $j=2$, because by our choice of $\alpha_2$ we have $L(\alpha_2) = 0$. Instead, extend $\beta$ to a basis of $\mathbb{R}^4$ by picking, for example, $\beta_2 = (0,0,1,0)^T$.


For $\beta_1$, we find the image of $\alpha_1$: $$\beta_1 = L(\alpha_1) = L \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix} =\begin{pmatrix} 2 \\ 0 \\ -2 \\ 0 \end{pmatrix} $$ Repeat this for $\beta_3$ and $\beta_4$ to get: $$\beta = \left\{ \begin{pmatrix} 2 \\ 0 \\ -2 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 2 \\ 4 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 0 \\ 5 \end{pmatrix} \right\}$$
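To double-check the construction, one can verify numerically that $L_\alpha^\beta = Q^{-1} A P$, where $A$ is the standard matrix of $L$ and the columns of $P$ and $Q$ are the $\alpha$'s and $\beta$'s respectively. A minimal NumPy sketch (the names $A$, $P$, $Q$, $M$ are my own):

```python
import numpy as np

# Standard matrix of L: (x, y, z, w) -> (2x, 2y - w, -2x + 4y - 2w, 5z + 5w)
A = np.array([[ 2, 0, 0,  0],
              [ 0, 2, 0, -1],
              [-2, 4, 0, -2],
              [ 0, 0, 5,  5]], dtype=float)

# Columns of P are the alpha basis vectors; columns of Q are the beta vectors
P = np.column_stack([[1, 0, 0, 0], [0, 1, -2, 2],
                     [0, 1, 0, 0], [0, 0, 1, 0]]).astype(float)
Q = np.column_stack([[2, 0, -2, 0], [0, 0, 1, 0],
                     [0, 2, 4, 0], [0, 0, 0, 5]]).astype(float)

# L_alpha^beta = Q^{-1} A P: coordinates (w.r.t. beta) of the images of the alpha's
M = np.linalg.solve(Q, A @ P)
print(np.round(M).astype(int))  # prints the diag(1, 0, 1, 1) matrix from the question
```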


Notice that other choices are possible: you fix $\alpha_2$ and then complete $\alpha$ to a basis by adding three linearly independent vectors. You then choose the $\beta$'s as the images of these three basis vectors you added to $\alpha$ to obtain the given, simple form of the transformation matrix. These three will span the image of $L$ and you simply extend $\beta$ to a basis by adding any linearly independent vector.

  • Wow, thanks a lot! This is really helpful. I really hate these kinds of questions because to me this looks a lot like guessing: picking vectors of the standard basis to complete $\alpha$, choosing another vector $\beta_2$ to complete the basis $\beta$. It feels like guesswork to me. (2017-01-06)
  • It's not _guessing_, it's _choosing_ (but it's important to realize when and why you can choose). And if you can choose, don't complicate things for yourself. You could take an uglier basis $\alpha$ (as long as you keep $\alpha_2$): it would still work, as long as you take the images of these $\alpha$'s as your $\beta$'s. Once you realize this, why pick annoying $\alpha$'s? As for $\beta_2$, you don't "need" this vector because the other three already span the image. But you do need $\beta$ to be a basis, so you have to add one more linearly independent vector. (2017-01-06)

I think you could use the singular value decomposition here. The computations would likely be easier in matrix form, though, since I'm not sure what the adjoint of $L$ would be here. Does anyone have thoughts on using the SVD for this?
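For what it's worth, here is a sketch of that idea in NumPy (an assumption on my part; working with the standard matrix $A$ of $L$ sidesteps the adjoint question entirely). Writing $A = U\Sigma V^T$, take $\alpha$ to be the columns of $V$ and $\beta_j = \sigma_j u_j$ for the nonzero singular values, keeping $u_j$ itself where $\sigma_j = 0$. Since $L$ has rank 3, this yields the required pattern of ones and a single zero on the diagonal, except that the SVD puts the zero column last rather than second; reordering the basis vectors would fix that.

```python
import numpy as np

# Standard matrix of L: (x, y, z, w) -> (2x, 2y - w, -2x + 4y - 2w, 5z + 5w)
A = np.array([[ 2, 0, 0,  0],
              [ 0, 2, 0, -1],
              [-2, 4, 0, -2],
              [ 0, 0, 5,  5]], dtype=float)

U, s, Vt = np.linalg.svd(A)  # A = U @ np.diag(s) @ Vt
# L has rank 3, so exactly one singular value is (numerically) zero.
# alpha = columns of V; beta_j = s_j * u_j when s_j > 0, else u_j itself.
P = Vt.T
Q = U * np.where(s > 1e-10, s, 1.0)  # scales column j of U by s_j (or by 1)
M = np.linalg.solve(Q, A @ P)
print(np.round(M).astype(int))       # diag(1, 1, 1, 0): the zero column comes last
```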