
If I have a matrix $A$ and I want to find a basis for $C(A)$ (the column space),

I first find $\operatorname{rref}(A)$. Is it true that the columns of $A$ corresponding to the pivot columns of $\operatorname{rref}(A)$ form a basis of $C(A)$?

And if so, please explain why this is true. Why are the columns of $A$ corresponding to the pivot columns of $\operatorname{rref}(A)$ linearly independent just because the pivot columns are independent?

PS: Please don't use advanced linear algebra in your answer; I am a beginner (obviously).

  • I don't know if I studied everything in the 'normal' order, but I know everything up to the null space and rank and such. (2012-10-11)

1 Answer


Assuming that elementary matrices are not too advanced, here is an answer. Technically it's possible to show the result using only row reductions, but that is rather tedious and not nearly as clear.

Suppose that we have a matrix $A$ with reduced row echelon form $R$. Then there is a sequence of elementary matrices that brings $A$ to $R$; let $E$ be their product, so that $EA = R$. Writing the matrices column-wise, we have $E\begin{pmatrix}\mathbf{a_1} & \cdots & \mathbf{a_n}\end{pmatrix} = \begin{pmatrix}E\mathbf{a_1} & \cdots & E\mathbf{a_n}\end{pmatrix} = \begin{pmatrix}\mathbf{r_1} & \cdots & \mathbf{r_n}\end{pmatrix},$ where $\mathbf{a_i}$ and $\mathbf{r_i}$ are the columns of $A$ and $R$ respectively. From this we get the equality $E\mathbf{a_i} = \mathbf{r_i}$ for $1\le i \le n$.

Now consider the equation $c_1\mathbf{r_1} + \cdots + c_n\mathbf{r_n} = \mathbf{0}$ for scalars $c_i$. We can rewrite this as $c_1\left(E\mathbf{a_1}\right) + \cdots + c_n\left(E\mathbf{a_n}\right) = E\left(c_1\mathbf{a_1} + \cdots + c_n\mathbf{a_n}\right)=\mathbf{0}.$ Since $E$ is a product of elementary matrices, it is invertible, so multiplying by $E^{-1}$ gives $c_1\mathbf{a_1} + \cdots + c_n\mathbf{a_n} = \mathbf{0} \iff c_1\mathbf{r_1} + \cdots + c_n\mathbf{r_n} = \mathbf{0},$ and it's easy to see that the same holds for any subset of the columns. What this means is that elementary row operations preserve linear relations between columns. If a set of columns of $R$ is linearly independent, then the corresponding columns of $A$ remain linearly independent after transforming $R$ back to $A$ via elementary row operations.

Likewise, if a subset of the columns of $R$, say $\left\{\mathbf{r_{k_1}},\ \ldots,\ \mathbf{r_{k_\ell}}\right\}$, spans the column space of $R$, then each $\mathbf{r_i}$ can be written as a linear combination of that subset, say $c_1\mathbf{r_{k_1}} + \cdots + c_{\ell}\mathbf{r_{k_\ell}} = \mathbf{r_i}$. Correspondingly, we will have $c_1\mathbf{a_{k_1}} + \cdots + c_{\ell}\mathbf{a_{k_\ell}} = \mathbf{a_i}$, so the corresponding columns of $A$ remain a spanning set.

Finally, the pivot columns of $R$ are linearly independent (each has a $1$ in a row where all the others have $0$), and every column of $R$ is a linear combination of them. By the two facts above, the corresponding columns of $A$ are linearly independent and span $C(A)$, so they form a basis of $C(A)$.
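To see this concretely, here is a small sketch using SymPy (the $3\times 3$ example matrix is my own; in it the third column is the sum of the first two, so the rank is $2$). `Matrix.rref()` returns the reduced row echelon form together with the pivot column indices, and we check that the pivot columns of $A$ are independent and span $C(A)$:

```python
from sympy import Matrix

# Hypothetical example: column 3 = column 1 + column 2, so rank(A) = 2.
A = Matrix([[1, 2, 3],
            [4, 5, 9],
            [7, 8, 15]])

R, pivots = A.rref()  # rref() returns (rref matrix, tuple of pivot column indices)
print(pivots)         # (0, 1)

# Collect the columns of A (not of R!) at the pivot positions.
B = Matrix.hstack(*[A.col(i) for i in pivots])

# Independence: B has full column rank.
print(B.rank() == len(pivots))  # True

# Spanning: B x = a_i is solvable for every column a_i of A,
# i.e. adjoining any column of A does not increase the rank.
print(all(B.row_join(A.col(i)).rank() == B.rank() for i in range(A.cols)))  # True
```

Swapping in other matrices (including rectangular ones) gives the same outcome, which is exactly the relation-preservation argument above in computational form.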