
If $e = (e_1,e_2,\dots,e_n)$ is a basis of $\mathbb R^n$ and $P$ is an arbitrary invertible $n \times n$ matrix, prove that if $f_j = Pe_j$ for $j = 1,\dots,n$, then $f = (f_1,f_2,\dots,f_n)$ is another basis of $\mathbb R^n$.


What I have found:

Let's say $ P = \begin{pmatrix} p_{11} & p_{12} & \cdots & p_{1n}\\ p_{21} & p_{22} & \cdots & p_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ p_{n1} & p_{n2} & \cdots & p_{nn} \end{pmatrix} , e_j = \begin{pmatrix} e_{1j}\\ e_{2j}\\ \vdots\\ e_{nj} \end{pmatrix}, $ and $ f_j = \begin{pmatrix} f_{1j}\\ f_{2j}\\ \vdots\\ f_{nj} \end{pmatrix} $; then $f_j=Pe_j= \begin{pmatrix} p_{11} & p_{12} & \cdots & p_{1n}\\ p_{21} & p_{22} & \cdots & p_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ p_{n1} & p_{n2} & \cdots & p_{nn} \end{pmatrix} e_j= \begin{pmatrix} p_{11} e_{1j} + p_{12} e_{2j} + \cdots + p_{1n} e_{nj}\\ p_{21} e_{1j} + p_{22} e_{2j} + \cdots + p_{2n} e_{nj}\\ \vdots \\ p_{n1} e_{1j} + p_{n2} e_{2j} + \cdots + p_{nn} e_{nj} \end{pmatrix} = \begin{pmatrix} f_{1j}\\ f_{2j}\\ \vdots\\ f_{nj} \end{pmatrix}$

Thus we know the components of each $f_j$ in terms of $P$ and $e_j$.

But then, how do I know whether $f$ is linearly independent and spans $\mathbb R^n$?
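As a concrete sanity check of the component computation above, here is a small $n = 2$ instance with a made-up invertible $P$ (the numeric values are illustrative only, not part of the problem):

```python
import numpy as np

P = np.array([[1.0, 2.0],
              [3.0, 4.0]])       # det(P) = -2, so P is invertible
e1 = np.array([1.0, 0.0])        # standard basis vectors
e2 = np.array([0.0, 1.0])

f1 = P @ e1                      # f_1 = P e_1, the first column of P
f2 = P @ e2                      # f_2 = P e_2, the second column of P

# f_1, f_2 are linearly independent iff the matrix with columns
# f_1, f_2 has nonzero determinant
F = np.column_stack([f1, f2])
assert abs(np.linalg.det(F)) > 1e-12
```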

  • Hint: You don't need to break down into components. Just use the definition of "linearly independent" and multiply by $P^{-1}$. (2017-02-02)

2 Answers


Let $x\in\Bbb R^n$. Then

$$x=\left(PP^{-1}\right)x=P\underbrace{\left(P^{-1}x\right)}_{=:y}$$

Since the $e_i$ form a basis, we can write $y=\sum_{i=1}^ny_ie_i$, and thus

$$x=P\sum_{i=1}^ny_ie_i = \sum_{i=1}^nP(y_ie_i) = \sum_{i=1}^ny_i(Pe_i)= \sum_{i=1}^ny_if_i\text.$$

This proves that $\{f_i\}$ is a generating set of $\Bbb R^n$.

For linear independence: Let $t_i\in\Bbb R$ with $\sum_{i=1}^nt_if_i=0$. Then

$$0 = \sum_{i=1}^nt_if_i = \sum_{i=1}^nt_iPe_i = P\sum_{i=1}^nt_ie_i$$

Multiplying by $P^{-1}$ yields

$$\underbrace{P^{-1}0}_{=0} = P^{-1}P\sum_{i=1}^nt_ie_i = \sum_{i=1}^nt_ie_i$$

and hence $t_i = 0$ for $i = 1,\dots,n$, since the $e_i$ are linearly independent.
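The two steps of this argument can be checked numerically for a concrete invertible $P$ (a sketch with illustrative values; the proof above is what actually establishes the claim):

```python
import numpy as np

n = 3
P = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
assert abs(np.linalg.det(P)) > 1e-12   # P is invertible

e = np.eye(n)                  # standard basis vectors as columns
f = P @ e                      # f_i = P e_i, columns of f

# Spanning: for any x, the coordinates y = P^{-1} x satisfy x = sum_i y_i f_i
x = np.array([1.0, -2.0, 5.0])           # an arbitrary vector
y = np.linalg.solve(P, x)                # y = P^{-1} x
x_reconstructed = f @ y                  # sum_i y_i f_i
assert np.allclose(x_reconstructed, x)

# Independence: sum_i t_i f_i = 0 forces t = 0, because multiplying
# by P^{-1} reduces it to sum_i t_i e_i = 0
t = np.linalg.solve(f, np.zeros(n))
assert np.allclose(t, 0.0)
```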


Let $E$ be the matrix with columns $e_j$ and $F$ the matrix with columns $f_j$. We have $$PE = F,$$ and $\det(F) = \det(P)\det(E) \ne 0$, since $P$ is invertible and $E$ is invertible because its columns form a basis. So the columns of $F$ are linearly independent, and since there are $n$ of them, they form a basis of $\mathbb R^n$.
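A quick numerical illustration of this determinant argument, taking $E$ to be the standard basis and a made-up invertible $P$ (illustrative values only):

```python
import numpy as np

n = 3
E = np.eye(n)                      # columns form a basis, so E is invertible
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 4.0],
              [3.0, 0.0, 1.0]])
assert abs(np.linalg.det(P)) > 1e-12   # P is invertible

F = P @ E                          # columns are f_j = P e_j

# det(F) = det(P) det(E) != 0, so the columns of F are independent
assert np.isclose(np.linalg.det(F), np.linalg.det(P) * np.linalg.det(E))
assert abs(np.linalg.det(F)) > 1e-12
```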