
Let $P_3$ be the set of all real polynomials of degree 3 or less. This set forms a real vector space. Show that $\{2x^3+x+1,x−2,x^3−x^2\}$ is a linearly independent set, and find a basis for $P_3$ which includes these three polynomials.


Linear independence is easy: I just put the coefficients in a matrix and row-reduce;

$\begin{bmatrix}2&1&1\\0&1&-2\\1&-1&0\end{bmatrix} \xrightarrow{\text{rref}} \begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$
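(The row reduction itself can be double-checked with sympy; the matrix and names below are just my own setup.)

```python
from sympy import Matrix

# The 3x3 coefficient matrix as I set it up above.
A = Matrix([
    [2,  1,  1],
    [0,  1, -2],
    [1, -1,  0],
])

R, pivots = A.rref()
print(R)       # the 3x3 identity
print(pivots)  # pivot columns: (0, 1, 2)
```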

So I've shown that the set is linearly independent. Also, I have 3 pivots for 3 variables, so I think I can write the vector space $P_3$ as the span of the polynomials (or of their coefficient vectors);

$P_3 = \operatorname{span}\{(2, 1, 1),(0, 1,-2),(1, -1, 0)\}$

Since a basis is simply a linearly independent set that spans the space, are these three vectors a basis for $P_3$, or do I have more work to do? (EDIT: that is, if we pretend I hadn't completely erased an entire degree from the polynomials.)

Also, given that $P_3$ is described in terms of polynomials (and assuming I've done the above properly): should I answer questions like this using the polynomials themselves, or is it okay to simply work with vectors/matrices of coefficients?

  • There is something wrong in your proof of independence. Check the degrees: you can't have coefficients $(2,1,1)$ for the first and at the same time $(1,-1,0)$ for the last (the same degree must go in the same "place"). The correct coefficients would be $\{(2,0,1,1),(0,0,1,-2),(1,-1,0,0)\}$. (2017-02-04)
  • Yep, I see exactly what I've done: I've basically counted $x^1$ and $x^2$ as the same thing, smh. (2017-02-04)

2 Answers


Hint: how would one write a polynomial of degree $n<1$ in your space?

A polynomial of degree $0$ is of the form $c\,x^0$, meaning it's just a constant. This is how your matrix should look initially: from top to bottom, the rows hold the coefficients of $x^3,x^2,x^1,x^0$, so your vectors should have 4 components, not 3. \begin{bmatrix} 2&0&1 \\ 0&0&-1 \\ 1&1&0 \\ 1&-2&0 \end{bmatrix}
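If sympy is available, the independence check on this corrected matrix can be sketched like so (the variable names are my own):

```python
from sympy import Matrix

# Columns are the coefficient vectors of 2x^3 + x + 1, x - 2, x^3 - x^2;
# rows are the coefficients of x^3, x^2, x^1, x^0 (top to bottom).
A = Matrix([
    [2,  0,  1],
    [0,  0, -1],
    [1,  1,  0],
    [1, -2,  0],
])

# Full column rank (3) means the only solution of A*x = 0 is x = 0,
# i.e. the three polynomials are linearly independent.
print(A.rank())  # 3
```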

As for your question about the polynomials versus the matrices: we use matrices because it's easier to show linear independence with them. Showing independence is equivalent to showing $A\vec x=\vec 0 \Rightarrow \vec x=\vec 0$, and it would be harder to do that with the polynomials themselves.

The columns of the following matrix (your three polynomials together with $x^3$ as a fourth) form a basis, since they are linearly independent. \begin{bmatrix} 2&0&1&1 \\ 0&0&-1&0 \\ 1&1&0&0 \\ 1&-2&0&0 \end{bmatrix}
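One way to confirm this (again a sympy sketch, not part of the pencil-and-paper argument):

```python
from sympy import Matrix

# The three polynomials plus the fourth column (1,0,0,0)^T, i.e. x^3,
# written in the same row order x^3, x^2, x^1, x^0.
M = Matrix([
    [2,  0,  1, 1],
    [0,  0, -1, 0],
    [1,  1,  0, 0],
    [1, -2,  0, 0],
])

# A nonzero determinant means the four columns are independent, hence a
# basis of the 4-dimensional space P_3.
print(M.det())  # -3
```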

  • Ah, WHOOPS. I think my brain might be melting... (2017-02-04)

Comment: One basis for $P_3$ is $$\{1,x,x^2,x^3\}$$ so this should tell you that your set of three polynomials couldn't possibly be a basis (too few elements; all bases have the same cardinality).

I have to wonder too if you really understand linear independence. If I were grading this question on a test, I would give no credit for your "proof". You should start explicitly with the hypothesis that $$a_1p_1+a_2p_2+a_3p_3=0$$ and show that this is equivalent to $$a_1=0,\ a_2=0,\ a_3=0$$ (here the $p_i$ are the polynomials you list).
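Concretely, expanding that hypothesis and collecting by degree gives
$$a_1(2x^3+x+1)+a_2(x-2)+a_3(x^3-x^2)=(2a_1+a_3)x^3-a_3x^2+(a_1+a_2)x+(a_1-2a_2)=0,$$
and since each coefficient must vanish, $a_3=0$, then $2a_1+a_3=0$ forces $a_1=0$, and $a_1+a_2=0$ forces $a_2=0$.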

  • But this is not part of the question, actually. (2017-02-04)
  • @Jean-ClaudeArbaut: Then I will amend this to a comment. (2017-02-04)
  • While I messed up the entries and dimensions of the initial matrix (by erasing a degree of the polynomials), what I had could still be row-reduced to the identity matrix. Does that not imply linear independence of the rows of the original matrix? (2017-02-05)
  • Yes, but the original matrix didn't represent the polynomials you wanted. Still, if a matrix can be row-reduced to the identity, then it has full rank, so yes, its rows are independent. (2017-02-05)