
Following up on the question I asked here, which is:

Let $P(\lambda)=(\lambda-\lambda_{0})^{r}$, where $r$ is a positive integer. Prove that the equation $P(\frac{d}{dt})x(t)=0$ has solutions $t^{i}e^{\lambda_{0}t}$, $i=0,1,\ldots,r-1$.

I now wish to prove the solutions are linearly independent.

I have two questions regarding this:

  1. I learned to prove such independence with the Wronskian, but I am having trouble calculating it (I computed the derivatives of $e^{\lambda_{0}t}$ and $te^{\lambda_{0}t}$, but it gets too hard for higher powers of $t$, since the expressions keep getting longer). How can I calculate the Wronskian?

  2. If I think of the vector space of smooth real-valued functions, then it seems that this set (taking the power of $t$ as big as I want, but finite) is linearly independent. Did I deduce this correctly?

I would appreciate any help!
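As a sanity check on the statement from the earlier question, one can verify symbolically (here with SymPy; the order $r=4$ is an arbitrary example, not from the original) that each $t^{i}e^{\lambda_{0}t}$ is annihilated by $P(\frac{d}{dt})=(\frac{d}{dt}-\lambda_{0})^{r}$:

```python
import sympy as sp

t, lam = sp.symbols('t lambda0')
r = 4  # example order; any positive integer works the same way

def apply_P(x):
    """Apply the operator (d/dt - lambda0) to x, r times."""
    for _ in range(r):
        x = sp.diff(x, t) - lam * x
    return sp.simplify(x)

# Each application of (d/dt - lambda0) lowers the power of t by one,
# so after r applications every t^i e^(lam t) with i < r is killed.
for i in range(r):
    assert apply_P(t**i * sp.exp(lam * t)) == 0
print("all", r, "candidate solutions are annihilated by P(d/dt)")
```

Note that $t^{r}e^{\lambda_{0}t}$ is *not* annihilated, consistent with the solution space having dimension exactly $r$.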

  • I was unsure what other tags to add; feel free to add tags if you think they are related. (2012-06-16)

2 Answers


To show that $t^i e^{\lambda_0t}$ are linearly independent, it suffices to show that $t^i$ are linearly independent.

  • Why? Can you please add an explanation? (2012-06-16)
  • Write the definition of "linearly independent". If you don't know it, look it up. Compare the two cases I mention. You may use the fact that $e^{\lambda_0 t} \ne 0$. (2012-06-16)
  • Do we need to add that the vector space of all smooth real-valued functions has no zero divisors for this? Edit: do you need it not to be the zero function, or that it is nonzero for all $t$? (2012-06-16)
  • "Zero divisor" is not a concept for vector spaces. (2012-06-16)
  • Can we still draw such a conclusion if there were a $t$ such that $e^{\lambda_{0}t}=0$? Maybe we can say this is a ring under pointwise multiplication and then use the argument about zero divisors? (2012-06-16)
  • You "can" do whatever you want, but it is not needed to solve this problem. (2012-06-16)
  • I am asking what a good argument is to show this is sufficient. Since $e^{\lambda_{0}t}\neq 0$ I can argue that, but if there were a $t$ s.t. $e^{\lambda_{0}t}=0$, could we still argue it is sufficient? (2012-06-16)
  • @Belgi Is there an $x$ such that $e^x=0$? (2012-06-16)
  • @PeterTamaroff No; I am asking whether the argument could have been made if the function had a point where it is zero (in other words, how critical is the fact that $e^x$ has no zeros). (2012-06-16)
  • @Belgi Tell me if my answer cleared the doubts. (2012-06-16)
  • @Belgi If there is a point where the function that is a factor of all the terms you are working with is zero, and you are trying to prove linear independence, then work away from that point. In practice this has never caused me a problem with such proofs. The thing to remember is that functions are different if they differ at a single point. As you say, this is not a problem here. (2012-06-16)
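The reduction in the answer above can be spot-checked symbolically. One way to see the connection is the standard identity $W(gf_0,\dots,gf_{n-1})=g^{n}\,W(f_0,\dots,f_{n-1})$: the common factor $e^{\lambda_0 t}$ comes out of every column of the Wronskian. A minimal SymPy sketch (the size $n=4$ is an arbitrary choice for illustration):

```python
import sympy as sp

t, lam = sp.symbols('t lambda0')
n = 4  # arbitrary size for illustration

plain  = [t**i for i in range(n)]                    # 1, t, ..., t^(n-1)
scaled = [t**i * sp.exp(lam * t) for i in range(n)]  # same, times e^(lam t)

W_plain  = sp.simplify(sp.wronskian(plain, t))
W_scaled = sp.simplify(sp.wronskian(scaled, t))

# The exponential factors out of every column:
# W_scaled = e^(n*lam*t) * W_plain, and e^(n*lam*t) is never zero,
# so the two Wronskians vanish at exactly the same points (namely, never).
assert sp.simplify(W_scaled - sp.exp(n * lam * t) * W_plain) == 0
print(W_plain)  # 12, i.e. 0!*1!*2!*3!
```

Since $e^{n\lambda_0 t}$ never vanishes, one Wronskian is nonzero exactly where the other is, which is the sense in which it "suffices" to handle the $t^i$ alone.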

Since you have some doubts, I'll try to give you a longer answer and maybe clear them up.

The assertion that the $n$ functions $f_k(t)=t^k e^{\lambda_0 t}$, $k=0,1,\dots,n-1$, are linearly independent means that if

$$\sum_{k=0}^{n-1} c_k f_k(t)= e^{\lambda_0 t}(c_0+c_1 t +c_2 t^2+\cdots+c_{n-1} t^{n-1})=0$$

then

$$c_0=c_1=\cdots=c_{n-1}=0$$

Since $e^{\lambda_0 t}\neq 0$ for every $t$, it suffices to prove that if

$$c_0+c_1 t +c_2 t^2+\cdots+c_{n-1} t^{n-1}=0$$

then $$c_0=c_1=\cdots=c_{n-1}=0,$$ or, equivalently, that the Wronskian determinant of the $n$ functions $p_k(t)=t^k$, $k=0,1,\dots,n-1$, is never zero.

For example, for the case $n=3$, we have the functions

$$y_0(t)=1$$ $$y_1(t)=t$$ $$y_2(t)=t^2$$

The Wronskian determinant is

$$W(y_0,y_1,y_2)=\begin{vmatrix} {1}& { t} &{ t^2} \\ {0}& {1} &{2 t} \\ {0}& {0} &{2 } \end{vmatrix}=1 \cdot 1 \cdot 2 = 0!\,1!\,2!$$

since every other term in the expansion of the determinant contains a factor of $0$.
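The $3\times 3$ case can be checked directly, for instance symbolically with SymPy (not part of the original answer):

```python
import sympy as sp

t = sp.symbols('t')

# Rows of the Wronskian matrix: the functions 1, t, t^2
# and their first and second derivatives.
funcs = [sp.Integer(1), t, t**2]
W = sp.Matrix([[sp.diff(f, t, k) for f in funcs] for k in range(3)])

print(W)        # upper triangular with diagonal 1, 1, 2
print(W.det())  # 2
```

The matrix is upper triangular, so its determinant is the product of the diagonal entries, $1 \cdot 1 \cdot 2 = 2$, in agreement with the computation above.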

You can try and prove the Wronskian determinant

$$\begin{vmatrix} 1 & t & \cdots & {{t^{n - 2}}} & {{t^{n - 1}}} \\ 0 & 1 & \cdots & {\left( {n - 2} \right){t^{n - 3}}} & {\left( {n - 1} \right){t^{n - 2}}} \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & {\left( {n - 2} \right)!} & {\left( {n - 1} \right)!t} \\ 0 & 0 & \cdots & 0 & {\left( {n - 1} \right)!} \end{vmatrix}$$

will be equal to $0!\,1!\,2! \cdots (n-1)!$, so it cannot be zero.
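This claim is easy to spot-check symbolically for small $n$ (a SymPy sketch; the bound $n \le 6$ is an arbitrary choice):

```python
import sympy as sp
from math import factorial, prod

t = sp.symbols('t')

# The Wronskian of 1, t, ..., t^(n-1) should equal 0!*1!*...*(n-1)!,
# a nonzero constant, for every n.
for n in range(1, 7):
    W = sp.wronskian([t**i for i in range(n)], t)
    expected = prod(factorial(k) for k in range(n))
    assert sp.simplify(W - expected) == 0

print("Wronskian of 1, t, ..., t^(n-1) equals 0!1!...(n-1)! for n up to 6")
```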

Alternatively, one can prove the result by induction: assume the result is proven for up to $n-1$ functions and show it holds for $n$. I think it is much easier to use the Wronskian determinant.

  • The Wronskian is not actually required here: the linear independence follows just from the fact that a nonzero polynomial of degree at most $n-1$ has no more than $n-1$ roots. (2012-06-16)
  • @Artem What do you mean by "required"? The Wronskian is a fair argument, as valid as yours. I think saying "just from the fact" is rather disrespectful to the Fundamental Theorem of Algebra. Now, if you can adjoin a proof of that, we'll be cool. =D (2012-06-16)
  • What I meant was that the notion of the Wronskian is far less "obvious" (known, etc.) than the "common wisdom" fact about the roots of a polynomial. (2012-06-16)
  • @Artem Sure. The OP explicitly asked about the Wronskian, so I used that. (2012-06-16)