
Consider a general $n$th order linear equation $$x^{(n)}(t)+a_{n-1}(t)x^{(n-1)}(t)+ \dots + a_{1}(t)x'(t) + a_{0}(t)x(t)=0.\tag{$*$}$$ Let $x_1, x_2 , \dots , x_n$ be a fundamental set of solutions of $(*)$ and set $W(t)=W(x_1, x_2 , \dots , x_n ; t).$

Question. Show that a set of solutions $x_1 , x_2 , \dots , x_k$ of $(*)$ are linearly independent over $(-\infty, \infty)$ if and only if their Wronskian $W(x_1 , x_2 , \dots , x_k; t_0) \neq 0$ for some $t_0 \in (-\infty, \infty).$ Also show that those solutions form a vector space of dimension $n$.

My approach: Writing the equivalent first order system,
$$y_1=x ,~y_2=x' ,~\dots~,y_n=x^{(n-1)},$$ from which we get $$y_1'=y_2,~~y_2'=y_3,~~\dots~~,y_{n-1}'=y_n,~~y_n'=-a_{n-1}(t) y_n- \cdots - a_{1}(t) y_2-a_{0}(t) y_1.$$
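The reduction above can be encoded as $y'=A(t)y$ with a companion matrix. A minimal numpy sketch (the helper name `companion_matrix` and the constant coefficients below are my own, purely for illustration):

```python
import numpy as np

def companion_matrix(a):
    """Companion matrix A for y' = A y, where a = [a_0, ..., a_{n-1}]
    are the coefficients of x^(n) + a_{n-1} x^(n-1) + ... + a_0 x = 0."""
    n = len(a)
    A = np.zeros((n, n))
    A[:-1, 1:] = np.eye(n - 1)   # y_i' = y_{i+1}
    A[-1, :] = -np.asarray(a)    # y_n' = -a_0 y_1 - ... - a_{n-1} y_n
    return A

# Example: x'' + 3x' + 2x = 0  ->  a = [2, 3]
A = companion_matrix([2.0, 3.0])
print(A)
```

Note that $\operatorname{tr} A = -a_{n-1}$, which is what makes Abel's formula for the Wronskian (used in the first answer below) come out in terms of $a_{n-1}$ alone.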

For the contrapositive statement: if $W(x_1 , x_2 , \dots , x_k; t_0) = 0$ for some $t_0 \in (-\infty, \infty),$ doesn't that clearly imply that the set of functions $\{ x_1 , x_2 , \dots , x_k \}$ is linearly dependent?

I'm stuck on how to progress any further. Any help in proving this is much appreciated.

  • You got the contrapositive wrong: it must be **for all times $t$** and not *for some $t_0$*. (2017-01-12)

3 Answers

1

Only a remark on the contrapositive: for arbitrary functions, the vanishing of the Wronskian is a necessary but not sufficient condition for linear dependence. As an example, take $x_1(t)=t^2$ and $x_2(t)=t|t|$. Then $W(x_1,x_2;t)=0$ for every $t \in \mathbb{R}$, but $x_1(t)$ and $x_2(t)$ are linearly independent. To see this, consider the equation $c_1t^2+c_2t|t|=0$ and note what happens if you take $t=1$ and then $t=-1$: the two evaluations force $c_1=c_2=0$. This can happen because $x_1(t)$ and $x_2(t)$ are not both solutions of any single linear differential equation of the form $(*)$.
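This counterexample is easy to verify numerically. A small numpy sketch (variable names are mine) checking both claims at once:

```python
import numpy as np

t = np.linspace(-2, 2, 9)
x1, dx1 = t**2, 2*t                    # x1 = t^2,   x1' = 2t
x2, dx2 = t*np.abs(t), 2*np.abs(t)     # x2 = t|t|,  x2' = 2|t|

W = x1*dx2 - dx1*x2                    # 2x2 Wronskian determinant
print(np.allclose(W, 0))               # the Wronskian vanishes identically

# Yet c1*t^2 + c2*t|t| = 0 at t = 1 and t = -1 forces
# c1 + c2 = 0 and c1 - c2 = 0, hence c1 = c2 = 0:
M = np.array([[1.0, 1.0],    # values (x1, x2) at t = 1
              [1.0, -1.0]])  # values (x1, x2) at t = -1
print(np.linalg.det(M))      # nonzero, so only the trivial combination works
```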

In order to achieve the proof you need, observe that if the fundamental solutions of an $n$th order linear equation are linearly dependent, then so are $x_1'(t),\dots,x_n'(t)$. To show it, take the linear combination $c_1x_1(t)+\dots+c_nx_n(t)=0$ and differentiate it. The converse is obvious. Thus, if you take the vectors with entries $x_j^{(i)}(t)$, where $i$ is the order of differentiation ($i=0,\dots,n-1$) and $j=1,\dots,n$ indexes the solutions, and place them as the columns of a matrix, the determinant of that matrix is the Wronskian. You conclude that the Wronskian is non-zero at a given $t \in \mathbb{R}$ if and only if the columns are linearly independent at that $t$.

Now, we can finish the proof by recalling Liouville's theorem: the Wronskian of the solutions of a system of linear differential equations $x'=A(t)x$ satisfies the differential equation $W'=\operatorname{tr}(A(t))\,W$ (in your case, with the $y_i$ defined as you did, the matrix $A$ takes a particular companion form). This is true since, for any interval $(\tau,\tau+\epsilon)$, the matrix $\mathbf{W}(t)$ of the solutions satisfies

$$ \mathbf{W}(\tau+\epsilon)=\mathbf{W}(\tau)+\epsilon \mathbf{W}'(\tau)+o(\epsilon)=(I_n+\epsilon A(\tau))\mathbf{W}(\tau)+o(\epsilon). $$

Taking determinants (Binet's theorem) and recalling that $\det(I_n+\epsilon A)=1+\epsilon \operatorname{tr}(A)+O(\epsilon^2)$ (observe that the eigenvalues of $I_n+\epsilon A$ are $1+\epsilon \lambda_i$, where the $\lambda_i$ are the eigenvalues of $A$), you have that
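The determinant expansion can be checked numerically on a random matrix; a short numpy sketch (purely illustrative, the error should shrink like $\epsilon^2$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

errors = []
for eps in (1e-2, 1e-3, 1e-4):
    exact = np.linalg.det(np.eye(4) + eps * A)
    approx = 1.0 + eps * np.trace(A)   # first-order expansion
    errors.append(abs(exact - approx))

print(errors)   # decreasing roughly by a factor of 100 per step
```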

$$ W(\tau+\epsilon)=\bigl(1+\epsilon \operatorname{tr}(A(\tau))\bigr)W(\tau)+o(\epsilon), $$ from which it follows that

$$ \frac{W(\tau+\epsilon)-W(\tau)}{\epsilon}=\operatorname{tr}(A(\tau))W(\tau)+o(1), $$ which for $\epsilon \to 0$ gives the claimed differential equation.

Liouville's theorem implies that $W(t)=W(\tau)\exp\left(\int_\tau^t \operatorname{tr}(A(s))\, ds\right)$, so the Wronskian is either identically zero or non-zero everywhere. This concludes the proof.
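As a concrete illustration of Liouville's (Abel's) formula, consider $x'' - (2/t^2)x = 0$ on $t>0$, with solutions $t^2$ and $1/t$; since the coefficient of $x'$ is zero, $\operatorname{tr}(A(t))=0$ and the Wronskian must be constant. A sympy sketch (the example equation is my own choice, not from the thread):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x1, x2 = t**2, 1/t   # solutions of x'' - (2/t^2) x = 0 on t > 0

# verify that both really solve the equation
for x in (x1, x2):
    assert sp.simplify(sp.diff(x, t, 2) - 2*x/t**2) == 0

# a_1(t) = 0, so tr A = 0 and Abel/Liouville predicts W is constant
W = sp.simplify(x1*sp.diff(x2, t) - sp.diff(x1, t)*x2)
print(W)   # -3, a nonzero constant: W is non-zero everywhere on t > 0
```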

1

First let us prove that the set of solutions is a vector space. For two solutions $x_1$ and $x_2$ of the above equation and any constant $c$ we have \begin{align*}&\left(\frac{d^n}{dt^n}+a_{n-1}\frac{d^{n-1}}{dt^{n-1}}+\cdots+a_0\right)(cx_1+x_2)\\=&c\left(\frac{d^nx_1}{dt^n}+a_{n-1}\frac{d^{n-1}x_1}{dt^{n-1}}+\cdots+a_0x_1\right)+\left(\frac{d^nx_2}{dt^n}+a_{n-1}\frac{d^{n-1}x_2}{dt^{n-1}}+\cdots+a_0x_2\right)\\=&0+0\\=&0,\end{align*} so $cx_1+x_2$ is again a solution. Hence the solutions form a vector space.

Now assume $W(t_0)\neq0$ for some point $t_0$. Consider the linear combination $$c_1x_1+c_2x_2+\cdots+c_nx_n=0 \quad\text{for all } t.$$ Differentiating repeatedly, we also have \begin{align*} &c_1x_1'+c_2x_2'+\cdots+c_nx_n'=0\\ &c_1x_1''+c_2x_2''+\cdots+c_nx_n''=0\\ &\qquad\vdots\\ &c_1x_1^{(n-1)}+c_2x_2^{(n-1)}+\cdots+c_nx_n^{(n-1)}=0 \end{align*} for all $t$. In particular these equations hold at $t_0$, and in matrix form they read $$W(t_0)C=\mathbf{0}, \qquad \text{where } C=[c_1~c_2~\cdots~c_n]^T$$ and $W(t_0)$ denotes the Wronskian matrix at $t_0$. Since $\det W(t_0)\neq0$, it follows that $C=0$, which shows that $\{x_1,x_2,\dots,x_n\}$ is linearly independent. Since there can be at most $n$ linearly independent solutions, the dimension of this vector space is $n$.
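For a concrete instance of this argument, one can take $x''' - 2x'' - x' + 2x = 0$, which has solutions $e^{-t}$, $e^{t}$, $e^{2t}$, and check that the Wronskian matrix at $t_0=0$ is nonsingular. A numpy sketch (equation and helper name chosen by me for illustration):

```python
import numpy as np

# x''' - 2x'' - x' + 2x = 0 has characteristic roots -1, 1, 2,
# so e^{rt} is a solution for each root r
roots = np.array([-1.0, 1.0, 2.0])

def wronskian_matrix(roots, t):
    """Rows: derivative orders 0..n-1; columns: the solutions e^{r_j t}."""
    n = len(roots)
    return np.array([[r**i * np.exp(r*t) for r in roots] for i in range(n)])

W0 = wronskian_matrix(roots, 0.0)
print(np.linalg.det(W0))   # ~6, a Vandermonde determinant, so W(t0) != 0
# hence W(t0) C = 0 forces C = 0: the three solutions are linearly independent
```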

1

I'll assume all of the coefficients $a_k$ are continuous on the interval of interest $I$.

Theorem: [Existence and Uniqueness] Let $b_0,b_1,\cdots,b_{n-1}$ be fixed constants, and let $t_0$ be a point in the interval of interest $I$ for the ordinary differential equation stated in your problem. Then there exists a unique solution $x(t)$ defined on $I$ such that $$ x^{(k)}(t_0)=b_k,\;\;\; k=0,1,2,\cdots,n-1. $$ Proof: Use Picard iteration to establish existence and uniqueness.
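Picard iteration itself is easy to demonstrate on a toy problem. A sympy sketch for $x'=x$, $x(0)=1$ (my own example; the iterates are the Taylor partial sums of $e^t$):

```python
import sympy as sp

t, s = sp.symbols('t s')

# Picard iteration for the IVP x' = x, x(0) = 1:
#   x_{k+1}(t) = 1 + integral_0^t x_k(s) ds
x = sp.Integer(1)
for _ in range(5):
    x = 1 + sp.integrate(x.subs(t, s), (s, 0, t))

print(sp.expand(x))   # 1 + t + t**2/2 + t**3/6 + t**4/24 + t**5/120
```

Each iterate adds one more Taylor term, and the sequence converges uniformly on compact intervals to $e^t$, the unique solution.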

Theorem: [Wronskian] Let $\{ x_1,x_2,\cdots,x_n \}$ be a set of solutions of the ODE. The Wronskian $W(x_1,x_2,\cdots,x_n)$ vanishes at some point $t_0$ of the interval of interest $I$ iff $\{ x_1,x_2,\cdots,x_n \}$ is a linearly dependent set of functions on $I$.

Proof: If the set of functions $\{ x_1,x_2,\cdots,x_n \}$ is a linearly dependent set of functions on $I$, then there are constants $\alpha_1,\alpha_2,\cdots,\alpha_n$, not all zero, such that $$ \alpha_1 x_1(t)+\alpha_2 x_2(t)+\cdots +\alpha_n x_n(t) = 0,\;\;\; t\in I. $$ By differentiating this equation $n-1$ times, one obtains the matrix equation $$ \left[\begin{array}{cccc} x_1(t) & x_2(t) & \cdots & x_n(t) \\ x_1'(t) & x_2'(t) & \cdots & x_n'(t) \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{(n-1)}(t) & x_2^{(n-1)}(t) & \cdots & x_n^{(n-1)}(t) \end{array}\right] \left[\begin{array}{c} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n\end{array}\right] = 0. $$ The existence of a non-trivial solution $[\alpha_j]$ forces the determinant of the coefficient matrix, which is the Wronskian, to vanish for all $t$. Therefore the Wronskian of the solutions $x_1,x_2,\cdots,x_n$ vanishes identically if the solutions are linearly dependent.

Conversely, suppose that the Wronskian $W(x_1,x_2,\cdots,x_n)$ vanishes at some $t_0 \in I$. Then there are constants $\alpha_1,\alpha_2,\cdots,\alpha_n$, not all zero, such that the matrix equation of the previous paragraph holds at $t_0$. Hence, $$ x(t) = \alpha_1 x_1(t)+\alpha_2 x_2(t) + \cdots + \alpha_n x_n(t) $$ is a solution of your ODE which satisfies $x(t_0)=x'(t_0)=\cdots=x^{(n-1)}(t_0)=0$. By uniqueness of solutions, $x\equiv 0$ on $I$, which proves that the set of functions $\{x_1,x_2,\cdots,x_n\}$ is a linearly dependent set of functions. $\blacksquare$

To see that the set of solutions is an $n$-dimensional vector space, let $S$ be the set of solutions on the interval $I$, and let $t_0 \in I$. Then the map $$ M : S \rightarrow \mathbb{R}^{n} $$ defined by $$ Mx = \left[\begin{array}{c}x(t_0) \\ x'(t_0) \\ \vdots \\ x^{(n-1)}(t_0)\end{array}\right] $$ is linear. This map is injective because $Mx=0$ implies $x\equiv 0$ by uniqueness of solutions, and it is surjective by the existence of a solution for every choice of initial data. So $S$ is $n$-dimensional because $M$ is a linear isomorphism.
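The isomorphism $M$ can be illustrated numerically: solving the IVP realizes $M^{-1}$, and applying the resulting solutions' value/derivative data at a later time to the standard basis vectors yields an invertible matrix. A scipy sketch for the toy equation $x''+x=0$ (example and helper names are mine):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Companion system for x'' + x = 0:  y = (x, x'),  y' = (x', -x)
def rhs(t, y):
    return [y[1], -y[0]]

def evolve(e, t_end):
    """Solve the IVP with initial data e = (x(0), x'(0)) and
    return (x(t_end), x'(t_end)): this is M at t_end applied to M^{-1} e."""
    sol = solve_ivp(rhs, (0.0, t_end), e, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

t1 = 1.3
cols = np.column_stack([evolve([1.0, 0.0], t1),   # the solution cos(t)
                        evolve([0.0, 1.0], t1)])  # the solution sin(t)
# cols ~ [[cos t1, sin t1], [-sin t1, cos t1]]: determinant 1, invertible,
# matching Abel's formula (tr A = 0, so W is constant)
print(np.linalg.det(cols))
```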