
Famously, if the functions $f_1,f_2,\dots,f_n$, each of which possesses derivatives up to order $n-1$ on the interval $I$, are such that the determinant
$$
W(f_1,f_2,\dots,f_n)=\det\left( \begin{array}{cccc} f_1 & f_2 & \cdots & f_n \\ f'_1 & f'_2 & \cdots & f'_n \\ \vdots & \vdots & \ddots & \vdots \\ f_1^{(n-1)} & f_2^{(n-1)} & \cdots & f_n^{(n-1)} \end{array} \right),
$$
called the Wronskian of $f_1,f_2,\dots,f_n$, is nonzero at at least one point of $I$, then the functions are linearly independent on $I$. Equivalently, if the functions $f_1,f_2,\dots,f_n$ possess at least $n-1$ derivatives and are linearly dependent on $I$, then $W(f_1,f_2,\dots,f_n)(x)=0$ for every $x\in I$. So this equivalent statement gives only a necessary condition for linear dependence of the functions on the interval. Fortunately, there is a necessary and sufficient condition for linear dependence of a set of functions $f_1(x),f_2(x),\dots,f_n(x)$, $x\in I$:

A set of functions $f_1(x),f_2(x),\dots,f_n(x)$, $x\in I=[a,b]$, is linearly dependent on $I$ if and only if the determinant below (the Gram determinant of $f_1,\dots,f_n$) is zero:
$$
\det\left( \begin{array}{cccc} \int_{a}^{b} f_1^2\,dx & \int_{a}^{b} f_1f_2\,dx & \cdots & \int_{a}^{b} f_1f_n\,dx \\ \int_{a}^{b} f_2f_1\,dx & \int_{a}^{b} f_2^2\,dx & \cdots & \int_{a}^{b} f_2f_n\,dx \\ \vdots & \vdots & \ddots & \vdots \\ \int_{a}^{b} f_nf_1\,dx & \int_{a}^{b} f_nf_2\,dx & \cdots & \int_{a}^{b} f_n^2\,dx \end{array} \right) = 0.
$$
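For anyone who wants to see the criterion in action, here is a minimal numerical sketch (not a proof): it builds the matrix of inner products by quadrature and inspects its determinant. The helper name `gram_det`, the test functions, and the interval $[0,1]$ are purely illustrative choices, and quadrature only resolves the determinant up to floating-point error.

```python
# Minimal sketch: Gram determinant det( \int_a^b f_i f_j dx ) via quadrature.
import numpy as np
from scipy.integrate import quad

def gram_det(funcs, a, b):
    """Determinant of the Gram matrix of `funcs` over [a, b]."""
    n = len(funcs)
    G = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            G[i, j], _ = quad(lambda x: funcs[i](x) * funcs[j](x), a, b)
    return np.linalg.det(G)

# Independent on [0, 1]: 1, x, x^2  ->  nonzero (the 3x3 Hilbert determinant, 1/2160)
print(gram_det([lambda x: 1.0, lambda x: x, lambda x: x**2], 0.0, 1.0))

# Dependent on [0, 1]: sin^2 x, cos^2 x, 1 satisfy sin^2 + cos^2 - 1 = 0,
# so the determinant is zero up to rounding error.
print(gram_det([lambda x: np.sin(x)**2, lambda x: np.cos(x)**2, lambda x: 1.0],
               0.0, 1.0))
```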

It seems to be a very useful theorem in practice, but I couldn't find its proof. I would really appreciate your help.

  • No, the proof of the converse is also direct. If the rows add up to zero (with coefficients $\alpha_i$), you get that $\sum \alpha_i v_i$ is orthogonal to all of $v_1,\dots,v_n$. So either the sum is zero, or the span of the $v_i$ is not the entire space. Either way, you have linear dependence. (2012-06-08)
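(Spelling out the computation behind this comment: if the rows of the Gram matrix are linearly dependent, i.e. $\sum_i \alpha_i \langle f_i, f_j\rangle = 0$ for every $j$ with the $\alpha_i$ not all zero, then $\bigl\langle \sum_i \alpha_i f_i,\, f_j\bigr\rangle = 0$ for every $j$, hence
$$\Bigl\|\sum_i \alpha_i f_i\Bigr\|^2 = \sum_j \alpha_j \Bigl\langle \sum_i \alpha_i f_i,\, f_j\Bigr\rangle = 0,$$
so $\sum_i \alpha_i f_i = 0$ and the $f_i$ are linearly dependent.)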

2 Answers

10

If we assume that the $f_j$ are continuous, we can define an inner product by $\langle f,g\rangle=\int_a^b f(x)g(x)\,dx$. Choose $g_1,\ldots,g_n$ such that $\{g_1,\ldots,g_n\}$ is orthonormal and $\operatorname{span}\{f_1,\ldots,f_n\}\subseteq\operatorname{span}\{g_1,\ldots,g_n\}$ (such a set exists because the space of continuous functions on $[a,b]$ is infinite-dimensional). We can write $f_i=\sum_{j=1}^n\alpha_{ij}g_j$, and if $\alpha$ denotes the matrix whose entries are $\alpha_{ij}$, then orthonormality gives $\langle f_i,f_k\rangle=\sum_{j=1}^n\alpha_{ij}\alpha_{kj}$, that is, $G=\alpha\alpha^T$, where $G$ is the Gram matrix in the question.

The matrix $G$ is invertible if and only if so is $\alpha$, which gives the result.

Indeed, if $\sum_{k=1}^n\beta_k\alpha_{kj}=0$ for every $j$, with the $\beta_k$ not all $0$, then $\sum_k\beta_kf_k=0$.
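As a small numerical illustration of this factorization (a sketch only, assuming the continuous setting above): take $f_1=1$, $f_2=x$, $f_3=x^2$ on $[-1,1]$ and, for the orthonormal $g_j$, the normalized Legendre polynomials, an illustrative choice whose span contains the $f_i$. Then $\alpha_{ij}=\langle f_i,g_j\rangle$, and $\alpha\alpha^T$ matches the Gram matrix computed directly.

```python
# Sketch: verify G = alpha alpha^T for f = (1, x, x^2) on [-1, 1], using the
# normalized Legendre polynomials as the orthonormal set g_j.
import numpy as np
from scipy.integrate import quad

def inner(u, v, a=-1.0, b=1.0):
    """L^2 inner product <u, v> = integral_a^b u(x) v(x) dx."""
    return quad(lambda x: u(x) * v(x), a, b)[0]

f = [lambda x: 1.0, lambda x: x, lambda x: x**2]             # the f_i
g = [lambda x: 1.0 / np.sqrt(2.0),                           # orthonormal g_j
     lambda x: np.sqrt(1.5) * x,
     lambda x: np.sqrt(5.0 / 8.0) * (3.0 * x**2 - 1.0)]

alpha = np.array([[inner(fi, gj) for gj in g] for fi in f])  # alpha_ij = <f_i, g_j>
G     = np.array([[inner(fi, fj) for fj in f] for fi in f])  # Gram matrix <f_i, f_j>

print(np.allclose(alpha @ alpha.T, G))                       # True
print(np.linalg.det(alpha)**2, np.linalg.det(G))             # equal up to rounding
```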

8

This is really a fact about real finite-dimensional vector spaces with an inner product. Suppose $V$ has dimension $n$, with a positive definite inner product $\langle\,,\,\rangle$, and take an orthonormal basis $e_1,\dots,e_n$. Finally, suppose we have a set of $n$ vectors $f_1,\dots,f_n$. Define two determinants,
$$ A = \det \left( \langle e_i, f_j \rangle \right) \qquad \text{and} \qquad B = \det \left( \langle f_i, f_j \rangle \right). $$
Then $A^2 = B$: writing $M$ for the matrix with entries $M_{ij}=\langle e_i, f_j\rangle$, orthonormality of the $e_i$ gives $\langle f_i, f_j\rangle=\sum_k M_{ki}M_{kj}$, i.e. $\left(\langle f_i,f_j\rangle\right)=M^TM$, so $B=\det(M^TM)=(\det M)^2=A^2$. The columns of $M$ are just the coordinates of the $f_j$ with respect to the $e_i$, so $A=0$ if and only if the $f_j$ are linearly dependent, and by $A^2=B$ the same holds for $B$.
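A quick numerical check of $A^2=B$ (a sketch only, in $\mathbb{R}^n$ with the standard dot product; the standard basis and the random vectors below are illustrative choices):

```python
# Sketch: check A^2 = B in R^n, and that the Gram determinant detects dependence.
import numpy as np

rng = np.random.default_rng(0)
n = 4
F = rng.standard_normal((n, n))     # column j holds the vector f_j
E = np.eye(n)                       # columns are the orthonormal basis e_i

A = np.linalg.det(E.T @ F)          # A = det(<e_i, f_j>)
B = np.linalg.det(F.T @ F)          # B = det(<f_i, f_j>), the Gram determinant

print(np.isclose(A**2, B))          # True

F[:, -1] = F[:, 0] + F[:, 1]        # force a linear dependence among the f_j
print(np.isclose(np.linalg.det(F.T @ F), 0.0))   # Gram determinant collapses to ~0
```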