
I have this problem: I want to prove that if I am choosing a basis of polynomials whose exponents run from $1$ to $n$ and which all satisfy $F(2)=0$, then they are linearly independent.

To be more specific: take a vector space containing $n$ functions, the first of degree $1$ and so on, all of which are $0$ at $x=2$. Why should they be independent?

thanks!

  • If you are "choosing a basis" of anything, then "they" are linearly independent to begin with. Maybe your question is: can I choose a basis $(F_k)_{0\leq k\leq n}$ of the space of all polynomials of degree $\leq n$ such that $F_k(2)=0$ for all $k$? The answer is no: the constant $1$ could not be a linear combination of such $F_k$, since evaluating any combination $\sum_k c_kF_k$ at $x=2$ gives $0$, never $1$. – 2011-02-11

2 Answers


The first thing you need to do is make clear what vector space you are working in. You cannot determine whether a set is a basis unless you know exactly what the vector space in question is.

For example, it is not the same to work in the vector space of all polynomials of degree at most $n$ with coefficients in $\mathbb{R}$ as it is to work in the vector space of all polynomials with coefficients in $\mathbb{R}$; those two vector spaces are in turn different from the vector space of all continuous functions $f\colon\mathbb{R}\to\mathbb{R}$ (with pointwise addition and scalar multiplication), which is different again from the vector space of all differentiable functions, etc.

But then it seems you are not asking for a basis, but rather how one checks whether a given collection of functions is linearly independent. The answer may again depend on the vector space, because different scalars can change the answer. For instance, if you consider the complex numbers as a vector space over $\mathbb{R}$, then $1$ and $i$ are linearly independent, because the only solution to $a\cdot 1+b\cdot i=0$ with $a,b\in\mathbb{R}$ is $a=b=0$; but if you consider $\mathbb{C}$ as a vector space over itself, then $1$ and $i$ are not linearly independent, because if we may pick complex scalars, then $a\cdot 1+b\cdot i=0$ has the solution $a=i$, $b=-1$.
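
In display form, the contrast between the two choices of scalars is:

$$a\cdot 1+b\cdot i=0,\ a,b\in\mathbb{R}\ \Longrightarrow\ a=b=0,\qquad\text{while}\qquad i\cdot 1+(-1)\cdot i=0,\ i,-1\in\mathbb{C}.$$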

So, assuming you know what your vector space is: suppose $\mathbf{f}_1,\ldots,\mathbf{f}_n$ are your functions. You need to know whether there are any nonzero solutions to $a_1\mathbf{f}_1+\cdots+a_n\mathbf{f}_n=0$ as functions; that is, the value at every $x$ in the domain should be zero. At that point there are any number of ways of proceeding, depending on the precise nature of the functions. You assume you have that equality, and through any valid means (algebra, calculus, etc.) you try to show that you must have $a_1=\cdots=a_n=0$; if this is the case, they are linearly independent. If there is at least one way to obtain the zero function with not all of $a_1,\ldots,a_n$ equal to $0$, then they are not linearly independent.
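
As a concrete illustration of this recipe, here is a minimal SymPy sketch (my own choice of tool, applied to the polynomials $(x-2)^k$ the question seems to have in mind; the variable names are illustrative only):

```python
# Test whether f1 = (x-2), f2 = (x-2)^2, f3 = (x-2)^3 are linearly
# independent by finding all solutions of
#     a1*f1 + a2*f2 + a3*f3 = 0   (as a function of x).
import sympy as sp

x, a1, a2, a3 = sp.symbols('x a1 a2 a3')

combo = sp.expand(a1*(x - 2) + a2*(x - 2)**2 + a3*(x - 2)**3)

# A polynomial is the zero function iff all of its coefficients
# vanish, so equate each coefficient of x to zero and solve.
coeffs = sp.Poly(combo, x).coeffs()
print(sp.solve(coeffs, [a1, a2, a3], dict=True))
# [{a1: 0, a2: 0, a3: 0}] -> only the trivial solution, so the
# three polynomials are linearly independent.
```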

If you care to be a bit more precise as to what your problem is, I and others may be able to provide more specific hints as opposed to generalities.


I am assuming that your answer to my comment meant that your functions are of the form $(x-2),(x-2)^2,(x-2)^3,\ldots$.

Now pick any finite subset of your set of polynomials. By adding finitely many more of the functions if necessary, we may assume that this finite subset is of the form $(x-2),(x-2)^2,\ldots,(x-2)^k$. So assume we have a dependence relation

$a_1(x-2)+a_2(x-2)^2+\cdots+a_k(x-2)^k=0$.

Differentiating $k-1$ times kills every term of degree less than $k-1$ and gives $(k-1)!\,a_{k-1}+k!\,a_k(x-2)=0$. Choosing $x=2$ gives $a_{k-1}=0$, and then choosing $x=1$ gives $a_k=0$. Continuing this way (differentiating two fewer times at each step), we get $a_i=0$ for all $i$. Thus your set of functions is linearly independent (over the underlying field).
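
To see the pattern in the smallest case, take $k=2$: differentiating $a_1(x-2)+a_2(x-2)^2=0$ once gives $a_1+2a_2(x-2)=0$; choosing $x=2$ gives $a_1=0$, and then choosing $x=1$ gives $-2a_2=0$, hence $a_2=0$.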

  • I differentiated $k-1$ times. – 2011-01-23