
The following problem was taken from Halmos's Finite Dimensional Vector Spaces:

Let $(a_0, a_1, a_2, \ldots)$ be an arbitrary sequence of complex numbers. Let $x \in P$, where $P$ is the vector space of all polynomials over $\mathbb{C}$. Write $x(t) = \sum_{i=0}^n \xi_it^i$ and $y(x) = \sum_{i=0}^n \xi_i a_i$. Prove that $y(x)$ is an element of the dual space $P'$ consisting of all linear functionals on $P$ and that every element of $P'$ can be obtained in this manner by a suitable choice of the $a_i$.

Now the first part of the problem is not hard to show, as $y$ is the functional that takes a polynomial in the vector space, multiplies each coefficient $\xi_i$ by the corresponding $a_i$, and sums all these products. Then it only remains to check that for such a $y$,

$y(Ax_1 + Bx_2) = Ay(x_1) + By(x_2)$, where $x_1, x_2 \in P$ and $A, B \in \mathbb{C}$.
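Since $y$ acts purely on coefficient lists, the linearity check can be sanity-tested numerically. Below is a small sketch (the coefficient-list representation, the sample values, and the helper `lin_comb` are my own illustration, not from Halmos):

```python
def y(coeffs, a):
    """Apply the functional y(x) = sum_i xi_i * a_i to the polynomial
    with coefficient list coeffs = [xi_0, xi_1, ...]."""
    return sum(xi * ai for xi, ai in zip(coeffs, a))

def lin_comb(A, x1, B, x2):
    """Coefficient list of A*x1 + B*x2 (pad the shorter list with zeros)."""
    n = max(len(x1), len(x2))
    x1 = x1 + [0] * (n - len(x1))
    x2 = x2 + [0] * (n - len(x2))
    return [A * c1 + B * c2 for c1, c2 in zip(x1, x2)]

a = [3, -1, 2, 5]            # an arbitrary choice of (a_0, a_1, a_2, ...)
x1, x2 = [1, 2], [0, 1, 4]   # x1(t) = 1 + 2t,  x2(t) = t + 4t^2
A, B = 2, -3

# Linearity: y(A*x1 + B*x2) == A*y(x1) + B*y(x2)
assert y(lin_comb(A, x1, B, x2), a) == A * y(x1, a) + B * y(x2, a)
```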

Now for the second part, I don't know how to prove that every linear functional in the dual space must have the form of $y$ above. I can think of specific examples e.g. integrals of polynomials and see why this is true, but it is plain that this will not suffice.

There is a more promising approach I can think of. This may not be correct, but if we view the $\xi_i$'s as some "basis" (i.e. prove they are linearly independent) and somehow the linear combination represented by $y(x)$ "spans" the dual space, then this may suffice.

Perhaps even related is the question: if $y$ is a non-zero functional on a vector space $V$ and if $\alpha$ is an arbitrary scalar, does there necessarily exist a vector $x \in V$ such that $y(x) = \alpha$?

Please do not leave complete answers as I would like to complete this myself.

$\textbf{Note}$: The notation $y(x)$ does not mean a function evaluated at a point e.g. $y=f(x)=x^2$ but rather a functional $y$ evaluated at a vector $x$ belonging to some vector space.

  • Let $f$ be a linear functional. It is determined by the values $f(X^k)$, $k\in\mathbb N$. What is the link between $f(X^k)$ and $a_k$? (2011-07-12)
  • Apply $y \in P'$ to monomials... (if $x_k = t^k$ then $y(x_k) = ?$) (2011-07-12)
  • Hint: a linear functional on a vector space $V$ is completely determined by its values on a basis of $V$. The tuple $(1,x,x^2,\dots)$ is a basis of the vector space of polynomials with coefficients in $\mathbb{C}$ ... (2011-07-12)
  • A note to your note: well, functionals *are* functions, in the usual set-theoretic meaning! So the notation $y(x)$ for a functional $y\in P'$ and a vector $x\in P$ *is* the same notation as ever. (2011-07-12)
  • @Theo Buehler I am guessing your comment is related to statements (1) and (2) of Amitesh's answer below. (2011-07-12)
  • Yes, exactly. By the way, the first name suffices :) (2011-07-12)
  • @Theo Please see my latest comment below. I think that proves the problem in Halmos that I was having difficulty with. (2011-07-12)

2 Answers


Let $(v_1, v_2, \ldots, v_n)$ be a basis of a vector space $V$ over a field $\mathbb{F}$. It is not hard to show that for each $i=1,2,\ldots,n$ there exists a unique linear functional $\Lambda_i$ such that $\Lambda_i[v_j] = \delta_{ij}$ for all $j$. Now assume that these $\Lambda_i$'s are linearly dependent, i.e. there exist scalars $a_1, a_2, \ldots, a_n$, not all zero, such that

$a_1\Lambda_1 + a_2\Lambda_2 + \cdots + a_n\Lambda_n = 0.$

Let the L.H.S. operate on $v_1$, so that $a_1 = 0$. Continuing this process for all the vectors in the basis shows that every $a_i$ is zero. It follows that the set of linear functionals $\{\Lambda_1, \Lambda_2, \ldots, \Lambda_n\}$ is linearly independent. Now for any $v \in V$, write $v = \sum_{i=1}^n \xi_i v_i$, where the $\xi_i \in \mathbb{F}$. Then by definition of the $\Lambda_i$, we have

$\Lambda_1[v] = \xi_1, \quad \Lambda_2[v] = \xi_2, \quad \ldots \quad \Lambda_n[v] = \xi_n.$

Next we note that for any $y$ in $V^{*}$

$\begin{eqnarray*} y[v] &=& \xi_1y[v_1] + \xi_2y[v_2] + \cdots + \xi_ny[v_n] \\ &=& \Lambda_1[v]y[v_1] + \Lambda_2[v]y[v_2] + \cdots + \Lambda_n[v]y[v_n] \end{eqnarray*}$

so that any linear functional is a linear combination of the $\Lambda_i$'s. Since the $\Lambda_i$'s are linearly independent, it is plain that $\{\Lambda_1, \Lambda_2, \ldots, \Lambda_n\}$ is a basis of the dual space $V^{*}$.
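As a sanity check of this argument (not part of the proof), here is a small numerical sketch in the concrete case $V = \mathbb{C}^3$ with the standard basis, where $\Lambda_i$ simply picks out the $i$-th coordinate; the functional `y` and the vector `v` below are arbitrary made-up values:

```python
def y(v):
    # a concrete linear functional on C^3, chosen arbitrarily
    return 5 * v[0] - 2 * v[1] + 7 * v[2]

e = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # standard basis v_1, v_2, v_3
coeffs = [y(ei) for ei in e]           # the values y[v_i]

def Lambda(i, v):
    """Dual-basis functional: Lambda_i[v] = xi_i, the i-th coordinate of v."""
    return v[i]

v = [1, 4, -3]  # arbitrary vector, with coordinates xi_i

# y agrees with the combination y[v_1]*Lambda_1 + y[v_2]*Lambda_2 + y[v_3]*Lambda_3
assert y(v) == sum(coeffs[i] * Lambda(i, v) for i in range(3))
```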

  • +1 Yes, the proof is essentially correct. However, I should note the following points: (1) In the sentence beginning with "Now for any $v\in V$, write $v=\sum_{i=0}^n \xi_i v_i$ ...", it should be $v=\sum_{i=1}^n \xi_i v_i$ (instead of $\sum_{i=0}^n \xi_i v_i$). (2) The $\xi_i$'s are scalars and it is not relevant to the proof whether or not they are linearly independent (in fact, it is not true for $n>1$). I think you meant that any linear functional is a linear combination of the $\Lambda_i$'s (and not the $\xi_i$'s) toward the end of the proof. (2011-07-13)
  • However, these are only typos and the proof is otherwise correct. (2011-07-13)
  • ***Exercise 4*** is the most relevant to your question, but you will be able to answer it easily since the idea is similar to ***Exercise 3***. (2011-07-13)
  • By the way, have you heard of the book *Linear Algebra Done Right* by Sheldon Axler? It is an excellent book (in my opinion) and covers similar ground to Halmos, but the approach is different. Axler prefers to avoid the use of determinants in the proofs of some of the key results in linear algebra, and this yields a very elegant approach to the subject. You may wish to have a look at Axler's text. However, duality, tensor products, quotient spaces and chapter 4 of Halmos are not covered in Axler. On the other hand, there is a fairly detailed treatment of operators in Axler that is not in Halmos. (2011-07-13)
  • I second @Amitesh's recommendation. You can download some sample chapters on [Axler's homepage](http://linear.axler.net/). (2011-07-13)
  • Please also see [Down with Determinants!](http://www.axler.net/DwD.pdf) for a taste of the approach taken by Axler in his textbook. (2011-07-13)
  • I have seen Axler's book before, and the reason he avoids determinants (according to Axler) is that they are unintuitive and not elegant. (2011-07-13)
  • @Amitesh Datta I have made the necessary corrections (I got mixed up with the variables before). Just one doubt left: in the last line above we have a functional operating on a vector in the vector space, so from there can we always say that the functional $y$ by itself (operating on nothing) is of the form $\Lambda_1y[v_1] + \Lambda_2y[v_2] + \cdots + \Lambda_ny[v_n]$? (2011-07-13)
  • @Amitesh Datta By the way, just from your exercises I have learnt so many things! I like it this way: you don't tell me straight away what the answers are. Thanks for pointing out the mistakes as well; better to know what mistakes have been made and learn from them. (2011-07-13)
  • @Theo Buehler I got to know of Axler's book while asking something on linear algebra; someone from this forum recommended that I download chapter 7 or 8 to obtain some theorem I needed. (2011-07-13)
  • @DLim Yes. Note that two linear functionals on $V$ are equal (as elements of the dual space of $V$) if and only if their actions on a particular basis of $V$ are equal. Also, $(\Lambda_1y[v_1]+\cdots+\Lambda_ny[v_n])(v_i)=y[v_i]\Lambda_i(v_i)=y[v_i]$ for all $1\leq i\leq n$. Therefore, $y=\Lambda_1y[v_1]+\cdots+\Lambda_ny[v_n]$. (2011-07-13)
  • @DLim Also, thanks for your comment regarding my exercises! It is nice to hear that they are useful. Actually, in my time participating on math.stackexchange.com I have discovered that (if I may say so) I am very good at inventing exercises. I am happy to supply exercises on a particular topic if you ask. Of course, you can also look at some of my other answers on this website where I have (almost always) provided exercises. (2011-07-13)
  • @Amitesh Datta Sorry, I am a bit confused: I understand your comment above, but how does it apply to equating the functionals in the last line of my answer above? I sort of get what you are saying (in regard to how it applies to my answer). (2011-07-13)
  • @DLim If $\Gamma_1$ and $\Gamma_2$ are functionals on $V$, then $\Gamma_1=\Gamma_2$ (as functionals) if and only if $\Gamma_1(v)=\Gamma_2(v)$ for all $v\in V$ (by definition). Therefore, $\Gamma_1=\Gamma_2$ (as functionals) if and only if their actions on a particular basis of $V$ are equal. If we apply this statement to the functionals $y$ and $\Lambda_1y[v_1]+\cdots+\Lambda_ny[v_n]$, then we discover that they are equal since their actions on the basis $(v_1,\dots,v_n)$ of $V$ are equal. (2011-07-13)

The following steps lead to a solution:

(1) Prove that the tuple $(1,x,x^2,\dots)$ is a basis for the complex vector space $V$ of all polynomials (in the variable $x$) with coefficients in $\mathbb{C}$.

(2) If $p(x)=\sum_{i=0}^n a_ix^i$ and if $\Lambda$ is a linear functional on $V$, prove that $\Lambda(p(x))=\sum_{i=0}^n a_i\Lambda(x^i)$. (Hint: the linearity of the functional $\Lambda$ is, of course, relevant.)

(3) Therefore, the linear functional $\Lambda$ is completely determined by the values $\Lambda(x^i)$ for each integer $i\geq 0$.
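Statements (2) and (3) can be illustrated with a small sketch (the values in `lam_on_powers` and the polynomial `p` below are hypothetical, chosen just for the illustration): once the values $\Lambda(x^i)$ are fixed, $\Lambda$ is pinned down on every polynomial.

```python
# Hypothetical values Lambda(1), Lambda(x), Lambda(x^2), Lambda(x^3)
lam_on_powers = [2, 0, -1, 4]

def Lam(coeffs):
    """Lambda(p) for p(x) = sum_i a_i x^i, computed using only the values
    of Lambda on the powers of x, as in statement (2)."""
    return sum(a * lam_on_powers[i] for i, a in enumerate(coeffs))

p = [3, 1, 0, 2]  # p(x) = 3 + x + 2x^3
assert Lam(p) == 3 * 2 + 1 * 0 + 0 * (-1) + 2 * 4  # linearity gives 14
```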

The following exercises are relevant:

Exercise 1: Prove the Riesz representation theorem for finite dimensional Hilbert spaces:

If $H$ is a finite dimensional Hilbert space and if $\Lambda$ is a linear functional on $H$, then there exists a unique vector $y\in H$ such that $\Lambda(x)=\langle x,y \rangle$ for all $x\in H$. (Note that $\langle \cdot,\cdot \rangle$ is the inner product on $H$.)

(Hint: note that $H$ has an orthonormal basis.)

Exercise 2: Let $V$ be a finite dimensional vector space over $\mathbb{C}$ and let $(v_1,\dots,v_n)$ be a basis of $V$. Prove that if $(a_1,\dots,a_n)$ is a tuple of complex numbers, then the map $\Lambda:V\to\mathbb{C}$ given by $\Lambda(\sum_{i=1}^n c_iv_i)=\sum_{i=1}^n c_ia_i$ (where $c_i\in\mathbb{C}$ for all $1\leq i\leq n$) is a linear functional on $V$. Conversely, prove that all linear functionals on $V$ have this form.

Exercise 3: Let $V$ be a finite dimensional vector space over $\mathbb{C}$ and let $(v_1,\dots,v_n)$ be a basis of $V$. If $1\leq i\leq n$, let $\Lambda_i:V\to \mathbb{C}$ be defined by the rule $\Lambda_i(v_i)=1$ and $\Lambda_i(v_j)=0$ if $j\neq i$. Prove that $\Lambda_i$ is a linear functional on $V$ for all $1\leq i\leq n$. If $V^{*}$ is the dual space of $V$ (= vector space of all linear functionals on $V$), prove that $(\Lambda_1,\dots,\Lambda_n)$ is a basis of $V^{*}$.

Exercise 4: Let $V$ be the vector space of all polynomials (in the variable $x$) with coefficients in $\mathbb{C}$. If $i\geq 0$, let $\Lambda_i:V\to \mathbb{C}$ be defined by the rule $\Lambda_i(x^i)=1$ and $\Lambda_i(x^j)=0$ if $j\neq i$. Prove that $\Lambda_i$ is a linear functional on $V$ for all $i\geq 0$. Is the tuple $(\Lambda_0,\Lambda_1,\Lambda_2,\dots)$ a basis of the dual space of $V$?

I hope this helps!

  • @Theo: Dear Theo, thanks for the correction; actually, it was not a typo. I have been using $\Sigma$ instead of $\sum$ ever since I have been typing LaTeX (!) but you are indeed right that $\sum$ is more appropriate. May I ask who Ben is? (I am not sure that there is a Ben involved with this question ...) (2011-07-12)
  • @Theo ***Exercise 3*** and ***Exercise 4*** address the paragraph in the OP's question beginning with "There is a more promising approach ...". (2011-07-12)
  • Thanks for the updates! Now your exercises are of the usual quality. I removed my earlier comments as they revealed more about D Lim than he himself chose to after the change of his user name. (2011-07-12)
  • @Amitesh Datta For exercise 1, we calculate the Wronskian of $x^n$ and $x^m$, $n \neq m$ and $n,m \geq 0$. Supposing the Wronskian is zero leads to the contradiction that $n=m$. So the Wronskian is non-zero and hence the functions $\{1,x,x^2,\ldots\}$ are linearly independent. The fact that these functions span the whole vector space follows from the fact that any polynomial with coefficients in $\mathbb{C}$ is a linear combination of these vectors. So the set you mentioned in exercise 1 is a basis of this vector space. (2011-07-12)
  • @Amitesh Datta Exercise 2 follows from writing out the polynomial you stated and then using linearity: $\Lambda(p(x)) = \Lambda(a_0 + a_1x + \ldots + a_nx^n) = a_0\Lambda(1) + a_1\Lambda(x) + \ldots + a_n\Lambda(x^n)$, which is what you stated. (2011-07-12)
  • Sorry, the last two comments were for statements (1) and (2) respectively. (2011-07-12)
  • @DLim Indeed. However, I should note that one does not need to compute the Wronskian to prove that the tuple $(1,x,x^2,\dots)$ is linearly independent. The easiest (and most natural) proof uses the fact that a non-zero polynomial with complex coefficients can have only finitely many complex roots. (2011-07-12)
  • @Amitesh Datta I think I get the converse of exercise (2): the functional $\Lambda$ is a mapping from the vector space to $\mathbb{C}$. So since every functional is completely determined by the $\Lambda(x^i)$ (which are complex numbers), the result follows immediately. (2011-07-12)
  • @DLim: Yes, exactly. (2011-07-12)
  • @Amitesh Datta I've typed out the proof of exercise 3 in an answer below. It is too long to be left as a comment. (2011-07-13)
  • @DLim I have commented below regarding the proof that you have typed up. (2011-07-13)