8

I am self-studying Hoffman and Kunze's book Linear Algebra. This is Exercise 3 from page 111.

Let $S$ be a set, $\mathbb{F}$ a field, and $V(S,\mathbb{F})$ the space of all functions from $S$ into $\mathbb{F}:$ $(f+g)(x)=f(x)+g(x)\hspace{0.5cm}(\alpha f)(x)=\alpha f(x).$ Let $W$ be any $n$-dimensional subspace of $V(S,\mathbb F)$. Show that there exist points $x_{1},\ldots,x_{n}\in S$ and functions $f_{1},\ldots, f_{n}\in W$ such that $f_{i}(x_{j})=\delta_{ij}$.

Since $W$ is an $n$-dimensional subspace of $V(S,\mathbb{F})$, we can find a basis $\mathcal{B}=\{f_{1},\ldots, f_{n}\}$ of $W$. But I got stuck here and don't know how to proceed. That is, how should I go about finding points $x_{1},\ldots,x_{n}\in S$ such that $f_{i}(x_{j})=\delta_{ij}$?

PS: This exercise is in the section about the double dual.

  • 1
    I had searched this site several times for a solution to this problem but never found your post, because it doesn't contain enough keywords. You should identify the chapter and section, not just the page number: this is Hoffman and Kunze, Chapter 3, Section 6, Exercise 3, a.k.a. Problem 3.6.3. Hopefully having that in my comment will help the next person find it. (2017-11-14)

6 Answers

3

For each $x$ in $S$ define $e_x\in W^*$ by $e_x(f):=f(x)$.

Claim: the subspace $W'$ of $W^*$ generated by the $e_x$ is equal to $W^*$.

Proof of the claim: Let $W'^\perp$ be the orthogonal (annihilator) of $W'$ in $W$, that is, the set of $f\in W$ with $\varphi(f)=0$ for all $\varphi\in W'$. By biduality, it suffices to show $W'^\perp=0$, which is clear: if $e_x(f)=f(x)=0$ for every $x\in S$, then $f$ is the zero function.

Let $x_1,\dots,x_n$ be elements of $S$ such that the $e_{x_i}$ form a basis $B$ of $W^*$. The basis dual to $B$ is a basis of $W^{**}$, but, again by biduality, it can be viewed as a basis $f_1,\dots,f_n$ of $W$.
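Spelling out the last step (this is just the canonical isomorphism $f\mapsto L_f$ of $W$ onto $W^{**}$, where $L_f(\varphi)=\varphi(f)$): if $f_1,\dots,f_n\in W$ are chosen so that $L_{f_1},\dots,L_{f_n}$ is the basis of $W^{**}$ dual to $B$, then

$\delta_{ij}=L_{f_i}(e_{x_j})=e_{x_j}(f_i)=f_i(x_j),$

which is exactly the required condition.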

7

You can prove this by induction on $n$. Here's a sketch:

Base Case: $n = 0$ is vacuously true. For $n = 1$: a one-dimensional subspace of $\mathbb{F}^S$ consists of the scalar multiples of a single function $f: S \rightarrow \mathbb{F}$ that is not identically zero. So there exists some $x_1 \in S$ with $f(x_1) \neq 0$, and then $f_1 := f(x_1)^{-1} f$ satisfies $f_1(x_1) = 1$.

Inductive Step: Suppose that the result holds for any $n$-dimensional subspace $W = \langle f_1,\ldots,f_n \rangle$, and now suppose that we add to $W$ one more linearly independent function $g$. By induction there is a subset $S_n = \{x_1,\ldots,x_n\}$ of $S$ such that the elements of $W$, when restricted to functions on $S_n$, give all possible functions on $S_n$. Therefore there is some linear combination of the $f_i$'s which induces the same function on $S_n$ as $g$ does, i.e., there are scalars $\alpha_1,\ldots,\alpha_n$ such that $(g - \sum_{i=1}^n \alpha_i f_i)(x_j) = 0$ for all $1 \leq j \leq n$. But since $g \notin W$, $(g - \sum_{i=1}^n \alpha_i f_i)$ is not the zero function. Can you complete the argument from here?

By the way, I agree that double duality is also relevant. But I think the above approach is more "hands on" -- after proving it this way, one can think about what it means in terms of double dual spaces.

  • 0
    Then there is $x_{n+1}\in S$ (necessarily with $x_{n+1}\notin S_{n}$) such that $(g-\sum_{i=1}^{n}\alpha_{i}f_{i})(x_{n+1})=\alpha\neq 0$. Then we put $f_{n+1}=\dfrac{1}{\alpha}\left(g-\sum_{i=1}^{n}\alpha_{i}f_{i}\right)$, so that $f_{n+1}(x_{n+1})=1$ and $f_{n+1}(x_j)=0$ for $j\leq n$; finally, assuming by induction that $f_{i}(x_{j})=\delta_{ij}$ for $i,j\leq n$, replace each $f_i$ by $f_i-f_i(x_{n+1})f_{n+1}$, so that $f_1,\ldots,f_{n+1}$ and $x_1,\ldots,x_{n+1}$ have the required property. (2012-03-18)
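Seen computationally, this induction is an algorithm. Here is a minimal NumPy sketch for the case where $S$ is finite, assuming the basis of $W$ is stored as the rows of an array `F` with one column per point of $S$ (the function name `dual_points_and_functions` and this representation are just illustrative choices):

```python
import numpy as np

def dual_points_and_functions(F):
    """Rows of F: a basis f_1,...,f_n of W; columns: the points of a finite S.
    Returns indices x_1,...,x_n and an array G whose rows g_i satisfy
    g_i(x_j) = delta_ij, following the inductive construction sketched above."""
    G, points = [], []              # functions and points built so far
    for f in F:
        r = f.astype(float)
        # subtract the combination of the g's that matches f on the chosen
        # points, so that r vanishes at x_1, ..., x_k
        for g, x in zip(G, points):
            r -= r[x] * g
        # r is not identically zero since f is independent of the earlier rows;
        # pick a point where it is nonzero (automatically a new point), rescale
        x_new = int(np.argmax(np.abs(r)))
        r /= r[x_new]
        # fix up the earlier functions so that they vanish at the new point
        G = [g - g[x_new] * r for g in G]
        G.append(r)
        points.append(x_new)
    return points, np.array(G)
```

For any full-row-rank `F`, the returned `G` satisfies `np.allclose(G[:, points], np.eye(len(F)))`, i.e. $g_i(x_j)=\delta_{ij}$.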
6

For your basis $\mathcal B$, consider for each $x\in S$ the vector $\bigl(f_1(x),\dots,f_n(x)\bigr)\in\mathbb F^n$. Some $n$ of these vectors are linearly independent.

If $S$ is finite, this follows directly: the matrix whose rows are these vectors has rank $n$, since its columns are just the functions $f_i$ listed by their values on $S$, and $\mathcal B$ is a basis. It also holds for infinite $S$: if not, some at most $n-1$ of these vectors would span all of them. The matrix formed by those vectors has rank at most $n-1$, so its columns satisfy a nontrivial linear relation; that is, some nontrivial linear combination of the $f_i$ vanishes at the corresponding points. But the vector attached to every other point of $S$ lies in the span of those vectors, so the same linear combination of the $f_i$ vanishes at every point of $S$, contradicting the fact that $\mathcal B$ is a basis.

So we have $n$ points $x_1,\dotsc,x_n$ such that the corresponding vectors $\bigl(f_1(x_j),\dots,f_n(x_j)\bigr)$ are linearly independent. But then the entries $A_{ij}=f_i(x_j)$ form an invertible matrix, and since

$\delta_{ij}=\sum_kA^{-1}_{ik}A_{kj}=\sum_kA^{-1}_{ik}f_k(x_j)=\left(\sum_kA^{-1}_{ik}f_k\right)(x_j)$

the points $x_1,\dots,x_n\in S$ and the functions $\sum_kA^{-1}_{ik}f_k\in W$ have the desired property.
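For finite $S$ this construction can be carried out numerically. Here is a minimal NumPy sketch, assuming the basis is stored as the rows of an array `F` with one column per point of $S$ (the helper name `delta_functions` is just an illustrative choice):

```python
import numpy as np

def delta_functions(F):
    """Rows of F: a basis f_1,...,f_n of W; columns: the points of a finite S.
    Returns column indices x_1,...,x_n and an array G whose rows are the
    functions sum_k (A^{-1})_{ik} f_k, so that G[i, x_j] = delta_ij."""
    n = F.shape[0]
    points = []
    # greedily pick columns that keep the chosen columns linearly independent;
    # the argument above shows that n such columns exist
    for j in range(F.shape[1]):
        if np.linalg.matrix_rank(F[:, points + [j]]) > len(points):
            points.append(j)
        if len(points) == n:
            break
    A = F[:, points]             # A[i, j] = f_i(x_j), an invertible n-by-n matrix
    G = np.linalg.inv(A) @ F     # row i of G is sum_k (A^{-1})_{ik} f_k
    return points, G
```

Then `G[:, points]` is (up to rounding) the identity matrix; the rows of `G` are the functions $\sum_kA^{-1}_{ik}f_k$ evaluated on all of $S$.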

  • 0
    @Majid: Thanks for letting me know :-) (2018-07-03)
4

The solutions posted above can't be what H&K had in mind since they do not use the double dual. I came up with the following solution.

Let $s\in S$. We first show that the function \begin{alignat*}{1} \phi_s:&W\rightarrow F\\ &w\mapsto w(s) \end{alignat*} is a linear functional on $W$ (in other words for each $s$, we have $\phi_s\in W^*$).

Let $w_1,w_2\in W$ and $c\in F$. Then $\phi_s(cw_1+w_2)=(cw_1+w_2)(s)$, which by definition equals $cw_1(s)+w_2(s)$, which in turn equals $c\phi_s(w_1)+\phi_s(w_2)$. Thus $\phi_s$ is a linear functional on $W$.

Suppose $\phi_s(w)=0$ for all $s\in S$, $w\in W$. Then $w(s)=0$ $\forall$ $s\in S$, $w\in W$, which implies $\dim(W)=0$. So as long as $n>0$, $\exists$ $s_1\in S$ such that $\phi_{s_1}(w)\not=0$ for some $w\in W$. Equivalently there is an $s_1\in S$ and a $w_1\in W$ such that $w_1(s_1)\not=0$. This means $\phi_{s_1}\not=0$ as elements of $W^*$. It follows that $\langle\phi_{s_1}\rangle$, the subspace of $W^*$ generated by $\phi_{s_1}$, has dimension one. By scaling if necessary, we can further assume $w_1(s_1)=1$.

Now suppose that $\forall$ $s\in S$ we have $\phi_s\in\langle\phi_{s_1}\rangle$, the subspace of $W^*$ generated by $\phi_{s_1}$. Then for each $s\in S$ there is a $c(s)\in F$ such that $\phi_s=c(s)\phi_{s_1}$ in $W^*$, and hence $w(s)=c(s)w(s_1)$ for all $w\in W$. In particular $w_1(s)=c(s)$ (recall $w_1(s_1)=1$). Now let $w\in W$ and set $b=w(s_1)$. Then $w(s)=c(s)w(s_1)=bw_1(s)$ $\forall$ $s\in S$; notice that $b$ depends on $w$ but does not depend on $s$. Thus $w=bw_1$ as functions on $S$, so $w\in\langle w_1\rangle$, the subspace of $W$ generated by $w_1$. Since $w$ was arbitrary, it follows that $\dim(W)=1$.

Thus, as long as $\dim(W)\geq 2$, we can find $w_2\in W$ and $s_2\in S$ such that $\langle w_1,w_2\rangle$ (the subspace of $W$ generated by $w_1,w_2$) and $\langle\phi_{s_1},\phi_{s_2}\rangle$ (the subspace of $W^*$ generated by $\{\phi_{s_1},\phi_{s_2}\}$) both have dimension two. Let $W_0=\langle w_1,w_2\rangle$. Then we've shown that $\{\phi_{s_1},\phi_{s_2}\}$ is a basis for $W_0^*$, so there is a dual basis $\{F_1,F_2\}\subseteq W_0^{**}$ with $F_i(\phi_{s_j})=\delta_{ij}$ for $i,j\in\{1,2\}$. By Theorem 17 there are corresponding elements of $W_0\subseteq W$, which we again call $w_1,w_2$, such that $F_i=L_{w_i}$ (in the notation of Theorem 17). Therefore $\delta_{ij}=F_i(\phi_{s_j})=L_{w_i}(\phi_{s_j})=\phi_{s_j}(w_i)=w_i(s_j)$ for $i,j\in\{1,2\}$.

Now suppose that $\forall$ $s\in S$ we have $\phi_s\in\langle\phi_{s_1},\phi_{s_2}\rangle\subseteq W^{*}$. Then for every $s\in S$ there are constants $c_1(s),c_2(s)\in F$ such that $w(s)=c_1(s)w(s_1)+c_2(s)w(s_2)$ for all $w\in W$. Similarly to the argument in the previous paragraph, this implies $\dim(W)\leq 2$ (for $w\in W$ let $b_1=w(s_1)$ and $b_2=w(s_2)$ and argue as before). Therefore, as long as $\dim(W)\geq3$, we can find $s_3$ so that $\langle\phi_{s_1},\phi_{s_2},\phi_{s_3}\rangle\subseteq W^*$, the subspace of $W^*$ generated by $\phi_{s_1},\phi_{s_2},\phi_{s_3}$, has dimension three. And as before we can find $w_1,w_2,w_3\in W$ such that $w_i(s_j)=\delta_{ij}$ for $i,j\in\{1,2,3\}$.

Continuing in this way we can find $n$ elements $s_1,\dots,s_n\in S$ such that $\phi_{s_1},\dots,\phi_{s_n}$ are linearly independent in $W^{*}$ and corresponding elements $w_1,\dots,w_n\in W$ such that $w_i(s_j)=\delta_{ij}$. Let $f_i=w_i$ and we are done.
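One way to package the step being iterated here (a slight streamlining of the argument above): once points $s_1,\dots,s_k\in S$ with $\phi_{s_1},\dots,\phi_{s_k}$ linearly independent in $W^*$ have been found, extend these functionals to a basis of $W^*$, take the dual basis of $W^{**}$, and use Theorem 17 to write its members corresponding to $\phi_{s_1},\dots,\phi_{s_k}$ as $L_{w_1},\dots,L_{w_k}$ with $w_1,\dots,w_k\in W$; then

$w_i(s_j)=\phi_{s_j}(w_i)=L_{w_i}(\phi_{s_j})=\delta_{ij}\hspace{0.5cm}(i,j\in\{1,\dots,k\}).$

Applied with $k=n$ (no extension is then needed), this gives the desired $f_i=w_i$ in one step.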