
Question: Let $V\ $ be the vector space of the polynomials over $\mathbf{R}$ of degree less than or equal to 3, with the inner product $$ (f|g) = \int_0^1 f(t)g(t) dt. $$ If $t$ is a real number, find the polynomial $g_t$ in $V$ such that $(f|g_t) = f(t)$ for all $f$ in $V$.

My Attempt: The way I thought to do it was to write $f(x) = a_0 + a_1x + a_2x^2 + a_3x^3$ and $g_t(x) = b_0 + b_1x + b_2x^2 + b_3x^3$. Since $\int_0^1 x^{j+k}\,dx = \frac{1}{1+j+k}$, $$(f|g_t) = \sum_{j, k} \frac{1}{1 + j + k}\, a_j b_k. $$

Since $(f|g_t) = f(t)$ must hold for every choice of the $a_j$, comparing the coefficients of $a_j$ on both sides gives $$t^j = \sum_k \frac{1}{1 + j + k}\,b_k.$$

Let $A$ be the matrix $A_{kj} = \frac{1}{1 + j + k}$, so $$ (b_0, b_1, b_2, b_3)A = (1, t, t^2, t^3) $$

Thus $$(b_0, b_1, b_2, b_3) = (1, t, t^2, t^3)A^{-1}.$$

I can compute $A^{-1}$, and that would give me the answer, I think, but it seems like a lot of work, and I would not be using any of the material from the chapter to solve it. I am assuming there is a much easier way to do this.
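For reference, this matrix computation is easy to carry out symbolically. Here is a minimal sketch using sympy (the variable names are just for illustration); it builds the Gram matrix of the monomials, which is the $4\times4$ Hilbert matrix, and solves for the $b_k$ as polynomials in $t$:

```python
from sympy import Rational, Matrix, symbols, expand

t, x = symbols('t x')

# Gram matrix of the monomial basis {1, x, x^2, x^3} on [0, 1]:
# A[j, k] = integral_0^1 x^(j+k) dx = 1/(j + k + 1)  (the 4x4 Hilbert matrix)
A = Matrix(4, 4, lambda j, k: Rational(1, j + k + 1))

# Solve A b = (1, t, t^2, t^3)^T for the coefficients b_k of g_t
b = A.solve(Matrix([1, t, t**2, t**3]))

g_t = expand(sum(b[k] * x**k for k in range(4)))
print(g_t)
```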

The chapter is called "Linear Functionals and Adjoints" from Linear Algebra by Hoffman and Kunze.

EDIT: I think the way the chapter wanted me to do this was the following.

Find an orthonormal basis using Gram-Schmidt, say $f_1, f_2, f_3, f_4$. Then let $L_t(f) = f(t)$.

We can then let $$g_t = L_t(f_1)f_1 + L_t(f_2)f_2 + L_t(f_3)f_3 + L_t(f_4)f_4.$$

Then say $f = a_1f_1 + a_2f_2 + a_3f_3 + a_4f_4$.

$$ \begin{align*} (f| g_t) &= a_1L_t(f_1)(f_1| f_1) + a_2L_t(f_2)(f_2| f_2) + a_3L_t(f_3)(f_3| f_3) + a_4L_t(f_4)(f_4| f_4) \\ &= L_t(a_1f_1 + a_2f_2 + a_3f_3 + a_4f_4) = L_t(f) = f(t). \end{align*}$$ Here the cross terms $(f_i|f_j)$ with $i \ne j$ vanish by orthogonality, each $(f_i|f_i) = 1$, and the last step uses the linearity of $L_t$.

The computation is still more than I want to do, but the ideas are all there. I guess this exercise was focused more on the linear-functional part of the chapter than on the adjoint part.

  • The trick is that you (hopefully) know the determinant of a Cauchy matrix. Not only the matrix $A$ itself, but also each of its minors is a Cauchy matrix, so computing the classical adjoint (adjugate) of $A$ is easy using the formula for the determinant of a Cauchy matrix. Once you have the adjugate of $A$, you get the inverse $A^{-1}$ by dividing by the determinant of $A$ (which, as I said, is the determinant of a Cauchy matrix as well). This simplifies the computation of $A^{-1}$ a lot. Of course, nobody forces you to use the adjugate (unless you do the degree-$n$ generalization!). (2011-12-03)
  • I'll read up on Cauchy matrices; I have never heard of them before. (2011-12-03)
  • More specifically, what you have is a special case of a Cauchy matrix, called the Hilbert matrix. (2011-12-04)
  • Yes, but the minors won't be Hilbert matrices (in general). (2011-12-04)
  • Thanks, I found another way to go about the problem. I think my mistake was focusing too much on the chapter title about adjoints. I am sure I could go about it the Cauchy-matrix way, but I don't really understand it yet. (2011-12-04)
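
For reference, the determinant formula the Cauchy-matrix comment relies on: if $C$ is the $n\times n$ matrix with entries $C_{ij} = \frac{1}{x_i + y_j}$, then $$\det C = \frac{\prod_{1\le i<j\le n}(x_j - x_i)(y_j - y_i)}{\prod_{i=1}^n\prod_{j=1}^n (x_i + y_j)}.$$ The matrix $A$ above is the case $x_i = i$, $y_j = j-1$ (with $i, j$ running from $1$ to $4$), i.e. the $4\times4$ Hilbert matrix, and every minor of $A$ is again of this form, so $\det A$ and all the cofactors have closed forms.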

3 Answers


This is a classic application of the Riesz Representation Theorem in a finite dimensional setting. For clarity, let's restate the theorem in this context. (Google for more general versions.)

Riesz Representation Theorem: Let $V$ be a finite dimensional vector space over $\mathbb{R}$ and $\langle\cdot,\cdot\rangle$ be an inner product on $V$. Then for every linear functional $\ell:V\to\mathbb{R}$, there is a unique $g_\ell\in V$ such that $\ell(f)=\langle f,g_\ell\rangle$ for all $f\in V$.

In other words, under certain assumptions, every linear functional can be "represented" (uniquely) as an inner product of the input against some "special" (but fixed) member of $V$.

In your problem, $V=\mathbb{P}_3$, the inner product is $\langle f,g\rangle=\int_0^1 f(t)g(t)\,dt$, and you are looking for the Riesz "representer" $g_\ell$ for the so-called evaluation functional given by $\ell(f):=f(t_0)$, where $t_0\in\mathbb{R}$ is arbitrary but fixed.

So how do we determine the unique Riesz representer $g_\ell$? To answer this, let $\{e_1,\dots,e_n\}$ be an orthonormal basis for $V$. (For example, pick your favorite basis for $V$, then Gram-Schmidt it to obtain an orthonormal basis.)

Claim: $g_\ell=\sum_{i=1}^n \ell(e_i)e_i$ is the (unique) Riesz representer for the linear functional $\ell$, i.e., $\ell(f)=\langle f,g_\ell \rangle$ for all $f\in V$.

To verify the claim, let $f\in V$. Then we can write $f=\sum_{i=1}^n c_ie_i$ and $$\langle f,g_\ell\rangle = \left\langle \sum_{i=1}^n c_i e_i,\sum_{i=1}^n \ell(e_i)e_i\right\rangle= \sum_{i=1}^n c_i \ell(e_i)\langle e_i,e_i\rangle = \sum_{i=1}^n c_i \ell(e_i) = \ell\left(\sum_{i=1}^n c_ie_i\right) = \ell(f).$$ (I'll leave uniqueness to you.)

Again, to bring all of this back to your particular context, you will need an orthonormal basis for $\mathbb{P}_3$ (for example, the shifted Legendre polynomials) and need to recognize that the particular functional in your question is indeed the evaluation functional.
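
For concreteness, here is a small sympy sketch of this recipe (the names are just for illustration): Gram-Schmidt is applied to the monomials with respect to the given inner product, and then $g_t$ is assembled as in the Claim.

```python
from sympy import symbols, integrate, sqrt, expand

x, t = symbols('x t')

def inner(f, g):
    # The inner product from the problem: (f|g) = integral_0^1 f g dx
    return integrate(f * g, (x, 0, 1))

# Gram-Schmidt the monomial basis {1, x, x^2, x^3} into an orthonormal basis
onb = []
for p in [1, x, x**2, x**3]:
    for e in onb:
        p = p - inner(p, e) * e
    onb.append(expand(p / sqrt(inner(p, p))))

# Riesz representer of the evaluation functional L_t(f) = f(t):
#   g_t = sum_i L_t(e_i) e_i = sum_i e_i(t) e_i(x)
g_t = expand(sum(e.subs(x, t) * e for e in onb))
print(g_t)
```

The printed polynomial agrees with the explicit formula for $g_t$ worked out in the other answers.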

Hope that helps.


Your first solution method is not difficult if you solved Hoffman and Kunze's Exercise 12 of Section 1.6 because then you have a formula for the inverse matrix.

\begin{align*} f(t) =\ & (f|g_t)\\ a_0 + a_1 t + a_2 t^2 + a_3 t^3 =\ & a_0 b_0 + (a_1 b_0 + a_0 b_1)/2 + (a_2 b_0 + a_1 b_1 + a_0 b_2)/3 + (a_3 b_0 + a_2 b_1 + a_1 b_2 + a_0 b_3)/4\ +\\ &(a_3 b_1 + a_2 b_2 + a_1 b_3)/5 + (a_3 b_2 + a_2 b_3)/6 + a_3 b_3 / 7\\ =\ & a_0(b_0 / 1 + b_1 / 2 + b_2 / 3 + b_3 / 4)\ +\\ & a_1(b_0 / 2 + b_1 / 3 + b_2 / 4 + b_3 / 5)\ +\\ & a_2(b_0 / 3 + b_1 / 4 + b_2 / 5 + b_3 / 6)\ +\\ & a_3(b_0 / 4 + b_1 / 5 + b_2 / 6 + b_3 / 7).\end{align*} Equate the coefficients of the $a_k$ on each side of the equation to get a system of four equations in the four unknowns $b_k$, put that system in matrix form, and solve using the inverse of the coefficient matrix as mentioned above:\begin{align*} \begin{bmatrix} 1 & 1/2 & 1/3 & 1/4\\ 1/2 & 1/3 & 1/4 & 1/5\\ 1/3 & 1/4 & 1/5 & 1/6\\1/4 & 1/5 & 1/6 & 1/7\end{bmatrix} \begin{bmatrix} b_0\\ b_1\\ b_2\\ b_3\end{bmatrix} & = \begin{bmatrix} 1\\t\\ t^2\\ t^3\end{bmatrix}\\ \begin{bmatrix} b_0\\ b_1\\ b_2\\ b_3\end{bmatrix} & = \begin{bmatrix} 16 & -120 & 240 & -140\\ -120 & 1200 & -2700 & 1680\\ 240 & -2700 & 6480 & -4200\\ -140 & 1680 & -4200 & 2800\end{bmatrix} \begin{bmatrix} 1\\t\\ t^2\\ t^3\end{bmatrix}.\end{align*} Thus,\begin{alignat*}{6} g_t(x) =\ && 16 &\ -\ & 120t &\ +\ & 240t^2 &\ -\ & 140t^3\hphantom{\Big)} && +\\ &\Big(& -120 &\ +\ & 1200t &\ -\ & 2700t^2 &\ +\ & 1680t^3\Big) & x & +\\ &\Big(& 240 &\ -\ & 2700t &\ +\ & 6480t^2 &\ -\ & 4200t^3\Big) & x^2 &\ +\\ &\Big(& -140 &\ +\ & 1680t &\ -\ & 4200t^2 &\ +\ & 2800t^3\Big) & x^3.\end{alignat*}


The solution method described in your edit does not involve too much work if you solved Hoffman and Kunze's Exercise 9 of Section 8.2, because then you already have an orthogonal basis, which you only need to normalize to obtain an orthonormal basis. In that exercise, we start with $\{1, x, x^2, x^3\}$ and obtain the orthonormal basis $\left\{1, \sqrt3(2x - 1), \sqrt5\left(6x^2 - 6x + 1\right), \sqrt7\left(20x^3 - 30x^2 + 12x - 1\right)\right\}$.

Using the first formula in the proof of Theorem 6 of Section 8.3 or the formula in JohnD's answer, we have\begin{align*} g_t(x) =\ & 1 \cdot 1\ +\\ & \sqrt3(2t - 1) \cdot \sqrt3(2x - 1)\ +\\ & \sqrt5\left(6t^2 - 6t + 1\right) \cdot \sqrt5\left(6x^2 - 6x + 1\right)\ +\\ & \sqrt7\left(20t^3 - 30t^2 + 12t - 1\right) \cdot \sqrt7\left(20x^3 - 30x^2 + 12x - 1\right).\end{align*} Expanding and collecting terms by powers of $x$, we get\begin{alignat*}{6} g_t(x) =\ && 16 &\ -\ & 120t &\ +\ & 240t^2 &\ -\ & 140t^3\hphantom{\Big)} && +\\ &\Big(& -120 &\ +\ & 1200t &\ -\ & 2700t^2 &\ +\ & 1680t^3\Big) & x & +\\ &\Big(& 240 &\ -\ & 2700t &\ +\ & 6480t^2 &\ -\ & 4200t^3\Big) & x^2 &\ +\\ &\Big(& -140 &\ +\ & 1680t &\ -\ & 4200t^2 &\ +\ & 2800t^3\Big) & x^3.\end{alignat*}
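
As a sanity check of this closed form, here is a small sympy sketch (the symbols are my own) verifying the reproducing property $(f|g_t) = f(t)$ for a general cubic $f$:

```python
from sympy import symbols, integrate, expand

x, t, a0, a1, a2, a3 = symbols('x t a0 a1 a2 a3')

# The representer computed above
g_t = ((16 - 120*t + 240*t**2 - 140*t**3)
       + (-120 + 1200*t - 2700*t**2 + 1680*t**3) * x
       + (240 - 2700*t + 6480*t**2 - 4200*t**3) * x**2
       + (-140 + 1680*t - 4200*t**2 + 2800*t**3) * x**3)

# (f | g_t) should equal f(t) for every cubic f
f = a0 + a1*x + a2*x**2 + a3*x**3
assert expand(integrate(f * g_t, (x, 0, 1)) - f.subs(x, t)) == 0
```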