
I'm having difficulties with this exercise (from Elon Lima's Curso de Análise, Vol. 2):

$f:U\longrightarrow\mathbb{R}$ is a function, differentiable on the open set $U\subset\mathbb{R}^n$. Let $\{v_1,...,v_n\}$ be an arbitrary basis of $\mathbb{R}^n$ and $g^{ij}:=\left\langle v_i,v_j\right\rangle$. Show that $\textrm{grad}\, f(x)$ in this basis is $\textrm{grad}\, f(x) = \sum_i\left(\sum_j g^{ij}\frac{\partial f}{\partial v_j}\right)v_i$, where $\frac{\partial f}{\partial v_j}$ is the directional derivative of $f$ along the vector $v_j$.
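(A quick sanity check, not part of the exercise as stated: if $\{v_i\}$ is the canonical basis $\{e_i\}$, then $g^{ij}=\delta_{ij}$, and the claimed formula reduces to
$$\textrm{grad}\, f(x) = \sum_i \frac{\partial f}{\partial x_i}(x)\, e_i,$$
the canonical expression from the Note below, so at least the orthonormal case works out.)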

I tried starting with the RHS, using $\frac{\partial f}{\partial v_j}=\left\langle\textrm{grad}\, f(x),v_j\right\rangle$, and aiming for $\sum_i \left\langle\textrm{grad}\, f(x),e_i\right\rangle e_i$, which is the "canonical" (see Note below) expression for the gradient; writing each $v_i$ as $v_i=\sum_j \left\langle v_i,e_j\right\rangle$ I come down to a sum of the form $\sum_{i,j}\left\langle\textrm{grad}\, f(x),v_j\right\rangle g^{ji}\left\langle v_i,e_j\right\rangle$, which I don't recognize and which doesn't seem to simplify.

Am I approaching it incorrectly? How should this problem be tackled? I'm obviously missing or misinterpreting something.

NOTE: In the textbook, $\textrm{grad}\, f(x)$ is defined as $\sum_i \frac{\partial f}{\partial x_i}(x)\, e_i$, where $\frac{\partial f}{\partial x_i}(x)$ is the usual $i$-th partial derivative at the point $x$.

  • I meant the gradient vector, of course (too late to edit). The graduate committee work is getting to me. :( (2012-07-14)

1 Answer


Your definition of $g^{ij}$ is incorrect. We have that $g_{ij} = \langle v_i, v_j \rangle$, and then $g^{ij}$ is the $(i,j)$-entry of the inverse matrix $(g_{ij})^{-1}$. With this correction, your approach is fine, except you should have $v_i = \sum_{j = 1}^n \langle v_i, e_j \rangle e_j$ (you didn't have the last $e_j$; perhaps it was a typo). Starting with the right-hand side and writing $(v_i)_k := \langle v_i, e_k \rangle$ for the $k$-th canonical coordinate of $v_i$, the computation goes like this:
\begin{align*}
\sum_{i, j = 1}^n g^{ij} \frac{\partial f}{\partial v_j} v_i
& = \sum_{i, j = 1}^n g^{ij} \langle \operatorname{grad} f, v_j \rangle v_i \\
& = \sum_{i, j = 1}^n g^{ij} \left\langle \operatorname{grad} f, \sum_{k = 1}^n (v_j)_k e_k \right\rangle \sum_{l = 1}^n (v_i)_l e_l \\
& = \sum_{k, l = 1}^n \left( \sum_{i, j = 1}^n (v_i)_l \, g^{ij} \, (v_j)_k \right) \langle \operatorname{grad} f, e_k \rangle e_l.
\end{align*}
Now let $A$ be the $n \times n$ matrix whose $i$-th column is $v_i$, so $A_{li} = (v_i)_l$; it is invertible because $\{v_1, \dots, v_n\}$ is a basis. Then $g_{ij} = \langle v_i, v_j \rangle = \sum_k (v_i)_k (v_j)_k = (A^T A)_{ij}$, hence $(g^{ij}) = (A^T A)^{-1}$ and
$$\sum_{i, j = 1}^n (v_i)_l \, g^{ij} \, (v_j)_k = \left( A (A^T A)^{-1} A^T \right)_{lk} = \left( A A^{-1} (A^T)^{-1} A^T \right)_{lk} = \delta_{lk}.$$
Therefore the sum collapses to
$$\sum_{k = 1}^n \langle \operatorname{grad} f, e_k \rangle e_k = \operatorname{grad} f.$$
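For a concrete check (an example of my own, not from Lima's book): take $n = 2$, $v_1 = e_1$, $v_2 = e_1 + e_2$, and write $f_x, f_y$ for the canonical partials. Then
$$(g_{ij}) = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}, \qquad (g^{ij}) = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix}, \qquad \frac{\partial f}{\partial v_1} = f_x, \quad \frac{\partial f}{\partial v_2} = f_x + f_y,$$
and the formula gives
$$\sum_{i,j} g^{ij} \frac{\partial f}{\partial v_j} v_i = \big(2 f_x - (f_x + f_y)\big) v_1 + \big(-f_x + (f_x + f_y)\big) v_2 = (f_x - f_y)\, e_1 + f_y\, (e_1 + e_2) = f_x\, e_1 + f_y\, e_2,$$
which is indeed $\operatorname{grad} f$.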