
I'm having difficulties with this exercise (from Elon LIMA's Curso de Análise, Vol. 2):

$f:U\longrightarrow\mathbb{R}$ is a function, differentiable on the open set $U\subset\mathbb{R}^n$. Let $\{v_1,\dots,v_n\}$ be an arbitrary basis of $\mathbb{R}^n$ and $g^{ij}:=\left\langle v_i,v_j\right\rangle$. Show that $\textrm{grad}f(x)$ in this basis is $$\textrm{grad} f(x) = \sum_i\Big(\sum_j g^{ij}\frac{\partial f}{\partial v_j}\Big)v_i,$$ where $\frac{\partial f}{\partial v_j}$ is the directional derivative of $f$ along the vector $v_j$.

I tried starting from the RHS, using $\frac{\partial f}{\partial v_j}=\left\langle\textrm{grad}f(x),v_j\right\rangle$, and aiming for $\sum_i \left\langle\textrm{grad}f(x),e_i\right\rangle e_i$, which is the "canonical" (see Note below) expression for the gradient; writing each $v_i$ as $v_i=\sum_k \left\langle v_i,e_k\right\rangle e_k$, I come down to a sum of the form $\sum_{i,j}\left\langle\textrm{grad}f(x),v_j\right\rangle g^{ji}\left\langle v_i,e_k\right\rangle$, which I don't recognize and which doesn't seem to simplify.

Am I approaching this incorrectly? How should this problem be tackled? I'm obviously missing or misinterpreting something.

NOTE: In the textbook, $\textrm{grad}f(x)$ is defined as $\sum_i \frac{\partial f}{\partial x_i}(x)e_i$, where $\frac{\partial f}{\partial x_i}(x)$ is the usual $i$-th partial derivative at the point $x$.
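As a sanity check (my own numerical sketch, not part of the exercise), one can verify the formula for a concrete $f$ and a non-orthonormal basis when $g^{ij}$ is taken to mean the entries of the *inverse* of the Gram matrix $(\langle v_i,v_j\rangle)$, which is the usual convention; the function and basis below are made up for illustration:

```python
import numpy as np

# f(x, y) = x^2 + 3xy, so grad f(x, y) = (2x + 3y, 3x)
def grad_f(p):
    x, y = p
    return np.array([2*x + 3*y, 3*x])

p = np.array([1.0, 2.0])
V = np.array([[1.0, 1.0],   # v_1
              [0.0, 2.0]])  # v_2 (rows are the basis vectors)

G = V @ V.T                 # Gram matrix g_{ij} = <v_i, v_j>
G_inv = np.linalg.inv(G)    # g^{ij}: entries of the inverse Gram matrix

# directional derivatives df/dv_j = <grad f(p), v_j>
dv = V @ grad_f(p)

# sum_i (sum_j g^{ij} df/dv_j) v_i
reconstructed = (G_inv @ dv) @ V

print(reconstructed)        # matches grad_f(p) = [8., 3.]
```

With $g^{ij}=\langle v_i,v_j\rangle$ itself (no inverse), the reconstruction fails unless the basis is orthonormal.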

  • Let $w$ be the graduate vector (clearly this equation has nothing to do with derivatives; it's just linear algebra). You want to show that $w=\sum_i v_i \sum_j \langle v_i,v_j\rangle \langle w, v_j\rangle$. Can this be true? No, because if all the vectors $v_i$ are multiplied by the same constant $\lambda$, the right-hand side gets multiplied by $\lambda^4$. Hence your expression for $g^{ij}$ is wrong; it is probably missing a matrix inverse. (2012-07-14)
  • I meant the gradient vector, of course (too late to edit). The graduate committee work is getting to me. :( (2012-07-14)
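The scaling argument in the comment above is easy to check numerically (a sketch of my own, with an arbitrary $w$ and basis chosen for illustration): with $g^{ij}=\langle v_i,v_j\rangle$ and no inverse, the right-hand side scales like $\lambda^4$ when every basis vector is multiplied by $\lambda$, so it cannot equal $w$ for every basis.

```python
import numpy as np

w = np.array([8.0, 3.0])
V = np.array([[1.0, 1.0],
              [0.0, 2.0]])    # rows are v_1, v_2

def rhs(V, w):
    G = V @ V.T               # <v_i, v_j>, deliberately NOT inverted
    return (G @ (V @ w)) @ V  # sum_i (sum_j <v_i,v_j> <w,v_j>) v_i

lam = 2.0
print(rhs(lam * V, w) / rhs(V, w))  # each component equals lam**4 = 16
```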

1 Answer