
As part of determining the expression for the gradient in terms of an arbitrary inner product, I arrived at the following problem: Given:

  • $y = (y^1, \dots, y^n) \in \mathbb{R}^n$ is a selected point
  • $h = (h^1, \dots, h^n) \in \mathbb{R}^n$ is an arbitrary point.
  • $f$ is a real-valued function defined on $\mathbb{R}^n$ and differentiable at a point $x_0 \in \mathbb{R}^n$
  • $[g_{ij}]$ is a positive definite, symmetric $n \times n$ matrix

I'm trying to show that

$$ \sum\limits_{j=1}^n \partial_{j}f(x_0)h^j = \sum\limits_{j,k=1}^n g_{jk}y^j h^k $$

implies

$ \partial_{j} f(x_0) = \sum\limits_{k=1}^n g_{jk}y^k $

The only way I can reasonably see to arrive at the conclusion is to reason as follows. Since the antecedent holds for every $h \in \mathbb{R}^n$, it must hold in particular for the standard basis vectors $e_1, \dots, e_n$. For example, taking $h = e_1 = (1, 0, \dots, 0)$, the left-hand side reduces to $\partial_{1}f(x_0)$ and the right-hand side to $\sum\limits_{j=1}^n g_{j1}y^j = \sum\limits_{k=1}^n g_{1k}y^k$ (using the symmetry of $[g_{ij}]$), so $\partial_{1}f(x_0) = \sum\limits_{k=1}^n g_{1k}y^k$. Taking $h = e_j$ in the same way yields the conclusion above for each $j$.
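A quick numerical sketch of this basis-vector argument (with a hypothetical choice of $G$ and $y$; names like `L` and `coeffs` are mine): evaluating the right-hand side $\sum_{j,k} g_{jk}y^j h^k$ at $h = e_1, \dots, e_n$ recovers exactly the coefficients $\sum_j g_{jm}y^j = \sum_k g_{mk}y^k$, which is what the partials must equal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Hypothetical positive-definite symmetric G: A A^T plus a multiple of I
A = rng.standard_normal((n, n))
G = A @ A.T + n * np.eye(n)
y = rng.standard_normal(n)

# The right-hand side as a linear form in h: L(h) = sum_{j,k} g_{jk} y^j h^k
def L(h):
    return y @ G @ h

# Evaluating L at the standard basis vectors recovers its coefficients,
# which by symmetry of G are the components of G y
coeffs = np.array([L(e) for e in np.eye(n)])
assert np.allclose(coeffs, G @ y)
```

This is only a finite-dimensional sanity check of the identity, not a proof; the proof is the symmetry step spelled out above.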

So, my questions:

  • Is this line of reasoning correct?
  • Is there a better or more direct way to demonstrate the conclusion?

EDIT

I'm updating this post to provide additional contextual information.

Given that $[g_{ij}]$ is a positive-definite and symmetric $n \times n$ matrix, it can be shown that the function

$ ( \cdot \,|\, \cdot )^g : \mathbb{R}^n \times \mathbb{R}^n \rightarrow \mathbb{R} $

given by

$ (x | y)^g = \sum\limits_{j,k=1}^n g_{jk}y^j x^k $

is a scalar product on $\mathbb{R}^n$. Now, let $f$ be a function on $\mathbb{R}^n$ that is differentiable at $x_0$. Since $df(x_0)$ is a continuous linear form, the Riesz representation theorem (for finite-dimensional Hilbert spaces) gives a unique vector $y$ such that $df(x_0)h = (y \,|\, h)^g$ for all $h \in \mathbb{R}^n$.

This unique vector $y$ is defined to be the gradient of $f$ at $x_0$ with respect to the scalar product $( \cdot \,|\, \cdot )^g$ and is denoted $y = \nabla^g f(x_0)$. I am working through the details of the proof that

$ \nabla^g f(x_0) = (g^{1k}\partial_{k}f(x_0), \dots, g^{nk}\partial_{k}f(x_0)) $

where repeated upper/lower indices indicate summation from $1$ to $n$, and $g^{ij}$ denotes the $(i,j)$ entry of the inverse of the matrix $[g_{ij}]$.
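The formula above can be checked numerically (a minimal sketch with a hypothetical $G$ and a hypothetical smooth $f$ of my choosing): compute the Euclidean partials $\partial_k f(x_0)$ by central differences, form the claimed $g$-gradient $G^{-1}\nabla f(x_0)$, and verify the defining Riesz identity $df(x_0)h = (\nabla^g f(x_0) \,|\, h)^g$ on random $h$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical positive-definite symmetric G
A = rng.standard_normal((n, n))
G = A @ A.T + n * np.eye(n)

# A hypothetical smooth test function f on R^n
def f(x):
    return np.sin(x[0]) + x[1] ** 2 * x[2] + np.exp(x[3])

x0 = rng.standard_normal(n)

# Euclidean partials d_k f(x0) via central differences
eps = 1e-6
grad = np.array([
    (f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
    for e in np.eye(n)
])

# Claimed g-gradient: components g^{jk} d_k f(x0), i.e. G^{-1} grad
grad_g = np.linalg.solve(G, grad)

# Check df(x0) h = (grad_g | h)^g = sum_{j,k} g_{jk} grad_g^j h^k
for _ in range(5):
    h = rng.standard_normal(n)
    lhs = grad @ h          # df(x0) h
    rhs = grad_g @ G @ h    # (grad_g | h)^g
    assert abs(lhs - rhs) < 1e-8
```

The check succeeds by construction, since $(G^{-1}\nabla f)^{\top} G = \nabla f^{\top}$ when $G$ is symmetric; it only confirms the index bookkeeping, not the Riesz uniqueness argument.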


1 Answer


If you write ${\rm grad}f\cdot h^{\top}=yG\cdot h^{\top}$ for all $h$, then $({\rm grad}f-yG)\cdot h^{\top}=0$ for all $h$; taking $h = {\rm grad}f-yG$ shows this vector has zero norm,

so ${\rm grad }f=yG$, which is exactly what you are claiming.

The properties of $G$ (symmetry, positive definiteness) make no difference here.