
Let $M$ be a Riemannian manifold. If a smooth function $f$ satisfies $\left| \operatorname{grad} f \right|=1$, prove that the integral curves of $\operatorname{grad} f$ are geodesics.

  • So ... what is your question? What effort have you put into it? (2012-03-13)
  • Prove that the integral curves of $\operatorname{grad} f$ are geodesics. (2012-03-13)
  • The gradient of a function is a vector. What do you mean by it equalling 1? (2012-03-13)
  • Sorry, I've edited the question. The LaTeX isn't rendering in my browser right now, so I can't check the question properly. (2012-03-13)
  • Duplicate of https://math.stackexchange.com/questions/16911/integral-curves-of-the-gradient (2018-05-10)

3 Answers

9

I'll use $\nabla$ for the gradient.

If $|\nabla f| = 1$, then $g(\nabla f,\nabla f) = 1$, where $g$ is the metric. Taking the covariant derivative of this identity along an arbitrary vector field $X$, you get

$$ 0 = X(1) = X\bigl( g(\nabla f,\nabla f)\bigr) = 2\, g(\nabla_X \nabla f, \nabla f) = 2\operatorname{Hess} f(\nabla f, X) = 2\, g(\nabla_{\nabla f} \nabla f, X). $$

The third equality uses that $\nabla g = 0$ for the Levi-Civita connection of a Riemannian metric, and the fourth uses that the Hessian of a scalar function, $\operatorname{Hess} f(X,Y) = g(\nabla_X \nabla f, Y)$, is symmetric.

Since this holds for every $X$, we get $\nabla_{\nabla f} \nabla f = 0$, i.e. the vector field $\nabla f$ is geodesic, and hence its integral curves are geodesics.
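
As a quick sanity check in the flat case (a minimal example, with the Euclidean metric on $\mathbb{R}^n$): take $f(x)=\langle x,v\rangle$ for a fixed unit vector $v$. Then
$$ \nabla f = v,\qquad |\nabla f|=1,\qquad \nabla_{\nabla f}\nabla f=\nabla_v v=0, $$
and the integral curves $\gamma(t)=p+tv$ are straight lines, i.e. geodesics of $\mathbb{R}^n$.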

3

Well, $\text{grad}(f)$ is the vector field such that $g(\text{grad}(f),-)=df$, so integral curves satisfy
$$ \gamma'=\text{grad}(f)\quad\Rightarrow\quad g(\gamma',X)=df(X)=X(f). $$
Now let $X,Y$ be vector fields. Then
$$ XYf=Xg(\text{grad}(f),Y)= g(\nabla_X\text{grad}(f),Y)+g(\text{grad}(f),\nabla_XY)= g(\nabla_X\text{grad}(f),Y)+\nabla_XY(f) $$
and
$$ YXf=Yg(\text{grad}(f),X)= g(\nabla_Y\text{grad}(f),X)+g(\text{grad}(f),\nabla_YX)= g(\nabla_Y\text{grad}(f),X)+\nabla_YX(f). $$
Subtracting and using that the torsion vanishes gives
$$ [X,Y]f-\nabla_XY(f)+\nabla_YX(f)=0=g(\nabla_X\text{grad}(f),Y)-g(\nabla_Y\text{grad}(f),X). $$
It follows that
$$ g(\nabla_X\text{grad}(f),Y)=g(\nabla_Y\text{grad}(f),X). $$
Now the easy part: substitute $X=\text{grad}(f)$ and conclude that for every $Y$
$$ g(\nabla_{\text{grad}(f)}\text{grad}(f),Y)=g(\nabla_Y\text{grad}(f),\text{grad}(f))=0. $$
The last equality holds because $g(\text{grad}(f),\text{grad}(f))=1$ is constant, so
$$ 0=Yg(\text{grad}(f),\text{grad}(f))=2g(\nabla_Y\text{grad}(f),\text{grad}(f)). $$
Since $Y$ is arbitrary, $\nabla_{\text{grad}(f)}\text{grad}(f)=0$, so the integral curves of $\text{grad}(f)$ are geodesics.
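
For illustration only (a standard example, assuming the distance function is smooth, i.e. away from $p$ and its cut locus): the distance function $r(x)=d(p,x)$ from a fixed point $p$ satisfies $|\text{grad}(r)|=1$, and
$$ \gamma_v(t)=\exp_p(tv),\qquad |v|=1, $$
the unit-speed radial geodesics from $p$, are precisely its integral curves, consistent with the Gauss lemma.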

0

Let $Y$ be any vector field. By definition
$$\operatorname{grad} f=(df)^\sharp,\qquad g(\operatorname{grad} f,Y)=df(Y).$$
Since $df$ is closed,
$$df\bigl([\operatorname{grad} f,Y]\bigr)=\operatorname{grad} f\bigl(df(Y)\bigr)-Y\bigl(df(\operatorname{grad} f)\bigr)$$

$$=\operatorname{grad} f\bigl(df(Y)\bigr)-Yg(\operatorname{grad} f,\operatorname{grad} f)$$
$$=\operatorname{grad} f\bigl(df(Y)\bigr)-Y(1)=\operatorname{grad} f\bigl(df(Y)\bigr).$$
The torsion tensor vanishes identically, hence
$$[\operatorname{grad} f,Y]=\nabla _{\operatorname{grad} f}Y-\nabla_Y\operatorname{grad} f.$$
Then
$$g(\nabla _{\operatorname{grad} f}\operatorname{grad} f,Y)=(\operatorname{grad} f)\,g(\operatorname{grad} f,Y)-g(\operatorname{grad} f,\nabla_{\operatorname{grad} f}Y)$$
$$=(\operatorname{grad} f)\bigl[(df)Y\bigr]-(df)(\nabla_{\operatorname{grad} f}Y)$$
$$=(\operatorname{grad} f)\bigl[(df)Y\bigr]-(df)\bigl([\operatorname{grad} f,Y]+\nabla_Y\operatorname{grad} f\bigr)$$
$$=(\operatorname{grad} f)\bigl[(df)Y\bigr]-(\operatorname{grad} f)\bigl[(df)Y\bigr]-(df)(\nabla_Y\operatorname{grad} f)$$
$$=-g(\operatorname{grad} f,\nabla_Y\operatorname{grad} f)=-\frac{1}{2}Y\bigl(g(\operatorname{grad} f,\operatorname{grad} f)\bigr)=0.$$
Since $Y$ was arbitrary,
$$\nabla _{\operatorname{grad} f}\operatorname{grad} f=0,$$
and the result follows.
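
As a concrete coordinate check (a small example, assuming the round metric $g=d\theta^2+\sin^2\theta\,d\varphi^2$ on the sphere minus its poles), take $f=\theta$, the colatitude. Then
$$ \operatorname{grad} f = g^{\theta\theta}\partial_\theta=\partial_\theta,\qquad |\operatorname{grad} f|=1, $$
and the integral curves are the meridians $\varphi=\mathrm{const}$, which are arcs of great circles, hence geodesics.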