
Suppose $g \in C^1([a,b]\times\mathbb{R}\times\mathbb{R})$. Let $S = \{ f \in C^1([a,b]): f(a)=a_0, f(b)=b_0\}$. I am trying to show that if $f \in S$ satisfies $$ \frac{d}{dx}g_z(x,f(x),f'(x))= g_y(x,f(x),f'(x)) $$ and for every $x,y$ we have $z\longmapsto g(x,y,z)$ is convex, then $f$ minimizes $$ J[f] = \int_a^bg(x,f(x),f'(x))dx $$ over $S$.

I see that if, for every $x$, the function $(y,z) \longmapsto g(x,y,z)$ were convex, then a routine integration by parts would do the trick. But I am completely lost without this stronger convexity assumption.

Does anyone have any help for me?

EDIT: It might be helpful to know the trick for when we have convexity in the last two arguments, so here is what I meant: (suppose $f_0$ satisfies the properties)

$$ g(x,f,f') \geq g(x,f_0,f_0') + g_y(x,f_0,f_0')(f-f_0) + g_z(x,f_0,f_0')(f'-f_0') $$

Integrating, and applying integration by parts to the last term, gives

$$ \implies J[f] \geq J[f_0] + \int_a^b g_y(x,f_0,f_0')(f-f_0)\,dx + \left(\left[g_z(x,f_0,f_0')(f-f_0) \right]_a^b -\int_a^b \frac{d}{dx}g_z(x,f_0,f_0')\,(f-f_0)\,dx \right) $$ The boundary terms disappear since $f(a)=f_0(a)$ and $f(b)=f_0(b)$, and the two remaining integrals cancel by the Euler-Lagrange equation $\frac{d}{dx}g_z = g_y$, $$ \implies J[f] \geq J[f_0] + 0 + 0. $$
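To illustrate the convex-in-$(y,z)$ case numerically, here is a minimal sketch with the hypothetical choice $g(x,y,z) = y^2 + z^2$ on $[0,1]$ with $f(0)=0$, $f(1)=1$: the Euler-Lagrange equation is $f''=f$, solved by $f_0(x) = \sinh(x)/\sinh(1)$, and no boundary-respecting perturbation decreases $J$.

```python
import numpy as np

# Hypothetical convex-in-(y,z) example: g(x, y, z) = y^2 + z^2 on [0, 1].
# The Euler-Lagrange equation d/dx g_z = g_y reads 2 f'' = 2 f, i.e. f'' = f;
# with f(0) = 0, f(1) = 1 its solution is f0(x) = sinh(x) / sinh(1).
x = np.linspace(0.0, 1.0, 20001)
dx = x[1] - x[0]

def J(f):
    """Approximate J[f] = ∫ (f^2 + f'^2) dx by finite differences + trapezoid rule."""
    fp = np.gradient(f, x)
    integrand = f**2 + fp**2
    return np.sum(integrand[1:] + integrand[:-1]) * 0.5 * dx

f0 = np.sinh(x) / np.sinh(1.0)   # Euler-Lagrange solution; exactly J[f0] = coth(1)
J0 = J(f0)

# Perturbations vanishing at the endpoints keep the competitor inside S.
for phi in (np.sin(np.pi * x), x * (1.0 - x)):
    for eps in (-0.4, 0.1, 0.5):
        assert J(f0 + eps * phi) >= J0   # joint convexity: f0 minimizes J over S
```

This only demonstrates the (already proved) jointly convex case; the question is whether the conclusion survives with convexity in $z$ alone.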

EDIT 2: I spent all day trying to prove this myself and still have made no progress. Please if anyone has help for me I would greatly appreciate it.

  • 0
    How much regularity are you willing to assume? A priori for any $f\in S$, the expression $g_z(x,f(x),f'(x))$ may not be differentiable in $x$...2012-02-13
  • 0
    The set $S$ is a subset of $C^1([a,b])$, the space of functions on $[a,b]$ with continuous first derivative. Similarly, $g$ is of class $C^1$ on its domain.2012-02-14
  • 0
    Whoops, ignore my previous comment. We are assuming $f$ satisfies the Euler-Lagrange equation, which of course requires $g_z(x,f(x),f'(x))$ to be differentiable in $x$.2012-02-14

2 Answers

1

A counterexample: let $g(x,y,z) = -4 y^2 + z^2$, which is strictly convex in $z$ for each fixed $y$. The Euler-Lagrange equation reads $$ g_z(x,y,z)= 2z, \quad g_y(x,y,z)= -8 y \implies f'' = -4 f. $$

Let the boundary conditions be $f(0) = f(\pi) = 1$. A solution to the ELE is $f(x) = \cos (2x)$.

$$J[f] = \int_0^\pi -4 \cos^2(2x) + 4\sin^2(2x) \mathrm{d}x = 0 $$

On the other hand the constant function $h(x) = 1$ satisfies

$$ J[h] = \int_0^\pi -4 \mathrm{d}x = -4\pi < J[f] $$

so $f$ is not a minimiser.
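The comparison above is easy to confirm numerically; the following sketch approximates $J$ with the trapezoid rule (the grid size is an arbitrary choice):

```python
import numpy as np

# Numerical check of the counterexample g(x, y, z) = -4 y^2 + z^2 on [0, pi].
x = np.linspace(0.0, np.pi, 200001)
dx = x[1] - x[0]

def J(f, fp):
    """Trapezoid-rule approximation of ∫ (-4 f^2 + f'^2) dx."""
    integrand = -4.0 * f**2 + fp**2
    return np.sum(integrand[1:] + integrand[:-1]) * 0.5 * dx

Jf = J(np.cos(2 * x), -2 * np.sin(2 * x))   # Euler-Lagrange solution f(x) = cos(2x)
Jh = J(np.ones_like(x), np.zeros_like(x))   # constant competitor h ≡ 1

assert abs(Jf) < 1e-6               # J[f] = 0
assert abs(Jh + 4 * np.pi) < 1e-6   # J[h] = -4π
assert Jh < Jf                      # so f is not a minimiser
```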

In fact, if you take the family of functions

$$h_k(x) = 1 + kx(\pi-x) $$

one can compute

$$J[h_k] = -4 \pi - \frac{4}{3} k\pi^3 - \frac{2}{15}k^2 \pi^5 + \frac{1}{3} k^2\pi^3 $$

Since $\frac{2}{5}\pi^2 > \frac{18}{5} > 1$, the coefficient of $k^2$, namely $\pi^3\left(\frac{1}{3} - \frac{2}{15}\pi^2\right)$, is negative. Hence not only does $J[h_k] < 0$ hold for every $k > 0$; we also have

$$ \lim_{k\to\infty} J[h_k] = -\infty $$

so your functional is in fact unbounded from below over $S$.
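One can check the closed form and the divergence numerically; this sketch uses $h_k(x) = 1 + kx(\pi-x)$, the form consistent with the displayed value of $J[h_k]$, and the particular values of $k$ are arbitrary:

```python
import numpy as np

pi = np.pi
x = np.linspace(0.0, pi, 200001)
dx = x[1] - x[0]

def J_numeric(k):
    """Trapezoid-rule value of J[h_k] for h_k(x) = 1 + k x (pi - x)."""
    h = 1.0 + k * x * (pi - x)
    hp = k * (pi - 2.0 * x)
    integrand = -4.0 * h**2 + hp**2
    return np.sum(integrand[1:] + integrand[:-1]) * 0.5 * dx

def J_closed(k):
    """The closed form of J[h_k] derived above."""
    return -4*pi - (4/3)*k*pi**3 - (2/15)*k**2*pi**5 + (1/3)*k**2*pi**3

for k in (0.5, 1.0, 5.0, 50.0):
    assert abs(J_numeric(k) - J_closed(k)) <= 1e-4 * abs(J_closed(k))
    assert J_closed(k) < 0

# The k^2 coefficient pi^3 (1/3 - 2 pi^2 / 15) is negative, so J[h_k] -> -inf.
assert pi**3 * (1/3 - 2 * pi**2 / 15) < 0
```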

  • 0
    I think you left out a $k$ somewhere... $h_k$ doesn't depend on $k$?2012-02-14
  • 0
    @Kb100: thanks. fixed.2012-02-14
0

Consider the function $g(t,x,y) = y$. Then both $(x,y) \mapsto g(t,x,y) = y$ and $y \mapsto g(t,x,y) = y$ are convex. Furthermore $$ g_{x} = \frac{d}{dt}g_{y} = 0 . $$ For each $f \in S$ we have $$ \int_{a}^{b}g(t,f(t),\dot{f}(t))dt = \int_{a}^{b}\dot{f}(t)dt = [f(t)]_{a}^{b} = f(b) - f(a) = b_{0} - a_{0}. $$ Therefore $J$ is constant on $S$: every $f \in S$ is a minimizer, so there are infinitely many minimizers.
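Numerically, every $f \in S$ indeed gives the same value; here is a quick sketch with the arbitrary (hypothetical) choices $[a,b] = [0,1]$, $a_0 = 2$, $b_0 = 5$, so $J[f] = 3$ for every competitor:

```python
import numpy as np

# For g(t, x, y) = y, J[f] = ∫ f'(t) dt = f(b) - f(a) = b0 - a0 for every f in S.
# Hypothetical data: [a, b] = [0, 1], a0 = 2, b0 = 5, so J[f] = 3 always.
t = np.linspace(0.0, 1.0, 100001)
dt = t[1] - t[0]

def J(fp):
    """Trapezoid-rule value of ∫ f'(t) dt."""
    return np.sum(fp[1:] + fp[:-1]) * 0.5 * dt

# Derivatives of three different f with f(0) = 2 and f(1) = 5.
derivatives = (
    3.0 * np.ones_like(t),                   # f(t) = 2 + 3t
    3.0 + 2 * np.pi * np.cos(2 * np.pi * t), # f(t) = 2 + 3t + sin(2πt)
    6.0 * t,                                 # f(t) = 2 + 3t²
)
for fp in derivatives:
    assert abs(J(fp) - 3.0) < 1e-8
```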

  • 0
    You are correct that your example does not have a minimum, however you left out one of my assumptions. For $g(x,y,z) = y$ we have $g_z =0$ and so $\frac{d}{dx}g_z = 0 \neq g_y = 1$ for any $f$. This theorem says that if there is a function $f$ satisfying the Euler-Lagrange equation, and $z \longmapsto g(x,y,z)$ is convex for every $x,y$, then $f$ minimizes $J$.2012-02-11
  • 0
    Your reasoning this time is also correct, but you used an extra assumption. The statement $g(x,f,f')\geq g(x,f_0,f_0')+g_y(x,f_0,f_0')(f-f_0)+ g_z(x,f_0,f_0')(f'-f_0')$ assumes convexity in the last two arguments; in this case I already showed that if $f_0$ has the desired property, then it is a minimum, so if $f_0,f$ both have the desired property, then they must be equal (as there can only be one minimum). The weaker convexity assumption only lets you say $g(x,f,f') \geq g(x,f,f_0') + g_z(x,f,f_0')(f'-f_0')$.2012-02-12
  • 0
    Ask this question on http://mathoverflow.net/. It will probably be answered there.2012-02-13