Suppose $g \in C^1([a,b]\times\mathbb{R}\times\mathbb{R})$. Let $S = \{ f \in C^1([a,b]): f(a)=a_0,\ f(b)=b_0\}$. I am trying to show that if $f \in S$ satisfies the Euler–Lagrange equation $$\frac{d}{dx}g_z(x,f(x),f'(x)) = g_y(x,f(x),f'(x)),$$ and if for every $(x,y)$ the map $z \longmapsto g(x,y,z)$ is convex, then $f$ minimizes $$J[f] = \int_a^b g(x,f(x),f'(x))\,dx$$ over $S$.
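For concreteness, the toy example I keep in mind (chosen by me purely as a sanity check; it does not depend on $y$ at all, so it does not really probe the weaker hypothesis) is the arc-length functional $g(x,y,z) = \sqrt{1+z^2}$. There the Euler–Lagrange equation reads $$\frac{d}{dx}\,\frac{f'(x)}{\sqrt{1+f'(x)^2}} = 0 \;\implies\; f' \equiv \text{const},$$ so $f$ is the straight line through $(a,a_0)$ and $(b,b_0)$, which does indeed minimize $J[f] = \int_a^b \sqrt{1+f'(x)^2}\,dx$ over $S$.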
I see that if, for every $x$, the function $(y,z) \longmapsto g(x,y,z)$ were convex, then a routine integration by parts would do the trick. But I am completely lost without that stronger convexity assumption.
Does anyone have any help for me?
EDIT: It might be helpful to see the trick for the case where we have convexity in the last two arguments, so here is what I meant. Suppose $f_0 \in S$ satisfies the Euler–Lagrange equation above and let $f \in S$ be arbitrary. Then, for each $x$, joint convexity of $(y,z) \longmapsto g(x,y,z)$ gives
$$g(x,f,f') \geq g(x,f_0,f_0') + g_y(x,f_0,f_0')(f-f_0) + g_z(x,f_0,f_0')(f'-f_0').$$
Now integrate both sides over $[a,b]$ and integrate the last term by parts:
$$\implies J[f] \geq J[f_0] + \int_a^b g_y(x,f_0,f_0')(f-f_0)\,dx + \left(\Big[g_z(x,f_0,f_0')(f-f_0)\Big]_a^b - \int_a^b g_y(x,f_0,f_0')(f-f_0)\,dx\right),$$ where in the last integral I used the Euler–Lagrange equation to replace $\frac{d}{dx}g_z(x,f_0,f_0')$ by $g_y(x,f_0,f_0')$. The boundary term vanishes since $f(a)=f_0(a)=a_0$ and $f(b)=f_0(b)=b_0$, and the two remaining integrals cancel, so $$J[f] \geq J[f_0] + 0 + 0.$$
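For completeness, the first inequality above is just the first-order condition for a differentiable convex function, applied for each fixed $x$ to $\phi(y,z) := g(x,y,z)$ at $w_0 = (f_0(x),f_0'(x))$: $$\phi(w) \geq \phi(w_0) + \nabla\phi(w_0)\cdot(w-w_0), \qquad w = (f(x),f'(x)).$$ With convexity in $z$ alone, the analogous pointwise inequality I can write down is only $$g(x,f(x),f'(x)) \geq g(x,f(x),f_0'(x)) + g_z(x,f(x),f_0'(x))\,\bigl(f'(x)-f_0'(x)\bigr),$$ whose base point is $(x,f(x),f_0'(x))$ rather than $(x,f_0(x),f_0'(x))$, and I do not see how to run the same integration by parts from there.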
EDIT 2: I have spent all day trying to prove this myself and have still made no progress. If anyone can offer any help, I would greatly appreciate it.