
Suppose $u = u(x)$ is a continuous function defined on the finite interval $[a, b]$ and $M =\max\{u(a), u(b)\}$.

(a) If $u(x)$ satisfies $u''(x) + g(x)u'(x) > 0$ for any $x\in(a, b)$, where $g(x)$ is a bounded function on $[a, b]$, then show that $u(x) < M$ for any $x\in(a, b)$.

(b) If $u(x)$ satisfies $u''(x) + g(x)u'(x) \geq 0$ for any $x\in(a, b)$ and if there exists an $x_0\in (a, b)$ such that $u(x_0) = M$, then $u(x) = M$ for any $x\in[a, b]$.

  • You have to add your try. (2017-02-02)

1 Answer


Hint: I will help you start the problem. Let us use the usual method for determining the global maximum and minimum on an interval.

For a function $u$ on $[a, b]$, the maximum value occurs either at an interior critical point or at a boundary point, i.e. \begin{align} \max_{a\leq x \leq b} u(x) = \max\left\{u(a), u(b),u(x_c)\right\} \end{align} where $x_c$ ranges over the critical points in $(a, b)$ (note that the differential inequality implicitly requires $u$ to be twice differentiable on $(a,b)$).

Hence it suffices to show that either no critical point attains the maximum value, or there are no critical points at all.

Let $x_c$ be a critical point of $u(x)$. Since $u'(x_c) = 0$, it follows that \begin{align} u''(x_c)+g(x_c)u'(x_c) = u''(x_c)>0, \end{align} which means $x_c$ is a strict local minimum, not a maximum.
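To see how this observation finishes part (a), here is one way to write out the contradiction argument (a sketch, assuming $u \in C^2(a,b)\cap C[a,b]$, which the differential inequality implicitly requires):

```latex
\begin{proof}[Sketch for (a)]
Suppose, for contradiction, that $u$ attains its maximum over $[a,b]$ at
some interior point $x_0 \in (a,b)$. Then $x_0$ is a critical point, so
$u'(x_0) = 0$, and the second-derivative test at an interior maximum
forces $u''(x_0) \le 0$. But the hypothesis at $x = x_0$ gives
\[
  u''(x_0) + g(x_0)\,u'(x_0) = u''(x_0) > 0,
\]
a contradiction. Hence the maximum of $u$ over $[a,b]$ is attained only
at the endpoints, so $u(x) < M = \max\{u(a), u(b)\}$ for every
$x \in (a,b)$.
\end{proof}
```

Part (b) is more delicate, since with the weak inequality $u''(x_0)$ may vanish at a critical point; there the standard approach is a Hopf-style argument comparing $u$ with an auxiliary exponential function.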