2

Let $f$ be a continuous function on $[a,b)$ that is twice differentiable on $(a,b)$ with $f''(x)>0$ for each $x \in (a,b)$. Prove that if $\lim_{x\to b^-}f(x) =\infty$, then $\lim_{x\to b^-}f'(x)=\infty$.
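As a quick illustration of the statement (not part of the problem, and assuming $b$ finite, which the notation $b^-$ suggests): take $f(x)=\frac{1}{b-x}$ on $[a,b)$. Then $ f''(x)=\frac{2}{(b-x)^3}>0, \qquad \lim_{x\to b^-}f(x)=\infty, \qquad f'(x)=\frac{1}{(b-x)^2}\to\infty \ \text{ as } x\to b^-, $ exactly as the claim predicts.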

4 Answers

0

Because of the notation $b^-$ I presume that $b$ is finite. Since $f''>0$, the derivative $f'$ is strictly increasing on $(a,b)$, so $m:=\lim_{x \to b^-}f'(x)$ exists in $(-\infty,+\infty]$. Suppose, for contradiction, that $m<\infty$. By monotonicity,

$ f'(x)< m \quad \forall \ x \in (a,b). $ Thus for every $x \in [a,b)$ we have \begin{eqnarray} f(x)&=&f(a)+\int_a^xf'(t)\,dt=f(a)+(x-a)m+\int_a^x\bigl(f'(t)-m\bigr)\,dt\\ &\le& f(a)+(x-a)|m|\le f(a)+|m|(b-a), \end{eqnarray} where the first inequality uses $(x-a)m\le(x-a)|m|$ together with the fact that the integrand $f'(t)-m$ is negative. Hence $f$ is bounded above by $f(a)+|m|(b-a)$ on $[a,b)$, which contradicts the fact that $f(x) \to \infty$ as $x \to b^-$.

1

Fix $c$ with $a<c<b$.

Then for $c\le x<b$, \begin{equation}f(x)=f(c)+\int_c^xf'(t)\,dt\end{equation} by the fundamental theorem of calculus. If $f'(x)$ tended to a finite limit as $x\to b^-$, then $f'$ would be bounded on some neighbourhood $(b-\delta,b)$. Also, $f'$ is differentiable (its derivative is $f''$), hence continuous, and so bounded on the compact interval $[c,b-\delta]$. Hence $f'$ is bounded on $[c,b)$; let $M$ be such a bound. Then for $x\ge c$, $f(x)\le f(c)+(b-c)M$.

Again, since $f$ is continuous on the compact interval $[a,c]$, it is bounded there. This shows that $f$ is bounded on $[a,b)$, contrary to the hypothesis that $f$ blows up as $x$ gets arbitrarily close to $b$.
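Spelled out (using $M$ and $c$ from above), the two bounds combine into a single one: $ f(x)\le \max\Bigl\{\max_{t\in[a,c]}f(t),\; f(c)+M(b-c)\Bigr\} \quad \forall\ x\in[a,b), $ and a function bounded above on $[a,b)$ cannot tend to $\infty$ at $b^-$.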

Any fallacies?


Let me add a concluding statement.

Hence the assumption that $\lim_{x\rightarrow b^-}f'(x)<\infty$ was wrong. Since $f'$ is increasing (because $f''>0$), its limit at $b^-$ exists in $(-\infty,+\infty]$, so that limit must be $+\infty$.

  • 0
Using the assumption of boundedness of $f'$, I've also given a bound for $f$ on $[c,b)$. So here $f$ is bounded on $[a,b)$. (2012-07-08)
0

As $f(x) \to \infty$ as $x \to b^-$, for each $i \in \mathbb{N}$ there exists some $b_i \in (a,b)$ such that $f(x) \geq i$ for all $x \in (b_i, b)$. We also know that $f$ is continuous and that $f'$ is increasing (since $f''>0$). So some ways to proceed would be either of the following:

  1. By considering, say, $\{10^n\}_{n \in \mathbb{N}}$ and the corresponding $b_{10^n}$, you can just use the mean value theorem (made concrete in the sketch below).

  2. Suppose not. Since $f'$ is monotone increasing, the sequence $f'(a_n)$, where $a_n = b - 1/2^n$ (for $n$ large enough that $a_n > a$), has a limit. By our assumption, it won't be infinite. But then you can bound $\lim_{x \to b^-} f(x)$.

Or you could use the MVT in some other way. I suspect that there are many different approaches to this problem relying on the MVT (and the fact that $f'(x)$ is increasing).
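To make the MVT idea concrete (a sketch, again taking $b$ finite): fix any $c \in (a,b)$. For $x \in (c,b)$ the mean value theorem gives some $\xi \in (c,x)$ with $ f'(\xi)=\frac{f(x)-f(c)}{x-c}. $ Since $f'$ is increasing and $0<x-c<b-c$, for all $x$ close enough to $b$ that $f(x)\ge f(c)$ we get $ f'(x)\ge f'(\xi)=\frac{f(x)-f(c)}{x-c}\ge\frac{f(x)-f(c)}{b-c}\longrightarrow\infty \quad (x\to b^-). $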

0

Suppose $|f'|$ is bounded on some interval $(b-\delta,b)$. Then for any $x$ and $y$ in $(b-\delta,b)$, the mean value theorem gives $ |f(x)-f(y)| \leq \Bigl( \sup_{(b-\delta,b)} |f'| \Bigr) |x-y|, $ so $f$ is Lipschitz near $b$, and therefore $\lim_{x \to b^-} f(x)$ exists and is finite. This contradiction shows that $|f'|$ is unbounded near $b$. But $f'$ is monotonically increasing (since $f''>0$), hence bounded below near $b$, so it must be unbounded above; an increasing function that is unbounded above near $b$ satisfies $f'(x) \to +\infty$ as $x \to b^{-}$.
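Filling in the "exists and is finite" step (a sketch, assuming $b$ finite and writing $K:=\sup_{(b-\delta,b)}|f'|$): if $x_n \to b^-$, then $ |f(x_n)-f(x_m)| \le K\,|x_n-x_m| \to 0, $ so $(f(x_n))$ is a Cauchy sequence and converges; the same Lipschitz bound shows the limit does not depend on the chosen sequence, hence $\lim_{x\to b^-}f(x)$ exists in $\mathbb{R}$.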