
Let $f$ be a function that has derivatives of order $2$. Furthermore, $\lim\limits_{x \to 0^+} f(x)=+\infty$ and $f''(x)>0$. Prove that $$\lim\limits_{x \to 0^+} f'(x)=-\infty.$$

5 Answers

1

The original question was: if $f$ has derivatives of order $2$ and $\lim_{x \to 0} f(x) = +\infty$, does that imply $\lim_{x \to 0} f'(x) = -\infty$? The answer to that is no.

Here: I'll suppose $f : \mathbb R \setminus \{0\} \to \mathbb R$ and $f \to +\infty$ as $x \to 0^+$. Take $f(x) = \frac 1x + \sin \left( \frac 1{x^2} \right)$. You obtain $$ f'(x) = \frac{-1}{x^2} -\frac{2 \cos \left( \frac 1{x^2} \right) }{x^{3}} = \frac{-x-2\cos \left( \frac 1{x^2} \right)}{x^3}. $$ You can take subsequences along which the derivative goes to $-\infty$ as well as $+\infty$ (by letting $x_n \to 0$ with $\cos\left(\frac 1{x_n^2}\right) = \pm 1$), so $f'$ has no asymptote, but rather some horrible behavior (oscillation between $-\infty$ and $+\infty$). Clearly this means the derivative does not necessarily converge to $-\infty$. But if it converges (in the extended sense), it does indeed go to $-\infty$, because it can clearly not be bounded or go to $+\infty$ if $f$ goes to $+\infty$...
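To make the oscillation fully explicit (a worked computation added here; the choice of points is mine): take $x_k = (2k\pi)^{-1/2}$ and $y_k = \big((2k+1)\pi\big)^{-1/2}$, so that $\cos(1/x_k^2) = 1$ and $\cos(1/y_k^2) = -1$. Then $$ f'(x_k) = \frac{-x_k - 2}{x_k^{3}} \;\longrightarrow\; -\infty, \qquad f'(y_k) = \frac{-y_k + 2}{y_k^{3}} \;\longrightarrow\; +\infty. $$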

EDIT: Now for the current question, the answer is yes. Since $f''(x) > 0$, $f'(x)$ is increasing, thus it suffices to find points $c_n \to 0$ along which $f'$ goes to $-\infty$. Consider the interval $(0,1)$. Define $x_1 = 1$ and choose $x_2$ in this interval so that $$ f(x_2) - f(x_1) > 1, $$ which is possible because $f$ goes to infinity. Suppose $x_n$ has been defined and choose $x_{n+1}$ in the interval $(0,x_n)$, with $x_{n+1} < \frac 1{n+1}$ say (so that $x_n \to 0$), such that $$ f(x_{n+1}) - f(x_n) > n. $$ By the mean value theorem, there exists $c_n \in (x_{n+1},x_n)$ such that $$ f'(c_n) = \frac{f(x_n)- f(x_{n+1})}{x_n-x_{n+1}} < \frac{-n}{x_n-x_{n+1}} \le -n, $$ using $f(x_n) - f(x_{n+1}) < -n$ and $0 < x_n - x_{n+1} \le 1$; moreover $c_n \to 0$ as $n \to \infty$. Since $f'$ is increasing, this gives you $$ \forall n \in \mathbb N, \quad \forall\, 0 < x \le c_n, \quad f'(x) \le f'(c_n) < -n. $$ This means $f'(x) \to -\infty$.
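As a concrete illustration of this construction (a sanity check added here, not part of the original answer): for $f(x) = \frac 1x$, choosing $x_{n+1}$ just at the threshold $f(x_{n+1}) - f(x_n) = n$ means solving $$ \frac 1{x_{n+1}} - \frac 1{x_n} = n \quad\Longrightarrow\quad x_{n+1} = \frac{x_n}{1 + n\,x_n}, $$ and here indeed $f'(x) = -\frac 1{x^2} \to -\infty$ as $x \to 0^+$.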

Hope that helps,

  • 0
    Thank you, I needed this example, but I have a question: how about $f'' > 0$ and $f(x) \to +\infty$ as $x \to 0$? I guess $f'(x) \to -\infty$ as $x \to 0$. 2011-07-20
  • 0
    ...ask it? =) What's your question? 2011-07-20
  • 0
    @Patrick: the OP edited the question, you may want to re-visit this answer. 2011-07-20
  • 0
    I didn't remove my old answer because I thought it was helpful at first. I edited my answer to add the answer to the new question. 2011-07-20
  • 0
    Saw other people's answers... and this makes me realize I just love sequences, don't know why. Even though they make things longer sometimes. 2011-07-20
1

Notice that $$f''(x)>0.$$ It means $f'(x)$ is an increasing function, and then the shape of the function $f(x)$ is (approximately) like this:

[figure: sketch of the graph of $f$, convex and tending to $+\infty$ as $x \to 0^+$]

So you can see that $$\lim_{x\rightarrow 0^+}f'(x)=-\infty.$$
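For a concrete instance of this picture (an example added for illustration): $$ f(x) = \frac 1x, \qquad f''(x) = \frac{2}{x^{3}} > 0 \ \text{ on } (0,\infty), \qquad f'(x) = -\frac 1{x^{2}} \;\longrightarrow\; -\infty \ \text{ as } x \to 0^+. $$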

  • 1
    Something is missing here... maybe *rigor*? 2011-07-20
  • 1
    Intuition is important. Thanks for the picture. +1 2011-07-20
1

Note that if $\lim\limits_{x \to 0^+} f'(x)$ exists (in the extended sense) then applying L'Hôpital's rule we deduce

$$ 0 \;=\; \lim_{x \to 0^+} \frac{x}{f(x)} \;=\; \lim_{x \to 0^+} \frac{1}{f'(x)}, $$

hence $\lim\limits_{x \to 0^+} f'(x) = \pm\infty$, necessarily $-\infty$ since $f'' > 0 \Rightarrow f'$ increasing. (The hypothesis is harmless here: $f'' > 0$ makes $f'$ monotone, so $\lim\limits_{x \to 0^+} f'(x)$ always exists in the extended reals.)
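As a quick sanity check of the L'Hôpital step (an illustration added here): with $f(x) = \frac 1x$, $$ \lim_{x \to 0^+} \frac{x}{f(x)} = \lim_{x \to 0^+} x^2 = 0 \qquad\text{and}\qquad \lim_{x \to 0^+} \frac{1}{f'(x)} = \lim_{x \to 0^+} \left(-x^2\right) = 0, $$ consistent with $f'(x) = -\frac 1{x^2} \to -\infty$.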

  • 0
    What if the limit of $f'$ didn't converge to $\pm\infty$? You supposed it did; nothing shows it does... =) (just messing with you haha) 2011-07-20
  • 0
    @Pat No, the above proof only assumes the *existence* of $\lim f'$. 2011-07-20
  • 0
    Oh, I didn't notice your answer was not trying to answer but was assuming the limit existed. Sorry. I can be a loudmouth sometimes. Heehee 2011-07-20
0

By the mean value theorem, for any $0< x <1$, $$ f(1) - f(x) = f'(\xi)(1-x) $$ for some $\xi \in (x,1)$. Since $f'' > 0$, $f'$ is increasing. So from $\xi > x$, it follows that $f'(\xi) \geq f'(x)$. Hence $$ f(1) - f(x) \geq f'(x)(1-x), \qquad\text{i.e.}\qquad f'(x) \le \frac{f(1)-f(x)}{1-x}. $$ Now note that, by the assumption on $f$, $$ \lim_{x \to 0^+} \big(f(1) - f(x)\big) = -\infty, $$ and further note that $$ \lim_{x \to 0^+} (1 - x) = 1, $$ to conclude that $$ \lim_{x \to 0^+} f'(x) = -\infty. $$
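To see the bound in action (a worked example added here, not part of the original answer): take $f(x) = -\ln x$, so that $f(1) = 0$ and $f'(x) = -\frac 1x$. The inequality reads $$ -\frac 1x \;\le\; \frac{0 - (-\ln x)}{1-x} \;=\; \frac{\ln x}{1-x}, $$ and both sides indeed tend to $-\infty$ as $x \to 0^+$.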

  • 1
    Why are you posting a duplicate answer to a duplicate question? 2011-07-20
  • 0
    I will delete one of the answers later on. 2011-07-20
  • 0
    Did it for you. ('cause I merged the two questions) 2011-07-20
  • 0
    There is a problem in this solution! You can't take the limit in $f(1) - f(x) \geq f'(x)(1-x)$. 2011-07-22
  • 0
    @babgen: There isn't a problem. Note that $f'(x) \le \frac{f(1) - f(x)}{1 - x} \to -\infty$ as $x \to 0^+$ to conclude that also $f'(x) \to -\infty$ as $x \to 0^+$. 2011-07-22
0

Here's how you can make ks0830's idea rigorous; I'll suppose $f$ is defined on the interval $(0,1]$:

Since $f$ is unbounded, $f'$ must also be unbounded (otherwise $f$ would be Lipschitz continuous and hence bounded). Moreover, $f''>0$ implies that $f'$ is increasing, so $f'$ is bounded above by $f'(1)$ and must therefore be unbounded below; by monotonicity, $f'(x) \to \inf_{(0,1]} f' = -\infty$ as $x\to 0^+$.
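To spell out the Lipschitz step (a detail added for completeness): if $|f'| \le M$ held on $(0,1]$, the mean value theorem would give, for every $x \in (0,1]$, $$ |f(x) - f(1)| \le M\,|x - 1| \le M, $$ so $f$ would be bounded on $(0,1]$, contradicting $f(x) \to +\infty$ as $x \to 0^+$.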