I have the inequality

$f''(x)x + f'(x) \leq 0$

Also, $f''(x)<0$, $f'(x)>0$, and $x \in \mathbb{R}^+$. I need to figure out when it is true. I know it is a fairly general question, but I couldn't find any information in the several textbooks I skimmed. I am also not sure whether integrating the inequality would require reversing its sign, so I can't go ahead and try to manipulate it myself.

Any help or mention of a helpful source would be much appreciated.

Edit: I forgot to mention that $f(x)\geq 0$ for every $x \in \mathbb{R}^+$.

2 Answers


$0\geq f''(x)x+f'(x)=(xf'(x))'$,

so the function $xf'(x)$ is non-increasing on the positive reals. Since $xf'(x)\to 0$ as $x\to 0^+$ (assuming $f'$ stays bounded near $0$), monotonicity forces $xf'(x)\leq 0$ for all $x \in \mathbb{R}^+$, which contradicts $f'(x)>0$ and $x>0$. Hence the inequality cannot hold on all of $\mathbb{R}^+$.
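To see concretely why the inequality must fail, here is a quick symbolic check with sympy. The test functions $\sqrt{x}$ and $\log(1+x)$ are my own choices for illustration (not from the question); both are non-negative, increasing, and concave on $\mathbb{R}^+$, yet $xf''(x)+f'(x)$ comes out strictly positive for each:

```python
import sympy as sp

# x restricted to the positive reals, matching the question.
x = sp.symbols('x', positive=True)

# Two concave, increasing, non-negative test functions (my own choices).
for f in (sp.sqrt(x), sp.log(1 + x)):
    assert sp.diff(f, x).is_positive       # f'(x) > 0
    assert sp.diff(f, x, 2).is_negative    # f''(x) < 0
    expr = sp.simplify(x * sp.diff(f, x, 2) + sp.diff(f, x))
    # In both cases x f''(x) + f'(x) simplifies to a strictly positive
    # expression, consistent with the impossibility argument above.
    assert expr.is_positive
```

For $f(x)=\sqrt{x}$ the combination simplifies to $\frac{1}{4\sqrt{x}}$, and for $f(x)=\log(1+x)$ to $\frac{1}{(1+x)^2}$, both positive everywhere on $\mathbb{R}^+$.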


No.

It is not true that if $f(x) \geq 0$, $f'(x) >0$ and $f''(x) < 0$ for all $x \in \mathbb{R}^+$, then $x f''(x) + f'(x) \leq 0$.

Below is a class of counterexamples.

Consider $f(x) = 1-\exp(-x) > 0$. We then have that $f'(x) = \exp(-x) > 0$ and $f''(x) = -\exp(-x) < 0$. However, $xf''(x) + f'(x) = -x\exp(-x) + \exp(-x) = (1-x) \exp(-x)$ which is negative only when $x>1$.

In general, you can consider $f(x) = 1 - \exp(-\alpha x) > 0$, where $\alpha > 0$.

We then have that $f'(x) = \alpha \exp(-\alpha x) > 0$ and $f''(x) = -\alpha^2 \exp(-\alpha x) < 0$.

Hence, we get that $x f''(x) + f'(x) = -\alpha^2 x\exp(-\alpha x) + \alpha \exp(-\alpha x) = \alpha \exp(-\alpha x) (1 - \alpha x)$.

The above is negative only when $x > \dfrac1{\alpha}$.
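The computation for this family can be verified symbolically, e.g. with sympy (a sketch; the sample value $\alpha = 2$ is my own choice, so the sign flips at $x = 1/\alpha = 1/2$):

```python
import sympy as sp

x, a = sp.symbols('x alpha', positive=True)
f = 1 - sp.exp(-a * x)                       # the family f(x) = 1 - exp(-alpha*x)

expr = x * sp.diff(f, x, 2) + sp.diff(f, x)  # x f''(x) + f'(x)
closed = a * sp.exp(-a * x) * (1 - a * x)    # the closed form derived above
assert sp.simplify(expr - closed) == 0

# Sample alpha = 2: positive before x = 1/alpha, negative after.
val_before = expr.subs({a: 2, x: sp.Rational(1, 4)})  # x < 1/alpha
val_after = expr.subs({a: 2, x: 1})                   # x > 1/alpha
assert val_before.is_positive
assert val_after.is_negative
```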