
Let $f:[a,b]\rightarrow \mathbb R$ be a continuous function such that

$\frac{f(x) - f(a)}{x-a}$

is an increasing function of $x\in [a,b]$. Is $f$ necessarily convex? What if we also assume that

$\frac{f(b) - f(x)}{b-x}$

is increasing in $x$?

  • I am adding the condition that $f$ be continuous. (2012-07-18)

2 Answers


A counterexample (to both) can be constructed from the function $f(x)=x^2$ on the interval $[0,1]$ by adding a little bump to the graph, say, near the point $(1/2,1/4)$.

  • Ah! I thought you meant a vertical shift to give a jump discontinuity. That makes more sense. (2012-07-18)
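A minimal numerical sketch of the bump construction above, assuming one concrete choice of bump (the $\sin^2$ profile of height $0.005$ supported on $[0.45,0.55]$ is my own illustration, not part of the answer): it checks that both difference quotients from the question remain increasing, while $f$ rises above a chord and so fails to be convex.

```python
import numpy as np

# Hypothetical concrete bump (my own choice): f(x) = x^2 plus a small,
# C^1 sin^2 bump of height eps supported on [lo, hi] near (1/2, 1/4).
a, b = 0.0, 1.0
eps, lo, hi = 0.005, 0.45, 0.55

def f(x):
    x = np.asarray(x, dtype=float)
    bump = np.where((x > lo) & (x < hi),
                    eps * np.sin(np.pi * (x - lo) / (hi - lo)) ** 2,
                    0.0)
    return x ** 2 + bump

xs = np.linspace(a + 1e-6, b - 1e-6, 20001)

q1 = (f(xs) - f(a)) / (xs - a)   # slope of the chord from (a, f(a))
q2 = (f(b) - f(xs)) / (b - xs)   # slope of the chord to (b, f(b))

print("q1 increasing:", bool(np.all(np.diff(q1) > 0)))   # expect True
print("q2 increasing:", bool(np.all(np.diff(q2) > 0)))   # expect True

# Convexity fails: the graph rises above the chord over [0.45, 0.55].
chord_mid = 0.5 * (float(f(0.45)) + float(f(0.55)))
print("f(0.5) > chord midpoint:", float(f(0.5)) > chord_mid)  # expect True
```

The only role of the constants is to keep the bump's slope small compared with the slope of the unperturbed quotients, while its curvature is large enough to lift the graph above the chord over $[0.45,0.55]$.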

Yes, $f$ is necessarily convex (with the added continuity assumption), if $\displaystyle x\mapsto \frac{f(x)-f(a)}{x-a}$ is increasing on $(a,b]$. Let $a < x < y \leqslant b$ and define $t\in (0,1)$ by $x=ta+(1-t)y$. Since $x-a=(1-t)(y-a)$, we have $\displaystyle \frac{f(x)-f(a)}{(1-t)(y-a)} = \frac{f(x)-f(a)}{x-a}\leqslant \frac{f(y)-f(a)}{y-a}$, hence $f(x)\leqslant tf(a) +(1-t)f(y)$.
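For completeness, the final rearrangement spelled out: multiplying the displayed inequality by $(1-t)(y-a)>0$ gives

$$f(x) - f(a) \;\leqslant\; (1-t)\bigl(f(y)-f(a)\bigr) \quad\Longrightarrow\quad f(x) \;\leqslant\; f(a) - (1-t)f(a) + (1-t)f(y) \;=\; t f(a) + (1-t) f(y).$$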