Let $f:[a,b]\rightarrow \mathbb R$ be a continuous function such that
$$\frac{f(x) - f(a)}{x-a}$$
is an increasing function of $x\in (a,b]$. Is $f$ necessarily convex? What if we also assume that
$$\frac{f(b) - f(x)}{b-x}$$
is increasing in $x\in [a,b)$?
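(For context, a standard fact: $f$ is convex on $[a,b]$ if and only if for *every* fixed basepoint $c\in[a,b]$ the slope function
$$x\mapsto\frac{f(x) - f(c)}{x-c}$$
is nondecreasing on $[a,b]\setminus\{c\}$. So the question is whether the two endpoint basepoints $c=a$ and $c=b$ alone already suffice.)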
A counterexample (to both) can be constructed from the function $f(x)=x^2$ on the interval $[0,1]$ by adding a little bump to the graph, say, near the point $(1/2,1/4)$. The point is that for $f(x)=x^2$ the two slope functions are $x$ and $1+x$, whose derivatives equal $1$; a bump that is small in both value and derivative perturbs each slope function too little to destroy its monotonicity, while the bump can still oscillate sharply enough to make $f''<0$ somewhere, so $f$ is not convex.
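For concreteness, here is a minimal numerical sketch of one such bump; the $\sin^2$ window on $[0.4,0.6]$ and the height $0.01$ are my own ad hoc choices, not specified in the answer:

```python
import numpy as np

# f(x) = x^2 plus a small smooth bump supported on [0.4, 0.6],
# centered at (1/2, 1/4 + 0.01).  Height and profile are ad hoc choices.
def bump(x):
    return np.where((x > 0.4) & (x < 0.6),
                    0.01 * np.sin(5 * np.pi * (x - 0.4)) ** 2,
                    0.0)

def f(x):
    return x ** 2 + bump(x)

x = np.linspace(0.0, 1.0, 100_001)[1:-1]   # avoid the endpoints themselves

g = (f(x) - f(0.0)) / (x - 0.0)            # slope of the chord from (0, f(0))
h = (f(1.0) - f(x)) / (1.0 - x)            # slope of the chord to (1, f(1))

print(np.all(np.diff(g) > 0))              # True: g is increasing
print(np.all(np.diff(h) > 0))              # True: h is increasing

# Convexity fails: the graph at x = 1/2 lies above the chord
# joining (0.45, f(0.45)) and (0.55, f(0.55)).
print(f(0.5) > 0.5 * (f(0.45) + f(0.55)))  # True, so f is not convex
```

With these constants one can check by hand that both perturbation terms are divided by $x^2$ resp. $(1-x)^2\ge 0.16$ on the bump's support, so each slope derivative stays above $1-0.11/0.16>0$, while $f''(1/2)=2-2(0.01)(5\pi)^2<0$.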
Yes, $f$ is necessarily convex (using the added assumption that $f$ is continuous) if $\displaystyle x\mapsto \frac{f(x)-f(a)}{x-a}$ is increasing over $(a,b]$.
Let $a