Let $f$ be a sequentially continuous real-valued function defined on $(a,b)$. Suppose there exists $\lambda \in (0,1)$ such that for all $x,y \in (a,b)$, $f(\lambda x + (1-\lambda) y) \le \lambda f(x) + (1-\lambda) f(y)$.
How can I prove that $f$ is convex?
I proved this when $\lambda = 1/2$, and I believe the same conclusion should hold for arbitrary $\lambda \in (0,1)$. Here is my attempt at the general case.
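(For context, my $\lambda = 1/2$ argument was essentially the standard dyadic one; I sketch it here in case it suggests how to generalize. Repeated midpoint averaging gives, by induction on $n$,
$$f\Big(\tfrac{k}{2^n}\,x+\Big(1-\tfrac{k}{2^n}\Big)y\Big)\ \le\ \tfrac{k}{2^n}\,f(x)+\Big(1-\tfrac{k}{2^n}\Big)f(y),\qquad 0\le k\le 2^n,$$
and the dyadic rationals are dense in $[0,1]$, so sequential continuity handles the remaining values.)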
Let $A=\{m\in [0,1] \mid \forall x,y \in (a,b),\ f(mx+(1-m)y)\le m f(x) + (1-m) f(y)\}$.
It is easy to see that $j,k\in A \Rightarrow \lambda j + (1-\lambda) k \in A$.
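For completeness, the verification is a single application of the hypothesis: given $j,k\in A$ and $x,y\in(a,b)$, set $u=jx+(1-j)y$ and $v=kx+(1-k)y$ (both lie in $(a,b)$); then
$$\begin{aligned}
f\big((\lambda j+(1-\lambda)k)\,x+\big(1-\lambda j-(1-\lambda)k\big)\,y\big)
&=f(\lambda u+(1-\lambda)v)\\
&\le \lambda f(u)+(1-\lambda)f(v)\\
&\le \big(\lambda j+(1-\lambda)k\big)f(x)+\big(1-\lambda j-(1-\lambda)k\big)f(y),
\end{aligned}$$
using $j,k\in A$ in the last step. Note also that $0,1\in A$ trivially.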
The problem is that I don't know how to show $A$ is dense in $[0,1]$. (When $\lambda=1/2$, $A$ contains every dyadic rational, so it is dense.)
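For reference, once density is established, sequential continuity finishes the proof just as in the $\lambda=1/2$ case: given $m\in[0,1]$, pick $m_n\in A$ with $m_n\to m$; then $m_nx+(1-m_n)y\to mx+(1-m)y$ in $(a,b)$, and
$$f(mx+(1-m)y)=\lim_{n\to\infty}f\big(m_nx+(1-m_n)y\big)\le\lim_{n\to\infty}\big(m_nf(x)+(1-m_n)f(y)\big)=mf(x)+(1-m)f(y).$$
So the density of $A$ is really the only missing piece.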
Help!