
Exercise 3.32 on page 119 of Convex Optimization asks for a proof that if $f:\mathbb{R}\rightarrow\mathbb{R}$ and $g:\mathbb{R}\rightarrow\mathbb{R}$ are both convex, both nondecreasing (or both nonincreasing), and positive, then $h:\mathbb{R}\rightarrow\mathbb{R}:x\mapsto f(x)g(x)$ is also convex.

Are there any generalisations or analogues of this claim to the case where both $f$ and $g$ are convex functions mapping $\mathbb{R}^n$ to $\mathbb{R}$, for $n>1$?

  • Why is it necessary that $f,g$ be monotone? (2018-09-26)

1 Answer


Yes, a generalization is possible. Here is an elementary approach to the convexity of the product of two nonnegative convex functions defined on a convex domain of $\mathbb{R}^n$.

Choose $x$ and $y$ in the domain and $t$ in $[0,1]$. Your aim is to prove that $\Delta\ge0$, where $$\Delta=t(fg)(x)+(1-t)(fg)(y)-(fg)(tx+(1-t)y).$$ But $f$ and $g$ are nonnegative and convex, hence $$(fg)(tx+(1-t)y)\le\bigl(tf(x)+(1-t)f(y)\bigr)\bigl(tg(x)+(1-t)g(y)\bigr).$$ Expanding the right-hand side and collecting the $t^2$ and $(1-t)^2$ terms with the first two terms of $\Delta$, one sees that $\Delta\ge t(1-t)D(x,y)$ with $$D(x,y)=f(x)g(x)+f(y)g(y)-f(x)g(y)-f(y)g(x),$$ that is, $$D(x,y)=\bigl(f(x)-f(y)\bigr)\bigl(g(x)-g(y)\bigr).$$ This proves a generalization of the result you quoted to any product of convex nonnegative functions $f$ and $g$ such that $D(x,y)\ge0$ for every $x$ and $y$ in the domain; in dimension one, simultaneous monotonicity of $f$ and $g$ is exactly what guarantees this sign condition.

In particular, if $f$ and $g$ are differentiable at a point $x$ in the domain, the local version of this condition asks that their gradients $\nabla f(x)$ and $\nabla g(x)$ be such that $z^{*}M(x)z\ge0$ for every $n\times 1$ vector $z$, where $M(x)$ is the $n\times n$ matrix $$M(x)=\nabla f(x)\,(\nabla g(x))^{*}.$$ Note that $z^{*}M(x)z=\bigl((\nabla f(x))^{*}z\bigr)\bigl((\nabla g(x))^{*}z\bigr)$, so this says that the directional derivatives of $f$ and $g$ have the same sign in every direction $z$.

  • Thanks a lot for your answer and all the details! (2011-03-18)