
I would like to know if every sufficiently differentiable function is convex near a local minimum. The background to my question is that I became curious if one could motivate the usefulness of convex optimization techniques by saying that at least locally they work for all continuous functions.

Unfortunately I discovered there are $C^1$-functions which have minima where they are not locally convex. But my construction involves the use of everywhere non-differentiable continuous functions so I started to wonder if perhaps $C^2$-functions are convex near a local minimum.

2 Answers


No, this isn't true. Let $\phi(x)$ be a nonnegative smooth "bump" function that is zero except on $(0,1)$ where it is positive. Then $\psi(x) = \sin^2({1 \over x})\phi(x)$ is a smooth nonnegative function (define $\psi(0) = 0$) which has zeroes at $x = {1 \over k\pi}$ for positive integers $k$ but is positive between these zeroes. $\psi(x)$ will still have a local minimum at zero because $\psi(0) = 0$, but because of the humps in $\psi(x)$ between zeroes, a chord connecting $({1 \over k\pi},0)$ to $({1 \over (k +1)\pi},0)$ will lie below the graph. So $\psi(x)$ is not convex.
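This counterexample can be sketched numerically. The following is a minimal check, using the standard $e^{-1/(x(1-x))}$ bump as one possible choice of $\phi$ (the specific bump is my choice, not fixed by the answer): the zeroes at $x = \frac{1}{k\pi}$ evaluate to (essentially) zero, while $\psi$ is positive at the midpoint between consecutive zeroes, so the graph lies above the chord there.

```python
import math

def bump(x):
    """Smooth bump: positive on (0, 1), zero elsewhere."""
    if 0 < x < 1:
        return math.exp(-1.0 / (x * (1.0 - x)))
    return 0.0

def psi(x):
    """sin^2(1/x) * bump(x), extended by psi(0) = 0."""
    if x == 0:
        return 0.0
    return math.sin(1.0 / x) ** 2 * bump(x)

# Chord test between consecutive zeroes x = 1/(k*pi) and x = 1/((k+1)*pi):
# psi vanishes at both endpoints but is positive at the midpoint, so the
# midpoint of the graph lies strictly above the chord -- convexity fails.
k = 20
a, b = 1.0 / (k * math.pi), 1.0 / ((k + 1) * math.pi)
mid = 0.5 * (a + b)
print(psi(a), psi(b), psi(mid))  # endpoints ~0, midpoint positive
```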

I should add that if $x_0$ is a local minimum of $f(x)$ such that $f^{(l)}(x_0)$ is nonzero for some $l > 0$, then $f$ will be convex on some interval centered at $x_0$, provided $f(x)$ is $C^{l+1}$. To see this, note that without loss of generality we may assume $l$ is minimal. Taylor expanding gives $$f(x) = f(x_0) + {1 \over l!} f^{(l)}(x_0)(x - x_0)^l + O((x - x_0)^{l+1}).$$ If $|x - x_0|$ is sufficiently small, the remainder term is dominated by the ${1 \over l!} f^{(l)}(x_0)(x - x_0)^l$ term, so the only way a local minimum can occur is if $l$ is even and $f^{(l)}(x_0) > 0$. Next, note that the second derivative of $f$ has Taylor expansion $${d^2 f \over dx^2} = {1 \over (l-2)!} f^{(l)}(x_0)(x - x_0)^{l-2} + O((x - x_0)^{l-1}).$$ The remainder term is dominated by the first term once again, and that first term is nonnegative on an interval containing $x_0$ since $l$ is even and $f^{(l)}(x_0) > 0$. Thus the second derivative of $f$ is nonnegative on this interval, and therefore the function is convex there.
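As a quick sanity check of this argument, here is a numerical sketch with the illustrative choice $f(x) = x^4 + x^5$ (my own example, so $l = 4$ and $f^{(4)}(0) = 24 > 0$, while $f''(0) = 0$): the second derivative is nonnegative on a whole interval around the minimum, so $f$ is convex there.

```python
# f(x) = x^4 + x^5 has a local minimum at 0 with f''(0) = 0, but
# f''(x) = 12x^2 + 20x^3 = x^2 (12 + 20x) >= 0 for x >= -0.6,
# matching the Taylor-domination argument above.
def f2(x):
    return 12 * x**2 + 20 * x**3

xs = [i / 1000.0 for i in range(-500, 501)]
print(all(f2(x) >= 0 for x in xs))  # True: f is convex on [-0.5, 0.5]
```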

  • Your added argument can be extended to higher dimensions: if $f$ is a $C^{2}$ function having vanishing gradient at $x_{0}$ and positive definite Hessian $Hf(x_{0})$ at $x_{0}$ (in particular, $x_{0}$ is an isolated minimum), then $f$ is strictly convex in a neighborhood of $x_{0}$. Indeed, the positive definite matrices are open in the space of symmetric matrices, $x \mapsto Hf(x)$ is continuous since $f$ is $C^{2}$, and a function with positive definite Hessian is strictly convex. – 2011-01-29
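The openness argument in this comment can be illustrated numerically. The function $f(x,y) = x^2 + y^2 + x^3$ below is my own illustrative choice (not from the comment): its gradient $(2x + 3x^2,\, 2y)$ vanishes at the origin, and its Hessian stays positive definite on a whole neighborhood of the origin.

```python
def hessian(x, y):
    # Hessian of the illustrative example f(x, y) = x^2 + y^2 + x^3.
    return [[2.0 + 6.0 * x, 0.0], [0.0, 2.0]]

def is_positive_definite(h):
    # Sylvester's criterion for a symmetric 2x2 matrix:
    # both leading principal minors must be positive.
    return h[0][0] > 0 and h[0][0] * h[1][1] - h[0][1] * h[1][0] > 0

# Positive definiteness persists on a grid covering [-0.3, 0.3]^2,
# so f is strictly convex on that neighborhood of the origin.
pts = [(i / 10.0, j / 10.0) for i in range(-3, 4) for j in range(-3, 4)]
print(all(is_positive_definite(hessian(x, y)) for x, y in pts))  # True
```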

Say we have $y = f(x_1, x_2, \ldots)$ with $(x_1, x_2, \ldots)$ in at least two dimensions.

We can construct a smooth $f$ such that $f(x, 0, 0, \ldots) = a x^2$, $f(0, x, 0, \ldots) = a x^2$, and $f(x, x, 0, \ldots) = b x^2$.

In other words, a smooth pinch.

If we take a chord between $(x, 0, \ldots)$ and $(0, x, \ldots)$, then $f = a x^2$ at both ends. The midpoint of the chord is $(x/2, x/2, \ldots)$, where $f = b x^2 / 4$.

So $f$ is not convex if $b x^2 / 4 > a x^2$, that is, if $b > 4a$.

So, a smooth minimum does not have to be convex.
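The chord test used here can be run numerically. As a concrete stand-in for the pinch (my own illustrative choice, not the construction described above), the smooth polynomial $f(x,y) = x^2 y^2$ has a global minimum at the origin, vanishes on both axes, and is positive on the diagonal, so the midpoint of a chord between $(t, 0)$ and $(0, t)$ lies strictly below the graph:

```python
# Chord test for f(x, y) = x^2 * y^2, which is smooth, minimized at the
# origin, zero on both coordinate axes, and positive off the axes.
def f(x, y):
    return x**2 * y**2

t = 0.1
end1, end2 = f(t, 0.0), f(0.0, t)       # both 0
midpoint_value = f(t / 2.0, t / 2.0)    # t^4 / 16 > 0
chord_value = 0.5 * (end1 + end2)       # 0
# The graph lies above the chord at the midpoint, so f is not convex
# on any neighborhood of its minimum.
print(midpoint_value > chord_value)  # True
```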

  • Welcome to MSE. Please use [MathJax](https://math.meta.stackexchange.com/questions/5020/mathjax-basic-tutorial-and-quick-reference). – 2018-01-14