
Let $A\subseteq \mathbb{R}^n$ be the region defined by

$A = \Big\{ (x_1,\ldots,x_n)\in\mathbb{R}^n \mid 0 \le x_i < 1\text{ for all $i$, and } \sum x_i < 1 \Big\}.$

Let $\mathbf{u} =(u_1,\ldots,u_n)$ be an arbitrary unit vector with $u_i>0$ for all $i$.

Let $f: A\to \mathbb{R}$ be a multivariate function, and define $f_{\mathbf{u}}(t) = f(t\mathbf{u})$. This is simply the restriction of $f$ to the ray from the origin in the direction of the unit vector $\mathbf{u}$.

As usual, denote by $\mathbf{e}_i$, for $i=1,\ldots,n$, the standard basis vectors, e.g. $\mathbf{e}_1 = (1,0,\ldots,0)$.

We assume that

  1. $f_{\mathbf{u}}'(0) = u_1f_{\mathbf{e}_1}'(0) + \cdots + u_nf_{\mathbf{e}_n}'(0)$, i.e. the derivative of $f$ at $0$ in any direction $\mathbf{u}$ is the corresponding linear combination of the derivatives in the basis directions.

  2. $f_{\mathbf{u}}''(t) < 0$ for all $t$ and every direction $\mathbf{u}$, i.e. the derivative $f_{\mathbf{u}}'(t)$ is strictly decreasing. In other words: the slope of $f_{\mathbf{u}}(t)$ decreases as $t$ increases.

  3. $f_{\mathbf{u}}(t) \to -\infty$ as $t\to 1/\sum_i u_i$, for every direction $\mathbf{u}$. In other words: along any such ray, $f$ approaches minus infinity as the point approaches the boundary face $\sum_i x_i = 1$ of the region.

My questions are these: Can we say something interesting about $f$ in general? In particular, can we say something about the number of local maxima it has? Can we prove that it has at most $n-1$ local maxima? For $n=2$, it seems intuitively clear that it can have at most one local maximum, and I was wondering whether this generalizes to $n$ dimensions.
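
For concreteness (this is my own illustration, not part of the original question), one function satisfying all three assumptions is
$$f(x_1,\ldots,x_n)=\log\Big(1-\sum_i x_i\Big),$$
since then $f_{\mathbf u}(t)=\log\big(1-t\sum_i u_i\big)$, so $f_{\mathbf u}'(0)=-\sum_i u_i=\sum_i u_i\,f_{\mathbf e_i}'(0)$, $f_{\mathbf u}''(t)<0$, and $f_{\mathbf u}(t)\to-\infty$ as $t\to 1/\sum_i u_i$. This particular example has a single local maximum, attained at the origin.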

  • @meh Of course, assumption 1 is unnecessary if the function is smooth. To answer the original question: the function can have arbitrarily many local maxima. I'll try to write down an example below. – 2012-06-18

1 Answer


Consider $n=2$. Pick an integer $m$ and define $f(x_1,x_2)=\sin(m\theta)+g(x_1+x_2)$, where $\theta\in [0,\pi/2]$ is the polar angle of the point $(x_1,x_2)$ and $g : [0,1)\to\mathbb R$ is strictly concave with $g(0)=0$, $g(1/2)=1$ (the maximum of $g$), and $g(t)\to-\infty$ as $t\to 1$. I will not try to cook up $g$ explicitly, but it can be done without much work. Note that $f$ satisfies all assumptions of the problem: along any ray from the origin $\theta$ is constant, so $f_{\mathbf u}(t)=\sin(m\theta_{\mathbf u})+g\big(t(u_1+u_2)\big)$ inherits strict concavity and the boundary blow-up from $g$, and $f_{\mathbf u}'(0)=(u_1+u_2)\,g'(0)=u_1 f_{\mathbf e_1}'(0)+u_2 f_{\mathbf e_2}'(0)$, so assumption 1 holds as well.
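
For what it is worth, one explicit choice of $g$ (my own suggestion, not part of the original answer) is
$$g(t)=\frac{2t+\log(1-t)}{1-\log 2},$$
which satisfies $g(0)=0$, $g'(1/2)=0$ with $g(1/2)=1$, $g''(t)=-\frac{1}{(1-\log 2)(1-t)^2}<0$, and $g(t)\to-\infty$ as $t\to 1^-$. It even satisfies $g''\le -1$, which is convenient for the smooth construction further down.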

The absolute maximum of $f$ equals $2$ and is attained only at the points where $x_1+x_2=1/2$ and $\theta \in \{\pi/(2m)+2\pi k/m: k\in\mathbb Z\}$, and each of these points is a local maximum of $f$. The larger $m$ is, the more such points we have.
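
As a quick numerical sanity check (my own sketch, not part of the original answer), the snippet below counts strict grid-local maxima of this first construction on the open triangle for a few values of $m$, using the hypothetical $g$ suggested above.

```python
# Sketch of a numerical check (assumes the explicit g suggested above):
# count the strict grid-local maxima of f(x1, x2) = sin(m*theta) + g(x1 + x2)
# on the open triangle {x1 >= 0, x2 >= 0, x1 + x2 < 1} for a few values of m.
import numpy as np

def g(s):
    # strictly concave, g(0) = 0, maximum g(1/2) = 1, g(s) -> -inf as s -> 1
    return (2 * s + np.log(1 - s)) / (1 - np.log(2))

def count_local_maxima(m, n=400):
    h = 1.0 / n
    x = np.linspace(h, 1 - h, n - 1)            # stay off the coordinate axes
    X1, X2 = np.meshgrid(x, x, indexing="ij")
    S = X1 + X2
    F = np.full_like(S, -np.inf)
    inside = S < 1 - h                          # stay off the hypotenuse
    theta = np.arctan2(X2[inside], X1[inside])  # polar angle in (0, pi/2)
    F[inside] = np.sin(m * theta) + g(S[inside])

    count = 0
    for i in range(1, F.shape[0] - 1):
        for j in range(1, F.shape[1] - 1):
            if not np.isfinite(F[i, j]):
                continue
            nbhd = F[i - 1:i + 2, j - 1:j + 2].copy()
            nbhd[1, 1] = -np.inf
            if F[i, j] > nbhd.max():            # strictly above all 8 neighbours
                count += 1
    return count

for m in (4, 8, 16):
    # roughly floor((m - 1) / 4) + 1 crests of sin(m*theta) fit in the quarter turn
    print(f"m = {m}: {count_local_maxima(m)} grid-local maxima")
```

The counts should match the number of crests of $\sin(m\theta)$ with $\theta\in(0,\pi/2)$, i.e. $1$, $2$ and $4$ for $m=4,8,16$, so the number of local maxima grows with $m$.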

$f$ can be made smooth near the origin with a slightly more complicated construction. We can make sure that $g''\le -1$ everywhere (all that matters is that $\sup g''<0$). Let $f(x_1,x_2)=\phi(x_1+x_2)\sin(m\theta)+g(x_1+x_2)$, where $\phi$ is a nondecreasing smooth function such that $|\phi''|\le 1/2$ everywhere, $\phi(t)=0$ when $t<1/4$, and $\phi(t)=c>0$ when $t\ge 1/2$; informally, a smoothed Heaviside step. These two bounds guarantee $\phi''(s)\sin(m\theta)+g''(s)\le \tfrac12-1<0$, so $f$ remains strictly concave along every ray. Now the global maximum of $f$ is $1+c$ and is attained at the same points as in the first version. Since $\phi$ vanishes for small $t$, there is no problem with $f$ near the origin.
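
If one wants an explicit $\phi$ (again my own suggestion, not part of the original answer), the standard smooth step works: let $h(t)=e^{-1/t}$ for $t>0$ and $h(t)=0$ for $t\le 0$, set $\psi(t)=\frac{h(t)}{h(t)+h(1-t)}$, and take $\phi(t)=c\,\psi(4t-1)$. Then $\phi$ is smooth and nondecreasing, $\phi(t)=0$ for $t\le 1/4$, $\phi(t)=c$ for $t\ge 1/2$, and $|\phi''(t)|=16c\,|\psi''(4t-1)|$ is bounded, so choosing $c$ small enough gives $|\phi''|\le 1/2$ everywhere.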

  • @meh This does not change the matter, see the edited answer. Intuition behind the counterexample: think of the graph of $z=g(x_1+x_2)$ as made of clay. You can create grooves in it by running a finger in the direction away from the origin. If you do it carefully, concavity will be preserved. The grooves will separate the set where $f$ attains its global maximum into many parts. – 2012-06-18