
As we learned in calculus, if a function $f:\mathbb{R} \rightarrow \mathbb{R}$ is twice differentiable and has a local maximum at a point $x_0$, then $f'(x_0) = 0$ and $f''(x_0) \leq 0$; moreover, this is not hard to show.

What I have trouble with is the higher dimensional analogue:

Let $f:\mathbb{R}^n \rightarrow \mathbb{R}$ be twice differentiable with a local maximum at $x_0\in \mathbb{R}^n$. Then $Df(x_0) = 0$ and $D^2f(x_0) \leq 0$ (i.e. the symmetric matrix $-D^2f(x_0)$ is positive semidefinite).

1 Answer


You can deduce this from Taylor's formula in the multivariable case. What follows is taken from my lecture notes. The link for them isn't working at the moment; I shall add it once it is back up.

Assume that $U \subseteq \mathbb{R}^n$ is open and star-shaped at $0$, that $0, x \in U$ with $x \neq 0$, and that we wish to find a 'Taylor expansion' of $f$ at $0$. Consider the one-variable function $g(t) = f(tx)$. We first observe that this is defined for $t \in (-\epsilon, 1+\epsilon)$ for some sufficiently small $\epsilon >0$. Indeed, since $U$ is open and $0 \in U$, there exists an $\epsilon_{1}>0$ such that $B(0,\epsilon_{1}) \subset U$. Hence $tx \in B(0,\epsilon_{1})$ provided that $||tx|| = |t|\cdot ||x|| < \epsilon_{1}$, that is, when $|t| < \frac{\epsilon_{1}}{||x||}$. A similar consideration shows that $x+tx= (1+t)x \in B(x,\epsilon_{2}) \subset U$ if $|t| <\frac{\epsilon_{2}}{||x||}$. Since $U$ is star-shaped at $0$, we also have $tx \in U$ for all $t \in [0,1]$. If we choose $\epsilon = \min \{\epsilon_{1}/||x||, \epsilon_{2}/||x||\}$, then $tx \in U$ for all $t \in (-\epsilon,1+\epsilon)$. Let us now show that $g$ is differentiable on this interval and compute its derivative.

\begin{align*} g'(t) = \lim_{h \to 0} \frac{g(t+h)-g(t)}{h} &= \lim_{h \to 0} \frac{f((t+h)x)-f(tx)}{h} \\ &=\lim_{h \to 0} \frac{f(tx+hx)-f(tx)}{h} \\ &=D_{x}f(tx) \equiv Df(tx)(x) \\ &= \sum\limits_{i=1}^{n}\frac{\partial{f}}{\partial{x_i}}(tx)x_{i} \end{align*}

In particular, $g'(0)= \sum\limits_{i=1}^{n} \frac{\partial{f}}{\partial{x_i}} (0)x_{i}$. Similarly, one finds that $g''(t) = \sum\limits_{i,j=1}^{n} \frac{\partial^{2}{f}}{\partial{x_j}\partial{x_i}} (tx)x_{j}x_{i}$.
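The chain-rule identity $g'(t)= \sum_{i} \frac{\partial f}{\partial x_i}(tx)\,x_i$ can be illustrated numerically. The following is a sketch of my own (the function $f$ and the direction $x$ below are arbitrary choices, not from the notes): both sides are approximated by central differences and compared.

```python
import math

# Numerical illustration (my own sketch) of g'(t) = sum_i (df/dx_i)(t x) x_i,
# for a hypothetical smooth f on R^2 and a fixed direction x.
def f(v):
    return math.sin(v[0]) * math.exp(v[1])

x = [0.7, -0.3]
t = 0.5
h = 1e-6

# Left side: derivative of g(t) = f(t x) by central differences.
def g(t):
    return f([t * xi for xi in x])

lhs = (g(t + h) - g(t - h)) / (2 * h)

# Right side: sum_i (df/dx_i)(t x) * x_i, partials also by central differences.
def partial(f, v, i):
    vp, vm = list(v), list(v)
    vp[i] += h
    vm[i] -= h
    return (f(vp) - f(vm)) / (2 * h)

p = [t * xi for xi in x]
rhs = sum(partial(f, p, i) * x[i] for i in range(len(x)))
print(lhs, rhs)  # the two values agree closely
```

Any other smooth $f$ and direction $x$ would serve equally well; the agreement is limited only by the finite-difference error.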

Now we can apply Taylor's theorem of one-variable calculus to $g$. We get $g(t)=g(0)+g'(0)t + \frac{1}{2}g''(0)t^{2} + R$, where the remainder $R$ is such that $\lim_{t \to 0} R/t^2=0$. Taking $t=1$ in the above equation, we get Taylor's formula for $f$: $ f(x)=f(0) + \sum\limits_{i=1}^{n}\frac{\partial{f}}{\partial{x_i}}(0)x_{i} + \frac{1}{2}\sum\limits_{i,j=1}^{n} \frac{\partial^{2}{f}}{\partial{x_j}\partial{x_i}} (0)x_{j}x_{i} +R \qquad (\text{I})$
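The defining property of the remainder, $R/t^2 \to 0$, can be checked numerically. Here is a small sketch of my own (the function $g$ below is an arbitrary smooth example; $g'(0)$ and $g''(0)$ are approximated by central differences):

```python
import math

# Sketch (my own illustration): check that the remainder
# R(t) = g(t) - g(0) - g'(0) t - (1/2) g''(0) t^2 satisfies R(t)/t^2 -> 0,
# for a hypothetical smooth one-variable function g.
def g(t):
    return math.cos(t) * math.exp(t)

h = 1e-5
g0 = g(0.0)
g1 = (g(h) - g(-h)) / (2 * h)            # g'(0) by central differences
g2 = (g(h) - 2 * g0 + g(-h)) / (h * h)   # g''(0) by central differences

ratios = []
for t in [0.1, 0.01, 0.001]:
    R = g(t) - (g0 + g1 * t + 0.5 * g2 * t * t)
    ratios.append(R / (t * t))
    print(t, R / (t * t))   # the ratio shrinks as t -> 0
```

Each tenfold decrease in $t$ shrinks $|R/t^2|$ by roughly a factor of ten, consistent with $R = O(t^3)$ for a three-times-differentiable $g$.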

From (I) it is easy to deduce the necessary condition (in terms of second-order partial derivatives) for a local maximum/minimum of $f$. If we assume, for instance, that $x=0$ is a point of local maximum, then $t=0$ is a point of local maximum for $g$. Hence $g''(0) \leq 0$. Coming back to $f$, this implies that $ \sum\limits_{i,j=1}^{n} \frac{\partial^{2}{f}}{\partial{x_j}\partial{x_i}} (0)x_{j}x_{i} \leq 0$ for all choices of $x$ in a neighborhood of $0$, and hence, since the left-hand side is homogeneous of degree $2$ in $x$, for all $x \in \mathbb{R}^n$. This is the same as saying that the matrix $$ D^{2}f(0) = \begin{pmatrix} \frac{\partial^{2}{f}}{\partial{x_{1}^{2}}}(0) & \frac{\partial^{2}{f}}{\partial{x_1}\partial{x_2}}(0) & \cdots & \frac{\partial^{2}{f}}{\partial{x_1}\partial{x_n}}(0) \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial^{2}f}{\partial{x_n}\partial{x_1}}(0) & \frac{\partial^{2}{f}}{\partial{x_n}\partial{x_2}}(0) & \cdots & \frac{\partial^{2}{f}}{\partial{x_{n}^{2}}}(0) \end{pmatrix}$$

is negative semi-definite.
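As a numerical sanity check of the whole statement (a sketch of my own; the function $f$ below is an arbitrary example, not from the notes): for $f(x,y) = -(x^2 + 2y^2)$, which has a local maximum at the origin, finite differences should recover $Df(0) \approx 0$ and a negative semidefinite Hessian.

```python
# Sketch of a numerical sanity check (not from the original notes):
# f(x, y) = -(x^2 + 2 y^2) has a local maximum at the origin, so we expect
# Df(0) ~ 0 and a negative semidefinite Hessian D^2 f(0).

def f(v):
    x, y = v
    return -(x**2 + 2 * y**2)

h = 1e-5  # finite-difference step

def grad(f, v):
    """Central-difference approximation of Df(v)."""
    out = []
    for i in range(len(v)):
        vp, vm = list(v), list(v)
        vp[i] += h
        vm[i] -= h
        out.append((f(vp) - f(vm)) / (2 * h))
    return out

def hessian(f, v):
    """Central-difference approximation of D^2 f(v)."""
    n = len(v)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            vpp, vpm, vmp, vmm = list(v), list(v), list(v), list(v)
            vpp[i] += h; vpp[j] += h
            vpm[i] += h; vpm[j] -= h
            vmp[i] -= h; vmp[j] += h
            vmm[i] -= h; vmm[j] -= h
            H[i][j] = (f(vpp) - f(vpm) - f(vmp) + f(vmm)) / (4 * h * h)
    return H

grad0 = grad(f, [0.0, 0.0])
H0 = hessian(f, [0.0, 0.0])
print(grad0)  # ~ [0.0, 0.0]
print(H0)     # ~ [[-2, 0], [0, -4]]

# The quadratic form x^T H0 x should be <= 0 in every direction x:
for x in [[1, 0], [0, 1], [1, 1], [2, -3]]:
    q = sum(H0[i][j] * x[i] * x[j] for i in range(2) for j in range(2))
    print(x, q)
```

Since this $f$ is a quadratic, the finite-difference Hessian is essentially exact here; for a general $f$, the step $h$ trades truncation error against rounding error.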

  • You used an $\epsilon$-$\delta$ argument above to show that $f(p+t(x-p))$ is defined. But since $U$ is star-shaped w.r.t. the point $p$, for any $x$ in $U$ the line segment $p +(x-p)t$ lies in $U$. So $f(p +(x-p)t)$ is defined for $0 \leq t \leq 1$; can't you just say this directly? (2011-07-03)