
My question is about finding the extrema of a multidimensional function $f:\mathbb{R}^n\rightarrow \mathbb{R}$. From lecture I know that

$H_f(x_0) < 0$ implies an isolated maximum,

$H_f(x_0) > 0$ implies an isolated minimum,

$H_f(x_0)$ indefinite implies a saddle point,

where $H_f(x_0)$ is the Hessian matrix at the point $x_0$. So how can I identify non-isolated maxima? What about the cases where $H_f$ is positive (or negative) semidefinite? What if $H_f = 0$?

I know of a specialized test for $\mathbb{R}^2$ which answers the above questions, but I wonder about $\mathbb{R}^n$. What rules can be applied there?
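For concreteness, the definiteness test from the question can be carried out numerically by inspecting the eigenvalues of the (symmetric) Hessian: all positive means positive definite, all negative means negative definite, mixed signs means indefinite, and a zero eigenvalue leaves the test inconclusive. A minimal sketch (the function name and tolerance are my own choices, not from the question):

```python
import numpy as np

def classify_critical_point(hessian, tol=1e-10):
    """Second-derivative test via eigenvalues of a symmetric Hessian.

    Returns 'isolated minimum', 'isolated maximum', 'saddle point',
    or 'inconclusive' (some eigenvalue is ~0, i.e. the Hessian is
    only semidefinite, so the test makes no statement).
    """
    eigvals = np.linalg.eigvalsh(hessian)  # real eigenvalues, ascending
    if np.all(eigvals > tol):
        return "isolated minimum"          # positive definite
    if np.all(eigvals < -tol):
        return "isolated maximum"          # negative definite
    if np.any(eigvals > tol) and np.any(eigvals < -tol):
        return "saddle point"              # indefinite
    return "inconclusive"                  # semidefinite or zero

# f(x, y) = x^2 + y^2 at the origin: H = diag(2, 2)
print(classify_critical_point(np.diag([2.0, 2.0])))   # isolated minimum

# f(x, y) = x^2 - y^2 at the origin: H = diag(2, -2)
print(classify_critical_point(np.diag([2.0, -2.0])))  # saddle point

# f(x, y) = x^2 at the origin: H = diag(2, 0), only semidefinite
print(classify_critical_point(np.diag([2.0, 0.0])))   # inconclusive
```

The last case is exactly the semidefinite situation the question asks about: eigenvalue information alone cannot decide it.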

  • 0
    What does it mean for a matrix to be larger than a number? (2011-06-12)
  • 1
    Probably only ad-hoc reasoning. If your Hessian is semidefinite, it means that in some directions the second directional derivative is zero. Find the eigenvectors for those directions, then study your function along those directions as functions of one variable. (2011-06-12)
  • 3
    matrix > 0 means the matrix is positive definite, and < 0 means it is negative definite. (2011-06-12)
  • 0
    If $H_f = 0$, that criterion makes no statement. You have to study the function in that region "by hand". (2011-06-12)
  • 0
    Ohh, so it seems that there is really no general criterion. Do I really have to do this by hand? (2011-06-12)
  • 0
    @ftiaronsem: What you presented above IS the general criterion, and usually you have to do some calculations in order to reach the desired result. :) (2011-06-12)
  • 0
    @Beni Ok, thanks. I had just hoped that there would be something allowing me to study the remaining cases, like non-isolated maxima, etc. But it seems there is no such criterion, what a pity... :-( (2011-06-12)
  • 2
    @ftiaronsem: there is a criterion involving higher derivatives that handles some cases where $H_f$ is zero. It can be found in oldish calculus books (I can only think of Rey Pastor's *Análisis Matemático* right now, which is in Spanish...). There is no 100% effective criterion that only involves derivatives (of any order) at a point: one can construct examples of functions all of whose derivatives vanish at a point and which have there a maximum, or a minimum, or whatever you like. (2011-06-16)
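The ad-hoc procedure suggested in the comments (restrict the function to the null directions of the Hessian and study it as a function of one variable) can be sketched as follows; the example function $f(x,y) = x^2 + y^4$ is my own illustration, chosen because its Hessian at the origin, $\operatorname{diag}(2, 0)$, is only positive semidefinite:

```python
import numpy as np

# f(x, y) = x^2 + y^4 has a critical point at the origin where the
# Hessian diag(2, 0) is positive semidefinite but not definite.
def f(p):
    x, y = p
    return x**2 + y**4

H = np.diag([2.0, 0.0])               # Hessian of f at the origin
eigvals, eigvecs = np.linalg.eigh(H)  # eigenvalues in ascending order

# Direction belonging to the zero eigenvalue: the test is blind here.
v = eigvecs[:, 0]                     # (0, 1) up to sign

# Study the one-variable restriction g(t) = f(t * v) "by hand":
t = np.linspace(-1e-2, 1e-2, 201)
g = np.array([f(s * v) for s in t])
print(np.all(g >= f([0.0, 0.0])))     # True: a minimum along this direction too
```

Here the restriction is $g(t) = t^4$, which has a minimum at $t = 0$, so the origin is in fact a minimum; a numerical check like this is only a heuristic, of course, not a proof.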

1 Answer


Assume that $f$ is of class $C^2$ in a ball $B_r(x_0)$, and that 1) $\nabla f(x_0) = 0$ and 2) $H_f(x) \geq 0$ for every $x \in B_r(x_0)$. Then $x_0$ is a local minimum. Indeed, the second assumption implies that $f$ is convex in $B_r(x_0)$, so condition 1) is sufficient for $x_0$ to be a minimum point. (A similar statement holds if the Hessian matrix is negative semidefinite in a neighborhood of $x_0$.)
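A one-variable illustration of the difference between the pointwise test and this neighborhood criterion (my own example, not from the answer):

```latex
% Pointwise test at x_0 = 0 is inconclusive:
f(x) = x^4, \qquad f'(0) = 0, \qquad f''(0) = 0.
% But on any ball B_r(0) the second derivative is nonnegative,
f''(x) = 12x^2 \ge 0 \quad \text{for all } x \in B_r(0),
% so f is convex there, and the tangent line at 0 lies below the graph:
f(x) \ge f(0) + f'(0)\,x = f(0),
% hence 0 is a (non-strict-test, but genuine) local minimum.
```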

  • 0
    $H_f(x) > 0$ for every $x$ near $x_0$ shows a local minimum, but $\geq 0$ does not. Not even in one dimension. (2011-06-16)
  • 0
    I could hardly find such a counterexample... (2011-06-16)
  • 0
    You are right, a whole neighborhood will do; it is just possibly not a strict local minimum. If a function is convex, then the tangent plane at a point lies below (or on) the graph. (2011-06-16)