
I have a problem which I am unable to solve. Consider the problem $\min f(x)$ subject to $G(x) = b$,

where $f \in C^2(R^n)$, $G \colon R^n \to R^m$ is a $C^2$-function, $G = (g_1, \ldots, g_m)^t$ (transpose), and $b \in R^m$.

Let $x^{*}$ be a local minimum for the problem when $b = 0$, let $y^{*}$ be a corresponding Kuhn-Tucker multiplier, and suppose that the gradients $\nabla g_j(x^{*})$, $j = 1, \ldots, m$, are linearly independent. Suppose additionally that the second-order sufficient conditions hold at $x^{*}$. Then:

  • (a) What are the second order sufficient conditions?
  • (b) Prove that for all $b$ with $\|b\|$ sufficiently small, there exist $x(b)$, $y(b)$ which satisfy the necessary first order conditions for a local minimum of the problem. The mappings $b \mapsto x(b)$ and $b \mapsto y(b)$ are $C^1$ in a neighborhood of $0$.
  • (c) Prove that $\nabla_b\, f(x(b)) = -y(b)$ for the functions defined in (b).

I don't know how to solve this question, especially parts (b) and (c). Please help me!


1 Answer


Edit: I think that in item (c) you meant to state some version of the envelope theorem.

If so, it should be a consequence of the chain rule and the first-order conditions.
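A sketch of that computation, assuming (as the $-y(b)$ in part (c) suggests) the Lagrangian convention $L(x, y) = f(x) + y^{t}(G(x) - b)$: differentiating $G(x(b)) = b$ with respect to $b$ gives $DG(x(b))\,Dx(b) = I_m$, and the stationarity condition gives $\nabla f(x(b)) = -DG(x(b))^{t}\, y(b)$. Then, by the chain rule,

$$\nabla_b\, f(x(b)) \;=\; Dx(b)^{t}\,\nabla f(x(b)) \;=\; -\,Dx(b)^{t}\,DG(x(b))^{t}\,y(b) \;=\; -\,\big(DG(x(b))\,Dx(b)\big)^{t}\,y(b) \;=\; -\,y(b).$$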


I suppose you figured out item (a).
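For the record, one standard way of stating them, with the same Lagrangian convention $L(x, y) = f(x) + y^{t}(G(x) - b)$: the pair $(x^{*}, y^{*})$ satisfies the first-order conditions and the Hessian of the Lagrangian is positive definite on the tangent space of the constraints,

$$d^{t}\,\nabla^2_{xx} L(x^{*}, y^{*})\,d \;>\; 0 \qquad \text{for all } d \neq 0 \text{ with } \nabla g_j(x^{*})^{t} d = 0,\ j = 1, \ldots, m.$$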

For item (b), write down the system of equations that represents the first-order conditions for the problem. You should have a system with $n + m$ equations: a stationarity ("derivative equals zero") condition for each of the $n$ variables $x_i$, and the feasibility condition $g_j(x) = b_j$ for each of the $m$ constraints. You also have $n + m$ unknowns: the original variables plus one Lagrange multiplier $y_j$ per constraint. Now look at the implicit function theorem. It gives you conditions under which you can write the solutions of a system of equations as a function of a parameter; in your case, you want to write $(x, y)$ as a function of $b$. The key hypothesis is that the Jacobian of the system with respect to $(x, y)$ is nonsingular at $(x^{*}, y^{*}, 0)$, and you should get that straight from the second-order condition you came up with in (a) together with the linear independence of the constraint gradients. You also need the functions in the system to be $C^{1}$, but that is no problem because $f$ and $G$ are of class $C^{2}$.
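Concretely (still with $L(x, y) = f(x) + y^{t}(G(x) - b)$), the system is $F(x, y, b) = 0$ with

$$F(x, y, b) \;=\; \begin{pmatrix} \nabla f(x) + DG(x)^{t} y \\ G(x) - b \end{pmatrix},
\qquad
D_{(x,y)} F \;=\; \begin{pmatrix} \nabla^2_{xx} L(x, y) & DG(x)^{t} \\ DG(x) & 0 \end{pmatrix},$$

and the implicit function theorem applies at $(x^{*}, y^{*}, 0)$ as soon as this $(n+m) \times (n+m)$ matrix is nonsingular, which is exactly what the second-order sufficient condition together with the linear independence of the $\nabla g_j(x^{*})$ guarantees.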

I honestly did not understand what you meant in item (c). Were you trying to say that the gradient of $f$ evaluated at $x(b)$ is a linear combination of the gradients of $g_{1}, \ldots, g_{m}$, also evaluated at $x(b)$, with weights $y_{1}(b), \ldots, y_{m}(b)$?
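In symbols (up to the sign convention chosen for the multipliers), that statement is just the stationarity condition at $x(b)$:

$$\nabla f(x(b)) \;=\; -\sum_{j=1}^{m} y_j(b)\,\nabla g_j(x(b)).$$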