Suppose I have a function $F(x,D) = \|y-Dx\|_2^2$ and define $x^{*}(D)= \displaystyle\arg\min_{x} F(x,D)$ (that is, for given $y$ and a fixed $D$), subject to a constraint $h(x) <\epsilon$, where $h(x)$ is a convex function. Now let $G(D) = \displaystyle \min_{x} F(x,D)$ subject to the same constraint. I need to find $\frac{d}{dD} G(D)$. How do I express it in terms of $\frac {d}{dD}{x^{*}(D)}$? It should have something to do with the Lagrangian of the optimization problem for $F(x,D)$.
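To make the question concrete, here is a sketch of the kind of identity I am after, written via the chain rule and the KKT conditions (this assumes a unique, differentiable minimizer $x^*(D)$; the final simplification is the envelope/Danskin-type argument):

```latex
\[
\frac{d}{dD}G(D)
  = \left.\frac{\partial F}{\partial D}\right|_{x=x^*(D)}
  + \left.\frac{\partial F}{\partial x}\right|_{x=x^*(D)}\,\frac{d x^*}{dD}.
\]
% At the minimizer, stationarity of the Lagrangian
% L(x,\lambda,D) = \|y-Dx\|_2^2 + \lambda\,(h(x)-\epsilon)
% gives \partial F/\partial x = -\lambda^*\,\nabla h(x^*), so the second
% term couples \frac{dx^*}{dD} to the constraint. When the constraint is
% inactive (\lambda^* = 0) that term drops out, leaving only
\[
\frac{d}{dD}G(D)
  = \left.\frac{\partial F}{\partial D}\right|_{x=x^*(D)}
  = -2\,\bigl(y - D x^{*}(D)\bigr)\,x^{*}(D)^{\top}.
\]
```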
Gradient/sub-differential of a minimum of a function
optimization
calculus-of-variations
Thanks for the remark. I have edited the question. – 2012-08-31