
My calculus knowledge is pretty limited, but unfortunately I need to solve a problem of the following kind:

I'm given a function $f(x,y)$ from $\mathbb{R}^2$ to $\mathbb{R}$ and I want to know where it attains its minimum value over $\mathbb{R}\times(a,b)$.

Put differently, I want to find an $x \in \mathbb{R}$ and a $y\in(a,b)$ such that $f(x,y) \leq f(x',y')$ for all $x' \in \mathbb{R}$ and all $y' \in (a,b)$.

I'll have to take the partial derivative of $f$ w.r.t. $x$, but I don't understand how $y$ will come into play.


2 Answers


$f(x,y)$ has a critical point at $(x,y)$ if the gradient $\left(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}\right)$ is the zero vector at that point.
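For a quick numerical sanity check of this condition, you can approximate the gradient by central differences and verify it vanishes at a candidate point. A minimal sketch (the function $f$ and the point $(0, 1)$ are made-up examples, not from the question):

```python
# Numerical check that the gradient vanishes at a candidate critical point.
# f is an assumed example; (0, 1) is its known minimizer.
def f(x, y):
    return x**2 + (y - 1)**2

def grad(f, x, y, h=1e-6):
    """Central-difference approximation of (df/dx, df/dy)."""
    gx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    gy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return gx, gy

gx, gy = grad(f, 0.0, 1.0)
print(abs(gx) < 1e-8, abs(gy) < 1e-8)  # True True
```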

So the procedure you'll want to follow is the one below. I assume that $f$ does in fact attain its minimum on $\mathbb{R} \times [a,b]$; if that's not obvious for your particular $f$, it's something you'll need to check.

  1. Find the points where the gradient of $f$ vanishes. Throw out critical points with $y$ not in $(a,b)$.
  2. Evaluate $f$ at these points to find the minimum on the interior of your region. (If there are no critical points in the region, skip this step.)
  3. Find the minimum of the one-dimensional functions $f(x,a)$ and $f(x,b)$ over $x \in \mathbb{R}$. This gives you the minima on the boundary of your region.
  4. Compare the values of $f$ at all the candidate points from steps 2 and 3. Whichever gives the smallest value of $f$ is the global minimum.
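The steps above can be sketched in code. In this made-up example (the function and the interval are assumptions), $f(x,y) = x^2 + (y-2)^2$ on $\mathbb{R}\times(0,1)$: the only critical point is $(0,2)$, whose $y$ lies outside the strip, so the minimum must occur on a boundary line:

```python
# Sketch of steps 1-4 for an assumed example: f(x, y) = x**2 + (y - 2)**2
# on R x (0, 1). Its lone critical point (0, 2) has y outside (0, 1),
# so step 2 yields no interior candidates and only the boundary remains.

def f(x, y):
    return x**2 + (y - 2)**2

def minimize_1d(g, lo=-10.0, hi=10.0, iters=200):
    """Golden-section search for the minimum of a unimodal g on [lo, hi]."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if g(c) < g(d):
            b = d
        else:
            a = c
    xm = (a + b) / 2
    return xm, g(xm)

a, b = 0.0, 1.0                        # the strip R x (a, b)
# Step 3: minimize along each boundary line y = a and y = b.
candidates = []
for yb in (a, b):
    xm, val = minimize_1d(lambda x: f(x, yb))
    candidates.append(((xm, yb), val))
# Step 4: pick the candidate with the smallest value of f.
best_pt, best_val = min(candidates, key=lambda t: t[1])
print(best_pt, best_val)               # near (0.0, 1.0), value near 1.0
```

Note that in this example the minimum lies on the line $y = b$, which is why the answer assumes $f$ attains its minimum on the closed strip $\mathbb{R}\times[a,b]$; over the open interval $(a,b)$ the infimum might not be attained.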
---

If you are given an analytic expression for $f$, then setting the partial derivatives to zero is a deterministic approach:

$\frac{\partial f}{\partial x}= 0$

$\frac{\partial f}{\partial y}= 0$

and, at each solution, the second-derivative test for a local minimum:

$\frac{\partial^2 f}{\partial x^2} > 0$

$\frac{\partial^2 f}{\partial x^2}\,\frac{\partial^2 f}{\partial y^2} - \left(\frac{\partial^2 f}{\partial x\,\partial y}\right)^2 > 0$

(Requiring only $\frac{\partial^2 f}{\partial x^2} > 0$ and $\frac{\partial^2 f}{\partial y^2} > 0$ is not sufficient in two variables; the mixed partial enters through the determinant of the Hessian.)
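These second-derivative conditions can be checked numerically via finite differences; note that the mixed partial also enters the test. A pure-Python sketch (the example $f$ is an assumption):

```python
# Second-derivative test by finite differences (example f is an assumption).
def f(x, y):
    return x**2 + x*y + y**2   # Hessian [[2, 1], [1, 2]], det 3 > 0

def hess(f, x, y, h=1e-4):
    """Central-difference approximations of f_xx, f_yy, and f_xy."""
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return fxx, fyy, fxy

fxx, fyy, fxy = hess(f, 0.0, 0.0)
# f_xx > 0 and det(Hessian) > 0 together certify a local minimum at (0, 0).
print(fxx > 0 and fxx * fyy - fxy**2 > 0)  # True
```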

If you are given a "black box" function, then stochastic or Monte Carlo methods are the way to go. If you want to go further, the book "Stochastic Adaptive Search for Global Optimization" (2003) is a good guide.
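As a flavor of what such methods look like, here is a minimal random-search sketch; the black-box $f$, the sampling box for the unbounded $x$-direction, and the sample count are all assumptions for illustration:

```python
# Minimal random-search sketch for a black-box f (f and bounds are assumptions).
import random

def f(x, y):                      # stand-in for the black-box function
    return (x - 3)**2 + (y - 0.5)**2

a, b = 0.0, 1.0                   # the strip R x (a, b)
random.seed(0)
best, best_val = None, float('inf')
for _ in range(20000):
    x = random.uniform(-10, 10)   # practical box for the unbounded x-direction
    y = random.uniform(a, b)
    v = f(x, y)
    if v < best_val:
        best, best_val = (x, y), v
print(best, best_val)             # near (3, 0.5), value near 0
```

Plain random search converges slowly; adaptive variants (simulated annealing, cross-entropy, etc.) covered in the book refine the sampling distribution as better points are found.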