
Let $f:\mathbb{R}^2\to\mathbb{R}$ be a $C^2$ function and let $(x_0,y_0)\in\mathbb{R}^2$. Show for any $\epsilon>0$ there is a $\delta>0$ such that if $0<|h|<\delta$ and $0<|k|<\delta$, then

\begin{equation*} \left|\frac{f(x_0+h,y_0+k)-f(x_0+h,y_0)-f(x_0,y_0+k)+f(x_0,y_0)}{hk}-\frac{\partial^2f}{\partial x\partial y}(x_0,y_0)\right|<\epsilon \end{equation*}

I have been stuck on the above problem for some time now. I have tried breaking the problem up into one-dimensional difference quotients by considering the function $f$ as a single-variable function when held fixed in either the $x$ or $y$ variable, but then habit keeps bringing me back to try using the Mean-Value Theorem, and I'm not making much progress on that.

If anyone is willing to give a terse proof or proof sketch for me to fill in the blanks, it'd be appreciated.

  • WLOG $(x_0,y_0)=(0,0)$. Looking at the 2nd order Taylor approximation, $f(h,k)-f(h,0)-f(0,k)+f(0,0) = f_{xy}(0,0)hk + R(h,k)$, and this amounts to showing $\lim\limits_{(h,k)\to (0,0)}\dfrac{R(h,k)}{hk}=0$. For what that's worth; it just rephrases the problem, but then allows you to see references on Taylor's theorem for details. You can see that summarized, with reference to Königsberger, Analysis 2, p. 64 ff., on Wikipedia here: https://en.wikipedia.org/wiki/Taylor's_theorem#Taylor.27s_theorem_for_multivariate_functions (2017-02-14)
  • Thanks, I will try to construct a proof by reverse-engineering the 2nd order Taylor approximation. Since that section is a bit further on in the text, it'll probably take a tiny bit of work to come back to how it was intended to be proven, but it should be a good place to start. Cheers! (2017-02-14)

1 Answer


It may be cheating to answer my own question after such a long time, but hopefully this answer can help someone in the future; it doesn't use machinery as strong as what was suggested to me two months ago.

Let $\Delta=f(x_0+h,y_0+k)-f(x_0+h,y_0)-f(x_0,y_0+k)+f(x_0,y_0)$, and for fixed $x_0$, $y_0$, $h$, and $k$, define the single-variable functions \begin{align*} \beta_{y_0}(x) &= \frac{\partial f}{\partial y}(x,y_0)\\ \psi(y) &= f(x_0+h,y)-f(x_0,y) \end{align*}

Assume $h,k>0$ (the other sign cases are handled analogously). Then applying the Mean Value Theorem twice, there exist $c\in(y_0,y_0+k)$ and $d\in(x_0,x_0+h)$ such that

\begin{align*} \Delta &= \psi(y_0+k)-\psi(y_0)\\ &=\psi^{\prime}(c)(y_0+k-y_0)\\ &=\left(\beta_c(x_0+h)-\beta_c(x_0)\right)k\\ &=\beta_c^{\prime}(d)(x_0+h-x_0)\,k\\ &=\frac{\partial^2 f}{\partial x \partial y}(d,c)\ hk \end{align*}

So \begin{equation} \frac{\Delta}{hk}-\frac{\partial^2 f}{\partial x \partial y}(d,c)=0 \end{equation}

To finish rigorously: since $f$ is $C^2$, $\frac{\partial^2 f}{\partial x \partial y}$ is continuous at $(x_0,y_0)$, so given $\epsilon>0$ there is a $\delta>0$ such that $\left|\frac{\partial^2 f}{\partial x \partial y}(u,v)-\frac{\partial^2 f}{\partial x \partial y}(x_0,y_0)\right|<\epsilon$ whenever $|u-x_0|<\delta$ and $|v-y_0|<\delta$. Since $d$ lies between $x_0$ and $x_0+h$ and $c$ lies between $y_0$ and $y_0+k$, the conditions $0<|h|<\delta$ and $0<|k|<\delta$ force $(d,c)$ into this neighborhood, and the claimed inequality follows.
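As a quick numerical sanity check of the statement (not part of the proof), one can evaluate the second difference quotient for a concrete $C^2$ function and watch it converge to the mixed partial. Here I use $f(x,y)=\sin(x)e^y$, for which $\frac{\partial^2 f}{\partial x\partial y}(x,y)=\cos(x)e^y$; the function names below are just illustrative choices.

```python
import math

def f(x, y):
    # sample C^2 function; its mixed partial f_xy(x, y) equals cos(x) * e^y
    return math.sin(x) * math.exp(y)

def mixed_quotient(f, x0, y0, h, k):
    # the second difference quotient from the problem statement
    return (f(x0 + h, y0 + k) - f(x0 + h, y0)
            - f(x0, y0 + k) + f(x0, y0)) / (h * k)

x0, y0 = 0.5, -0.3
exact = math.cos(x0) * math.exp(y0)  # f_xy(x0, y0)
for h in (1e-1, 1e-2, 1e-3):
    approx = mixed_quotient(f, x0, y0, h, h)
    print(f"h = k = {h:g}: quotient = {approx:.6f}, error = {abs(approx - exact):.2e}")
```

The error shrinks roughly linearly in $\max(|h|,|k|)$, consistent with the fact that the quotient equals $\frac{\partial^2 f}{\partial x\partial y}$ evaluated at an intermediate point $(d,c)$ within distance $|h|,|k|$ of $(x_0,y_0)$.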