
$\displaystyle X= \left \{(x,y,z) : \frac{x^2}{2} + \frac{y^2}{3} + \frac{z^2}{6} \leq 1 \right \}$ is a compact set.

If $f(x,y,z)$ is continuous on $X$, then for any $\epsilon \gt 0$, there exists a polynomial $p(x,y,z)$ such that $|f - p|\lt \epsilon$ on $X$.

I need to prove this and I have no idea how.

  • 3
    Do you know the Weierstrass approximation theorem? (2011-05-10)
  • 0
    Do you need to exhibit such $p$ *explicitly*? This is a difficult problem in approximation theory (as far as I know). On the other hand, proving the mere existence of $p$ is easy if you have some tools: the Stone-Weierstrass theorem, for example. (2011-05-10)
  • 0
    The line up there got cut off; it should be $|f - p|$. (2011-05-10)
  • 0
    But what we went over in class was how to prove this with only the $x$ variable, and now I have to do it in three dimensions. I do know the Weierstrass theorem, but I'm not sure how to adapt it to three dimensions. (2011-05-10)
  • 0
    Can you expand a little bit on how exactly you did it for one variable? There are many proofs of that theorem. (2011-05-11)
  • 0
    It was a lot like the general proof at the bottom of this page: http://planetmath.org/encyclopedia/ProofOfWeierstrassApproximationTheorem.html (2011-05-11)
  • 1
    Okay, very good then. [This proof](http://planetmath.org/encyclopedia/ProofOfWeierstrassApproximationTheoremInRn.html) should thus be accessible for you. (2011-05-11)
  • 1
    @dissonance, the proof of the Weierstrass Approximation Theorem by Bernstein is quite explicit. See http://en.wikipedia.org/wiki/Bernstein_polynomial#Approximating_continuous_functions (2011-05-11)
  • 0
    @lhf: Oh yes, I had heard of that proof during a course in probability theory. However, I do not see an immediate generalization of that construction from the interval $[0, 1]$ to the set $X$ the OP refers to. Should we use vector-valued random variables, maybe...? This is the difficulty I referred to. (2011-05-11)
  • 0
    @dissonance, I haven't checked any details, but doesn't tensor products work? I mean, $B_n(x)B_n(y)B_n(z)$?2011-05-11
  • 0
    @lhf: I think that there is some more work to do. For a univariate $f\colon [0, 1] \to \mathbb{R}$, its $n$th Bernstein polynomial is $$B_n[f](x)=\sum_{k=0}^n {n \choose k} x^k (1-x)^{n-k}f\left(\frac{k}{n}\right).$$ How to extend this to a multivariate $F$? If it is defined on $[0, 1] \times [0,1] \times [0, 1]$, we can build a tensor product of Bernstein polynomials, maybe like this: $$B_{n+m+l}[F](x, y, z)=B_n[F(\cdot, y, z)](x)B_m[F(x, \cdot, z)](y)B_l[F(x, y, \cdot)](z).$$ Will this thing converge to $F$? Also, what to do if $F$ is not defined on a cube, but on something different? (A small numerical check of the univariate formula is sketched right after these comments.) (2011-05-11)
  • 0
    A reference on multivariate Bernstein polynomials: [Korovkin-type theorems and approximation by positive linear operators](http://www.math.technion.ac.il/sat/papers/13/13.pdf), p. 114, (23). The author talks about Bernstein polynomials on the hypercube and on the $n$-simplex. From this last construction, one could build an explicit approximation by means of Bernstein polynomials over any compact convex subset of $\mathbb{R}^n$. (2011-05-11)
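
Here is the numerical sketch referred to in the comments: a minimal check (in Python; the test function and degrees are arbitrary choices for illustration) of the univariate Bernstein construction $B_n[f](x)=\sum_{k=0}^n\binom nk x^k(1-x)^{n-k}f(k/n)$.

```python
# Minimal sketch: univariate Bernstein approximation on [0, 1].
# B_n[f](x) = sum_{k=0}^n C(n, k) x^k (1-x)^(n-k) f(k/n)
from math import comb

def bernstein(f, n, x):
    """Evaluate the degree-n Bernstein polynomial of f at x in [0, 1]."""
    return sum(comb(n, k) * x**k * (1 - x)**(n - k) * f(k / n)
               for k in range(n + 1))

if __name__ == "__main__":
    f = lambda t: abs(t - 0.5)                    # continuous but not smooth
    grid = [i / 200 for i in range(201)]          # sample points of [0, 1]
    for n in (10, 50, 250):
        err = max(abs(bernstein(f, n, x) - f(x)) for x in grid)
        print(f"n = {n:4d}   sup-norm error on the grid = {err:.4f}")
```

The errors decrease as $n$ grows, as uniform convergence requires, though (as is typical for Bernstein polynomials) rather slowly.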

1 Answer


Here is a way to exhibit the polynomials.

**First step: on the unit cube.** As already said by others, if $f$ were defined and continuous on the cube $K=[0,1]^3$, a straightforward modification of Bernstein's construction would do. Namely, one could approximate $f$ at $u=(x,y,z)$ in $K$ by $$ E\left(f\left(\frac{X^u_1+\cdots+X^u_n}n\right)\right), $$ for an i.i.d. sequence $(X^u_n)$ of random variables with mean $E(X_n^u)=u$ and support in $K$. For example, using Bernoulli random variables, the sequence of polynomials $(B_n(f))$ would converge uniformly to $f$ on $K$, with $$ B_n(f)(x,y,z)=\sum_{0\le i,j,k\le n}b^n_{i,j,k}(x,y,z)f\left(\frac{i}n,\frac{j}n,\frac{k}n\right), $$ where $B_n(f)$ is based on the elementary polynomials $b^n_{i,j,k}$, defined as $$ b^n_{i,j,k}(x,y,z)={n\choose i}x^i(1-x)^{n-i}{n\choose j}y^j(1-y)^{n-j}{n\choose k}z^k(1-z)^{n-k}. $$

**Second step: on another cube.** Likewise, to approximate a continuous function $f$ defined on $K_3=[-3,3]^3$, at every point $(x,y,z)$ in $K_3$ one could use $$ B^{(3)}_n(f)(x,y,z)=\sum_{0\le i,j,k\le n}b^n_{i,j,k}\left(\frac{x+3}6,\frac{y+3}6,\frac{z+3}6\right)f\left(\frac{6i-3n}n,\frac{6j-3n}n,\frac{6k-3n}n\right). $$

**Third step: on subsets of a cube.** For any $Y\subset K_3$ and any continuous function $f$ on $Y$, an obvious idea is to use the polynomial $B^{(3)}_n(g)$, where $g:K_3\to\mathbb R$ is defined by $g(x,y,z)=f(x,y,z)$ on $Y$ and $g(x,y,z)=0$ on $K_3\setminus Y$. The function $g$ is not continuous everywhere, but $g$ is bounded and this is enough to ensure that $B^{(3)}_n(g)(x,y,z)$ converges to $g(x,y,z)$ at every point $(x,y,z)$ where $g$ is continuous. In particular, $B^{(3)}_n(g)(x,y,z)$ converges to $f(x,y,z)$ at every point $(x,y,z)$ in the interior of $Y$. One can furthermore show that the convergence is uniform on every compact set included in this interior.
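
As a rough numerical sketch of these three steps (an illustration only; the function $f$, the set $Y=X$ and the evaluation point below are arbitrary choices), one can implement $B^{(3)}_n(g)$ directly and watch it converge at an interior point of $Y$:

```python
# Sketch of steps 1-3: tensor-product Bernstein polynomial B_n^{(3)} on K_3 = [-3, 3]^3,
# applied to g = f on Y and g = 0 off Y, evaluated at a point interior to Y.
from math import comb

def b1(n, i, t):
    """Elementary one-dimensional Bernstein basis polynomial on [0, 1]."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bernstein3_K3(g, n, x, y, z):
    """Degree-n tensor-product Bernstein approximation of g at (x, y, z) in [-3, 3]^3."""
    u, v, w = (x + 3) / 6, (y + 3) / 6, (z + 3) / 6   # rescale K_3 to the unit cube
    total = 0.0
    for i in range(n + 1):
        bi = b1(n, i, u)
        for j in range(n + 1):
            bij = bi * b1(n, j, v)
            for k in range(n + 1):
                total += bij * b1(n, k, w) * g((6 * i - 3 * n) / n,
                                               (6 * j - 3 * n) / n,
                                               (6 * k - 3 * n) / n)
    return total

if __name__ == "__main__":
    in_Y = lambda x, y, z: x**2 / 2 + y**2 / 3 + z**2 / 6 <= 1    # here Y is the set X
    f = lambda x, y, z: x * y + z**2                              # any continuous f
    g = lambda x, y, z: f(x, y, z) if in_Y(x, y, z) else 0.0      # extension by zero
    p = (0.3, -0.4, 0.5)                                          # interior point of Y
    for n in (5, 20, 40):
        err = abs(bernstein3_K3(g, n, *p) - f(*p))
        print(f"n = {n:2d}   error at p = {err:.4f}")
```

Near the boundary of $Y$ the convergence degrades, and on $\partial Y$ it may fail altogether, which is why the last step below first extends $f$ beyond $X$.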

**Last step: on $X$.** All this yields the existence of polynomials approaching any function defined and continuous on $X$, uniformly on every $(1-t)X$ with $0<t<1$. Now, if $f$ can be extended to a continuous function $\tilde f$ defined on $(1+t)X$ for a given positive $t$, then, since $X$ is a compact subset of the interior of $(1+t)X$, applying the preceding step to $\tilde f$ would yield polynomials $B^{(3)}_n(\tilde g)$ approximating $f$ uniformly on $X$.
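
A quick check that the dilated copies of $X$ used here do fit inside $K_3$: every $(x,y,z)$ in $X$ satisfies $x^2\le2$, $y^2\le3$ and $z^2\le6$, hence
$$\max(|x|,|y|,|z|)\le\sqrt6<3,$$
so $X\subset K_3$, and $(1+t)X\subset K_3$ as soon as $(1+t)\sqrt6\le3$, that is, for every positive $t\le 3/\sqrt6-1\approx0.22$.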

To complete the proof, we now explain how to extend any continuous function $f:X\to\mathbb R$ to a continuous function $\tilde f:K_3\to\mathbb R$ in such a way that $\tilde f$ and $f$ coincide on $X$. Define $\tilde f$ at $u$ in $K_3\setminus X$ as follows. Start a 3D Brownian motion $(W^u_t)$ at $W^u_0=u$ and consider the hitting times $\tau_X$ and $\tau_K$ of the boundaries of $X$ and $K_3$, respectively, by $(W^u_t)$. The random time $\min\{\tau_X,\tau_K\}$ is almost surely finite, and one sets $$ \tilde f(u)=E(f(W^u_{\tau_X});\tau_X<\tau_K). $$ The continuity of $\tilde f$ on $K_3\setminus X$ and on the interior of $X$ is obvious, and its continuity at the boundary of $X$ is standard. This completes the proof.
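
For a purely computational illustration of this last step, here is a sketch (in Python; the test function and sample points are arbitrary) that replaces the Brownian-motion extension with a simpler one available for this particular $X$: since $X$ is convex and contains the origin in its interior, projecting each point of $K_3\setminus X$ radially onto the boundary of $X$ already extends $f$ continuously to $K_3$, and the second step then applies directly.

```python
# Sketch of the last step with a simpler, deterministic extension (an illustration,
# not the Brownian-motion construction above): since X is star-shaped about 0,
# projecting radially onto X extends f continuously to all of K_3 = [-3, 3]^3,
# and the second step then gives uniform approximation on X.
from itertools import product
from math import comb, sqrt

def q(x, y, z):
    """Gauge of (x, y, z): q <= 1 exactly on X."""
    return sqrt(x**2 / 2 + y**2 / 3 + z**2 / 6)

def extend(f):
    """Continuous extension of f from X to R^3 by radial projection onto X."""
    def f_tilde(x, y, z):
        r = q(x, y, z)
        return f(x, y, z) if r <= 1.0 else f(x / r, y / r, z / r)
    return f_tilde

def bernstein3_K3(h, n, x, y, z):
    """Degree-n tensor-product Bernstein approximation of h at (x, y, z) in [-3, 3]^3."""
    u = ((x + 3) / 6, (y + 3) / 6, (z + 3) / 6)
    def b1(i, t):
        return comb(n, i) * t**i * (1 - t)**(n - i)
    return sum(b1(i, u[0]) * b1(j, u[1]) * b1(k, u[2])
               * h((6 * i - 3 * n) / n, (6 * j - 3 * n) / n, (6 * k - 3 * n) / n)
               for i, j, k in product(range(n + 1), repeat=3))

if __name__ == "__main__":
    f = lambda x, y, z: x * y + z**2              # arbitrary continuous test function
    f_tilde = extend(f)
    pts = [(0.0, 0.0, 0.0), (1.0, 0.5, -0.5), (-0.9, 1.0, 1.2)]   # points of X
    for n in (5, 20, 40):
        err = max(abs(bernstein3_K3(f_tilde, n, *p) - f(*p)) for p in pts)
        print(f"n = {n:2d}   max error over the sample points = {err:.4f}")
```

The probabilistic extension above is of course more general; the radial projection only works here because of the special shape of $X$.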