
Let $(\Omega,\mu)$ be a finite measure space such that $\mu(\Omega)=1$. Suppose $1\leq p \leq \infty$.

Let $\psi \colon L^p(\Omega) \to L^p(\Omega \times \Omega)$ be the map sending $f$ to the function $(x,y)\mapsto \frac{1}{2}\big(f(x)+f(y)\big)$. The map $\psi$ is contractive.
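
For the record, contractivity is just the triangle inequality in $L^p(\Omega\times\Omega)$ combined with $\mu(\Omega)=1$: $$\|\psi(f)\|_p \le \tfrac12\,\|f(x)\|_{L^p(d\mu(x)\,d\mu(y))} + \tfrac12\,\|f(y)\|_{L^p(d\mu(x)\,d\mu(y))} = \tfrac12\|f\|_p+\tfrac12\|f\|_p = \|f\|_p,$$ and the same estimate holds verbatim for $p=\infty$.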

1) Is it an isomorphic embedding?

The answer is positive (see below).

Follow-up questions:

2) What is the best constant $c$ in $\|\psi(f)\|_p\ge c\|f\|_p$?

3) Does there exist a bounded projection from $L^p(\Omega \times \Omega)$ onto the range of $\psi$? Remark: the answer to Question 1 implies that the range of $\psi$ is a closed subspace of $L^p(\Omega \times \Omega)$.

  • 0
    Thanks for the comments. If $p=\infty$, the map $\psi$ is indeed an isometry. (2012-08-08)
  • 1
    If $1 \leq p <\infty$, $\psi$ is definitely not an isometry, but I think (hope) that $\psi$ is an isomorphic embedding. (2012-08-08)
  • 0
    It's injective, hence an isomorphism onto its range. (2012-08-08)
  • 0
    @DavideGiraudo, but we must also show that the range of $\psi$ is closed to be sure that $\psi$ is an isomorphism onto its range. (2012-08-08)
  • 0
    Unfortunately, it seems to me that the range of $\psi$ is not closed if $1\leq p<\infty$. (2012-08-09)
  • 0
    @Zouba That's strange, I was sure that the operator $\psi$ is bounded from below, that is, $\|\psi(f)\|\ge c\|f\|$ for some $c>0$ ... A bound from below would give an isomorphism, of course. Do you have an idea for a counterexample? (2012-08-09)
  • 0
    Suppose that $\Omega=[0,1]$ and consider $f=\chi_{[0,\varepsilon]}$. I think that easy computations give $\|f\|_p=\varepsilon^{\frac{1}{p}}$ and $\|\psi(f)\|_p=\varepsilon^{\frac{2}{p}}$. We deduce that such a constant $c$ cannot exist by passing to the limit $\varepsilon\to 0$. Do you agree with me? (2012-08-11)
  • 1
    @Zouba No, we have $\psi(f)\ge \frac12$ on the rectangle $[0,\varepsilon]\times [0,1]$, so its $L^p$ norm is of size $\varepsilon^{1/p}$ (see the computation written out after these comments). More generally, if $f\ge 0$ on $\Omega$, then $\psi(f)(x,y)=\frac12 \big( f(x)+f(y)\big)\ge \frac12 f(x)$, hence $\|\psi(f)\|_p\ge\frac12 \|f\|_p$. So the only potential issue is when $f$ has both positive and negative values, creating potential for cancellation. (2012-08-11)
  • 0
    You are right. I understand my mistake. Thank you! You said "I was sure that the operator is bounded from below". Do you have a proof? (2012-08-12)
  • 0
    $f(x)f(y)$ looks very natural, although it is not linear. (2012-08-14)
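
For reference, here is the computation behind the correction in the comments above, assuming $\Omega=[0,1]$ with Lebesgue measure and $f=\chi_{[0,\varepsilon]}$: the function $\psi(f)$ equals $1$ on $[0,\varepsilon]^2$, equals $\tfrac12$ on the two rectangles $[0,\varepsilon]\times(\varepsilon,1]$ and $(\varepsilon,1]\times[0,\varepsilon]$, and vanishes elsewhere, so for $1\le p<\infty$ $$\|\psi(f)\|_p^p=\varepsilon^2+2^{-p}\cdot 2\varepsilon(1-\varepsilon),$$ which is comparable to $\varepsilon=\|f\|_p^p$ as $\varepsilon\to 0$, not to $\varepsilon^2$.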

2 Answers


Claim: $\|\psi(f)\|_p\ge \dfrac15\|f\|_p$ for all $1\le p\le \infty$. (Optimized by @timur.)

Proof. The proof applies to both real- and complex-valued functions, but I will assume they are real. Consider two cases.

(a) There exists $c\in\mathbb R$ such that $\|f-c\|_p<\dfrac{2}{5} \|f\|_p$. By the triangle inequality, $|c|\ge \dfrac{3}{5}\|f\|_p$. Since $\psi(c)$ is the constant function $c$, the linearity and contractivity of $\psi$ imply $$\|\psi(f)\|_p =\|\psi(c)+\psi(f-c)\|_p \ge \|\psi(c)\|_p - \|\psi(f-c)\|_p \ge |c|-\frac{2}{5} \|f\|_p \ge \frac15 \|f\|_p.$$

(b) For all $c\in\mathbb R$ we have $\|f-c\|_p\ge \dfrac{2}{5} \|f\|_p$. Then for every $y\in \Omega$, since $-f(y)$ is a constant, $$\left\|\frac12\big(f(\cdot)+f(y)\big)\right\|_p = \frac12\,\big\|f-(-f(y))\big\|_p \ge \frac{1}{5} \|f\|_p.$$ Raising to the power $p$ and integrating over $y$ (or, if $p=\infty$, applying the definition of the $L^\infty$ norm), we obtain $\|\psi(f)\|_p\ge \dfrac{1}{5}\|f\|_p$. $\quad\Box$

Follow-up question. What is the best constant $c$ in $\|\psi(f)\|_p\ge c\|f\|_p$? The proof gives $1/5$ and the example $f(x)=\chi_{[0,1/2]}-\chi_{[1/2,1]}$ shows we can't get more than $2^{-1/p}$.
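
For the record, the arithmetic behind this example (with $\Omega=[0,1]$ and Lebesgue measure): $\|f\|_p=1$, while $\psi(f)(x,y)=\pm 1$ when $x$ and $y$ lie in the same half of $[0,1]$ (a set of measure $\tfrac12$) and $\psi(f)(x,y)=0$ otherwise, so $$\|\psi(f)\|_p^p=\tfrac12,\qquad \frac{\|\psi(f)\|_p}{\|f\|_p}=2^{-1/p}.$$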

  • 2
    Replacing $\frac15$ in the original case split $\|f-c\|_p<\frac15\|f\|_p$ by a general $0<\gamma<1$ and optimizing gives $c=\frac15$ in $\|\psi(f)\|_p\geq c\|f\|_p$ (with $\gamma=\frac25$, the value now used in the proof above). (2012-08-14)
  • 0
    @timur Thanks, but I don't expect this (a)-(b) split to ever give the sharp constant. As far as examples go, I can't do better than $f(x)=\chi_{[0,1/2]}-\chi_{[1/2,1]}$, for which $c=2^{-1/p}$. (2012-08-14)
  • 0
    Taking $f(x)=2x-1$ on $[0,1]$ seems to give $c=\frac13$ for $p=1$. (2012-08-14)
  • 0
    @timur ... and $(2/(p^2+3p+2))^{1/p}$ in general. Nice example. (2012-08-14)
  • 1
    Now I am a bit confused: with $f$ as above, isn't it $\|f\|_1=\frac12$ and $\|\psi(f)\|_1=\frac13$, hence $c=\frac23$? (2012-08-14)
  • 0
    @timur Oops, your example appeared right after mine (which had unit norm) and I didn't notice that in your case $f$ is not normalized. I just computed $\|\psi(f)\|_p$... (2012-08-14)
  • 0
    Sorry about that. I am conjecturing that $c=2^{-1/p}$ is optimal. (2012-08-14)
  • 0
    @LeonidKovalev Could you also explore the $p=\infty$ case? (2012-08-14)
  • 0
    @Norbert The map $\psi$ is an isometry for $p = \infty$, as Zouba has already stated in the comments under the question. (2012-08-14)
  • 0
    @LeonidKovalev: Very nice proof! I will award the points in a few days, since I have an "add-on" question: does there exist a bounded projection from $L^p(\Omega \times \Omega)$ onto the range of $\psi$? (2012-08-14)
  • 0
    @Zouba I think you should ask Philip Brooker. (2012-08-14)
  • 0
    This is the first time I have seen such an argument. Do you know of other situations where the same trick is used? (2012-08-16)
  • 2
    @Zouba I'm sure it's been used before, but I can't tell you where. Concerning your "add-on": I guess there is a projection, simply because you shouldn't get explicit uncomplemented subspaces of $L^p$ this easily. Of course there *is* a canonical projection when $p=2$, and maybe by writing it down explicitly you'll see a way to generalize it. If you get stuck, I recommend asking this as a separate question (with a link here), because not many people will see your question buried in the comments. (2012-08-16)
  • 0
    Thank you very much. I'm going to think about all that. (2012-08-16)
  • 1
    I have made some progress towards a proof of the optimal lower bound conjectured by timur. Unfortunately I do not have time right now to work out the details, but I want to share the idea. Using the density of the subspace generated by characteristic functions of measurable sets, one can reduce the problem to showing that the discrete inequality $\sum_{i=1}^n\sum_{j=1}^n \vert a_i + a_j \vert^p x_ix_j \geq 2^{p-1}$ holds for all $x_1, \dots, x_n \geq 0$ and $a_1, \dots, a_n \in \mathbb R$ with $x_1 + \dots + x_n = 1$ and $\vert a_1 \vert^p x_1 + \dots + \vert a_n \vert^p x_n=1$ (see the note written out after these comments). (2012-08-17)
  • 0
    This inequality should be provable by induction on $n$, where the proof would be similar to the inductive proof of Jensen's inequality. The case $n=2$ reduces to $\left( \vert 2a_1\vert^p - 2\vert a_1 + a_2\vert^p + \vert 2a_2\vert^p\right)\frac{(\vert a_2\vert^p - 1)^2}{(\vert a_2\vert^p - \vert a_1\vert^p)^2} + 2\left( \vert a_1 + a_2\vert^p - \vert 2a_2\vert^p\right)\frac{\vert a_2\vert^p - 1}{\vert a_2\vert^p - \vert a_1\vert^p} + \vert 2a_2 \vert^p \geq 2^{p-1}$ for all $a_1,a_2 \in \mathbb R$ with $\vert a_1\vert^p \neq \vert a_2 \vert^p$. (2012-08-17)
  • 0
    Various plots of the function (for different values of $p$) suggest this to be true. The symmetry of the (general) inequality indicates that there may also be a more direct proof using techniques related to Schur convexity. (2012-08-17)
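
For reference, here is the arithmetic behind the reduction mentioned in the comments above. If $f=\sum_k a_k\chi_{A_k}$ is a simple function with $x_k=\mu(A_k)$, normalized so that $\sum_k x_k=1$ and $\|f\|_p^p=\sum_k |a_k|^p x_k=1$, then $$\|\psi(f)\|_p^p=\sum_{i,j}\Big|\frac{a_i+a_j}{2}\Big|^p x_i x_j = 2^{-p}\sum_{i,j}|a_i+a_j|^p x_i x_j,$$ so the conjectured bound $\|\psi(f)\|_p\ge 2^{-1/p}\|f\|_p$ is equivalent (on simple functions, and hence by density on all of $L^p$) to the stated discrete inequality $\sum_{i,j}|a_i+a_j|^p x_i x_j\ge 2^{p-1}$.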

It was asked to prove that the best coercivity constant $c_p$ for $\psi$ is $2^{-1/p}$. In fact, this is not true.

For a given simple function $$ f=\sum\limits_{k=1}^n a_k\chi_{A_k} $$ denote $x_k=\mu(A_k)$. Consider the special case $a_1=-1$, $a_2=0$, $a_3=1$ and $x_1=\varepsilon$, $x_2=1-2\varepsilon$, $x_3=\varepsilon$, where $\varepsilon\in(0,2^{-1})$. Then $$ c_p\leq\Vert\psi(f)\Vert_p/\Vert f\Vert_p=(\varepsilon+2^{1-p}(1-2\varepsilon))^{1/p}. $$ Since the left-hand side is independent of $\varepsilon$, we conclude $$ c_p\leq\inf_{\varepsilon\in(0,2^{-1})}(\varepsilon+2^{1-p}(1-2\varepsilon))^{1/p}=2^{(1-\max(2,p))/p}. $$ But even the bound $$ b_p=2^{(1-\max(2,p))/p} $$ is not sharp. A numerical test showed that for $p=3$, taking $a_1=0.079$, $a_2=0.079$, $a_3=-1$ with $x_1=0.879$, $x_2=0.099$, $x_3=0.022$ gives $$ c_3< 0.612176<0.629960\approx b_3. $$
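
For the record, the arithmetic behind the displayed ratio: with this choice of $f$ we have $\|f\|_p^p=2\varepsilon$, while the pairs $(\pm1,\pm1)$ contribute $2\varepsilon^2$ to $\|\psi(f)\|_p^p$, the pairs $(\pm1,0)$ contribute $2^{-p}\cdot 4\varepsilon(1-2\varepsilon)$, and the pairs $(\pm1,\mp1)$ and $(0,0)$ contribute nothing, so $$\frac{\|\psi(f)\|_p^p}{\|f\|_p^p}=\frac{2\varepsilon^2+2^{2-p}\varepsilon(1-2\varepsilon)}{2\varepsilon}=\varepsilon+2^{1-p}(1-2\varepsilon).$$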

Here is some Mathematica code to check this:

(* ||f||_p for the simple function with values a and weights x *)
FNorm[a_, x_, n_, p_] := (Sum[Abs[a[[k]]]^p x[[k]], {k, 1, n}])^(1/p);
(* ||psi(f)||_p on the product space *)
FImageNorm[a_, x_, n_, p_] := (Sum[Abs[(a[[k]] + a[[l]])/2]^p x[[k]] x[[l]], {l, 1, n}, {k, 1, n}])^(1/p);
(* the ratio ||psi(f)||_p / ||f||_p *)
FOpNorm[a_, x_, n_, p_] := FImageNorm[a, x, n, p]/FNorm[a, x, n, p]

(* random search for configurations (a, x) giving a small ratio *)
OpNorm = 1; A = {}; X = {}; p = 3;
With[{n = 3, R = 1, M = 100000},
 For[i = 0, i < M, i++,
  a = RandomReal[{-R, R}, n];
  x = RandomVariate[GammaDistribution[1, 1], n];
  x = x/Total[x];   (* normalize so that x_1 + ... + x_n = 1 *)
  norm = FOpNorm[a, x, n, p];
  If[norm < OpNorm, {OpNorm, A, X} = {norm, a, x}, Continue[]];
  ]
 ]
(* print the smallest ratio found, the bound b_p, and the witnessing a, x *)
Print[{{OpNorm, 2.^((1 - Max[2, p])/p)}, A, X}]
  • 0
    @LVK I think this answer will be of interest to you. (2012-09-15)
  • 0
    Interesting indeed. Actually, the terms of the bounty were "to prove or disprove...", so your answer qualifies. I guess the natural next step from here is to understand the worst that can happen for $p=1$. (2012-09-15)
  • 0
    @LVK So your question is: what is the norm of $\psi$ when $p=1$? (2012-09-16)
  • 0
    I'm just thinking this would be the easiest case to understand. Do you have an explicit example with constant less than $1/2$ for $p=1$? (2012-09-16)
  • 0
    My current guess for the best constant: $2^{-1/p}$ when $p\le 2$ and $2^{(1-p)/p}$ for $p\ge 2$. The last bit is actually odd, because it tends to $1/2$ as $p\to \infty$, despite the map being an isometry when $p=\infty$. [Details here](http://calculus7.org/2012/09/25/symmetric-embedding-into-the-squarem-part-ii/) (2012-09-26)
  • 0
    Oh no, another counterexample. Your latest one actually uses just two sets, since $A_1$ and $A_2$ can be combined. I considered the case $p=4$ (which is computationally simpler than $p=3$) and found $c_4\le 0.5626\dots$, beating $2^{-3/4}=0.5946\dots$ Now I have no guess for $c_p$ when $p>2$. (2012-09-30)
  • 0
    @LVK Thanks for your bounty! As for the question: I don't think the sets $A_1$ and $A_2$ can be combined, because the values of $f$ on these sets are different. My computational experiments suggest that the bound $b_p$ is correct for $p\in[1,2]$, but wrong for other values. It seems to me that one needs to restrict attention to the case $\Omega=[0,1]$. My intuition tells me that we need to consider Chebyshev polynomials... (2012-09-30)