6

If $f \in L^{p_0}(X,\mathcal{M},\mu)$ for some $0<p_0<\infty$, then $$\lim_{p\to0}\int_X |f|^p \, d\mu=\mu(\{x \in X \mid f(x) \ne0\}).\tag{1}$$

Now additionally assume that $\mu(X)=1$.

Then I want to prove that $f \in L^p(X,\mathcal{M},\mu)$ for every $0<p\le p_0$, and that $$\lim_{p\to0}\|f\|_p=e^{\int_X\log|f|\,d\mu}.\tag{2}$$

How can I conclude these results? For the first, I am trying to integrate over the set on which $0<|f(x)|\le1$ and apply the MCT, and over the set on which $|f(x)|>1$ and apply the LDCT, but I cannot see how to conclude that the limit equals $\mu(\{f\ne0\})$. And how should I approach the second fact?
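Here is a minimal numerical sketch of both limits on a toy finite probability space (the weights in `mu`, the values of `f` and `g`, and the helper name `int_f_pow` are arbitrary choices for illustration, not part of the problem):

```python
import numpy as np

# Toy probability space: four atoms with weights mu summing to 1.
mu = np.array([0.2, 0.3, 0.1, 0.4])
f  = np.array([0.0, 0.5, 2.0, 3.0])   # f vanishes on a set of measure 0.2

def int_f_pow(p):
    """Integral of |f|^p over X, with the convention 0^p = 0."""
    return np.sum(np.where(f != 0, np.abs(f) ** p, 0.0) * mu)

for p in [1.0, 0.1, 0.01, 0.001]:
    print(p, int_f_pow(p))            # tends to mu({f != 0}) = 0.8

# Limit (2) for a function g that never vanishes (mu(X) = 1):
g = np.array([0.5, 0.5, 2.0, 3.0])
for p in [1.0, 0.1, 0.01, 0.001]:
    print(p, np.sum(np.abs(g) ** p * mu) ** (1 / p))   # ||g||_p
print("exp(int log|g| dmu) =", np.exp(np.sum(np.log(np.abs(g)) * mu)))
```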

  • 0
    for the first problem, using $MCT$ and $LDCT$ as you say lets you interchange the limit and integral, so you get $\int_X \lim_{p \rightarrow 0} |f|^p = \int_X 1(f(x) \neq 0)$ (2012-06-11)
  • 1
    and I think that if $\mu(X) = 1$, you mean it follows that $f \in L^p$ for every $p \leqslant p_0$, because we have $$\int |f|^p \leqslant 1 + \int_{|f| \geqslant 1} |f|^{p_0} < \infty$$ (2012-06-11)
  • 0
    I followed your nice approach, so I concluded the first equation. But I still cannot understand your second comment, so I cannot get the second equation. Can you explain the second inequality more precisely? Thanks @uncookedfalcon (2012-06-11)
  • 1
    oh, not really an answer; all I was saying was that to talk about $\lim_{p \rightarrow 0} \| f \|_p$, you need that $f$ is in $L^p$ for every $p$ smaller than $p_0$, and that's true because $$\int |f|^p = \int_{|f| < 1} |f|^p + \int_{|f| \geqslant 1} |f|^p \leqslant \int_{|f| < 1} 1 + \int_{|f| \geqslant 1} |f|^p \leqslant 1 + \int_{|f| \geqslant 1} |f|^{p_0} < \infty$$ (see the numerical sketch after these comments) (2012-06-11)
  • 0
    Oh, thanks again for your kind explanation @uncookedfalcon. (2012-06-11)
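Regarding the splitting bound $\int |f|^p \leqslant 1 + \int_{|f|\geqslant1}|f|^{p_0}$ in the comments above, here is a small numerical sketch on a toy probability space (the weights, the values of `f`, and `p0` below are arbitrary illustrations):

```python
import numpy as np

mu = np.array([0.25, 0.25, 0.25, 0.25])   # probability weights, mu(X) = 1
f  = np.array([0.3, 0.9, 1.5, 4.0])
p0 = 0.7                                  # pretend f is in L^{p0}

tail = np.sum(np.where(np.abs(f) >= 1, np.abs(f) ** p0, 0.0) * mu)  # int_{|f|>=1} |f|^{p0}

for p in [0.7, 0.5, 0.2, 0.05]:           # any 0 < p <= p0
    lhs = np.sum(np.abs(f) ** p * mu)     # int_X |f|^p
    print(p, lhs, "<=", 1 + tail, lhs <= 1 + tail)
```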

2 Answers

4

For the second equation: first assume that $f$ does not vanish on a set of positive measure. We have

$$\log\|f\|_p = \frac{1}{p} \log(\int_X |f|^p d\mu) .$$

We apply l'Hôpital's rule to take the limit as $p\rightarrow 0$. Since $|f|^p \log |f|$ is bounded by either a constant or a constant multiple of $|f|^{p_0}$ for small $p$, we can differentiate under the integral sign to get

$$\frac{d}{dp} \log\left(\int_X |f|^p \,d\mu\right) = \frac{\int_X |f|^p\log |f| \, d\mu}{\int_X |f|^p \,d\mu}.$$

Of course the derivative of the denominator $p$ is just 1. Therefore

$$\lim_{p\rightarrow 0} \log\|f\|_p = \lim_{p\rightarrow 0}\frac{\int_X |f|^p\log |f| \, d\mu}{\int_X |f|^p \,d\mu} = \frac{\int_X \log |f| \,d\mu}{\int_X 1 \,d\mu} = \int_X \log |f| \,d\mu ,$$

by dominated convergence. It follows that

$$\lim_{p\rightarrow 0} \|f\|_p = e^{\int_X \log |f| d\mu } .$$
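As a quick illustration of this limit, one can check numerically (on a toy discrete probability measure with arbitrary weights and values; this is only a sketch of the argument above) that the ratio $\int_X |f|^p\log|f|\,d\mu \,\big/\, \int_X |f|^p\,d\mu$ approaches $\int_X\log|f|\,d\mu$ as $p\to0$:

```python
import numpy as np

mu = np.array([0.5, 0.3, 0.2])    # probability weights
f  = np.array([0.4, 1.0, 5.0])    # strictly positive, so log|f| is finite

target = np.sum(np.log(f) * mu)   # int_X log|f| dmu

for p in [1.0, 0.1, 0.01, 0.001]:
    num = np.sum(np.log(f) * f ** p * mu)   # int_X |f|^p log|f| dmu
    den = np.sum(f ** p * mu)               # int_X |f|^p dmu
    print(p, num / den)                     # ratio from the displayed derivative
print("target:", target)
```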

If $f = 0$ on a set $E$ of positive measure, then (assuming $p_0 > 1$) Hölder's inequality with the conjugate exponent $p_0^* = p_0/(p_0-1)$ gives

$$\int_X |f|^p \,d\mu = \int_X \chi_{E^c} |f|^p \,d\mu \leq \||f|^p\|_{p_0} \|\chi_{E^c}\|_{p_0^*} = \left(\int_X |f|^{p p_0}\,d\mu\right)^{1/p_0} \mu(E^c)^{1/p_0^*}.$$

Thus,

$$\|f\|_p \leq \|f\|_{p p_0}\, \mu(E^c)^{1/(pp_0^*)}.$$

The first term here is bounded for small $p$ and the second tends to 0 as $p\rightarrow 0$ since $\mu(E^c) < 1$. Thus $\|f\|_p \rightarrow 0$, which equals $e^{\int_X \log |f| d\mu}$ if we interpret $e^{-\infty}$ as 0.
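A small numerical sketch of this degenerate case (weights and values chosen arbitrarily, with $f=0$ on a set of measure $0.4$): the norms $\|f\|_p$ collapse to $0$ as $p\to0$, consistent with interpreting $e^{-\infty}$ as $0$.

```python
import numpy as np

mu = np.array([0.4, 0.3, 0.3])    # mu(E) = 0.4 where f = 0, so mu(E^c) = 0.6 < 1
f  = np.array([0.0, 2.0, 5.0])

for p in [1.0, 0.5, 0.1, 0.05, 0.01]:
    integral = np.sum(np.where(f != 0, np.abs(f) ** p, 0.0) * mu)
    print(p, integral ** (1 / p))           # ||f||_p -> 0 as p -> 0
```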

  • 0
    It's undoubtedly a perfectly nice answer! (2012-06-11)
  • 0
    @wowhapjs Sorry, Hölder's inequality is NOT valid for $p_0<1$. So, for the proof above to work in the case "$f=0$ on a set $E$ of positive measure", we need to include the assumption that $p_0 \geqslant 1$. (2015-09-13)
3

$1$. Define $$ \begin{align} E^>&=\{x\in X:|f(x)|\ge1\}\\ E^<&=\{x\in X:0<|f(x)|<1\}\\ E^=&=\{x\in X:|f(x)|=0\} \end{align}\tag{1a} $$

On $E^>$, $|f(x)|^p$ decreases to $1$ as $p$ decreases to $0$; on $E^<$, $|f(x)|^p$ increases to $1$ as $p$ decreases to $0$; and on $E^=$, $|f(x)|^p=0$ for every $p>0$.

Therefore, by monotone convergence on $E^<$ and dominated convergence on $E^>$ (with dominating function $|f|^{p_0}$), $$ \begin{align} \lim_{p\to0^+}\int_{E^>}|f(x)|^p\,\mathrm{d}x&=\mu(E^>)\\ \lim_{p\to0^+}\int_{E^<}|f(x)|^p\,\mathrm{d}x&=\mu(E^<)\\ \lim_{p\to0^+}\int_{E^=}|f(x)|^p\,\mathrm{d}x&=0 \end{align}\tag{1b} $$ Summing these yields $$ \lim_{p\to0^+}\int_X|f(x)|^p\,\mathrm{d}x=\mu(\{x\in X:|f(x)|\not=0\})\tag{1c} $$
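For a concrete check of $(1\mathrm{b})$ and $(1\mathrm{c})$, here is a toy discrete example (the weights and the values of `f` are arbitrary; the three boolean masks correspond to $E^>$, $E^<$, $E^=$):

```python
import numpy as np

mu = np.array([0.3, 0.2, 0.1, 0.4])
f  = np.array([2.0, 1.5, 0.5, 0.0])

gt = np.abs(f) >= 1                       # E^>
lt = (np.abs(f) > 0) & (np.abs(f) < 1)    # E^<
eq = np.abs(f) == 0                       # E^=

for p in [1.0, 0.1, 0.01]:
    fp = np.where(eq, 0.0, np.abs(f) ** p)
    print(p,
          np.sum(fp[gt] * mu[gt]),        # -> mu(E^>) = 0.5
          np.sum(fp[lt] * mu[lt]),        # -> mu(E^<) = 0.1
          np.sum(fp[eq] * mu[eq]))        # -> 0
```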


$2$. Preliminaries

For $p\gt0$ and $t\ge0$, define $$ g_p(t)=\frac{t^p-1}{p}\tag{2a} $$ Claim: $g_p(t)$ is non-decreasing in both $p$ and $t$.

$g_p(t)$ is non-decreasing in $t$: This follows from $$ g_p^\prime(t)=t^{p-1}\ge0\tag{2b} $$

$g_p(t)$ is non-decreasing in $p$: As Didier commented, this follows from $$ g_p(t)=\int_1^tu^{p-1}\,\mathrm{d}u\tag{2c} $$ because $u^{p-1}$ is non-decreasing in $p$ when $u\ge1$ and non-increasing in $p$ when $0\lt u\le1$ (and for $t<1$ the integral is $-\int_t^1u^{p-1}\,\mathrm{d}u$).

Furthermore, l'Hôpital's rule says $$ \lim_{p\to0^+}g_p(t)=\log(t)\tag{2d} $$
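A quick numerical sketch of these two facts about $g_p$ (the grids of $t$ and $p$ values below are arbitrary): monotonicity in $p$ and the limit $g_p(t)\to\log(t)$.

```python
import numpy as np

def g(p, t):
    """g_p(t) = (t^p - 1) / p."""
    return (t ** p - 1) / p

ts = np.array([0.1, 0.5, 1.0, 2.0, 10.0])
ps = [0.001, 0.01, 0.1, 0.5, 1.0]          # increasing values of p

# Non-decreasing in p for each fixed t:
for t in ts:
    vals = [g(p, t) for p in ps]
    assert all(a <= b + 1e-12 for a, b in zip(vals, vals[1:]))

# g_p(t) approaches log(t) as p -> 0+:
print(np.round(g(0.001, ts), 4))
print(np.round(np.log(ts), 4))
```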


Since $\mu(X)=1$, Jensen's Inequality says that $h(p)=\|f\|_p$ is non-decreasing in $p$.
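A small numerical check of this monotonicity on a toy probability space (the weights and values are arbitrary; `norm` is just a helper name):

```python
import numpy as np

mu = np.array([0.2, 0.5, 0.3])    # probability weights, mu(X) = 1
f  = np.array([0.3, 1.2, 4.0])

def norm(p):
    """h(p) = ||f||_p with respect to mu."""
    return np.sum(np.abs(f) ** p * mu) ** (1 / p)

ps = [0.01, 0.1, 0.5, 1.0, 2.0]
vals = [norm(p) for p in ps]
print(vals)
assert all(a <= b + 1e-12 for a, b in zip(vals, vals[1:]))   # non-decreasing in p
```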


Consider an $\epsilon$ neighborhood of $-\infty$ to be $(-\infty,-\frac1\epsilon)$, and let $L=\lim\limits_{p\to0^+}\log(h(p))$, which exists in $[-\infty,\infty)$ since $\log(h(p))$ is non-decreasing in $p$.

For any $\epsilon>0$, choose $q>0$ so that $\log(h(q))$ is within an $\frac{\epsilon}{2}$ neighborhood of $L$.

Choose $r>0$ so that $g_r(h(q))$ is within an $\epsilon$ neighborhood of $L$.

If $p<\min(q,r)$, then, since $h$ is non-decreasing, $g_p(t)$ is non-decreasing in both $p$ and $t$, and $g_p(t)\ge\log(t)$, both $\log(h(p))$ and $g_p(h(p))$ will be within an $\epsilon$ neighborhood of $L$. Therefore, $$ \lim_{p\to0^+}\log(h(p))=\lim_{p\to0^+}g_p(h(p))\tag{2e} $$
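Numerically, $(2\mathrm{e})$ can be seen on a toy example (arbitrary weights and values): $\log(h(p))$ and $g_p(h(p))$ track each other as $p\to0^+$ and both approach $\int_X\log|f|\,d\mu$.

```python
import numpy as np

mu = np.array([0.4, 0.4, 0.2])
f  = np.array([0.2, 1.0, 3.0])

def h(p):                                 # h(p) = ||f||_p, mu(X) = 1
    return np.sum(np.abs(f) ** p * mu) ** (1 / p)

for p in [0.5, 0.1, 0.01, 0.001]:
    print(p, np.log(h(p)), (h(p) ** p - 1) / p)   # log(h(p)) vs g_p(h(p))

print("int log|f| dmu =", np.sum(np.log(f) * mu))  # common limit
```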

Main Result

Define $E=\{x:|f(x)|>1\}$; then the results above yield $$ \begin{align} \lim_{p\to0^+}\log\left(\|f\|_p\right) &=\lim_{p\to0^+}\frac{\|f\|_p^p-1}{p}\\ &=\lim_{p\to0^+}\int_X\frac{|f(x)|^p-1}{p}\,\mathrm{d}x\\ &=\color{#C00000}{\lim_{p\to0^+}\int_{E}\frac{|f(x)|^p-1}{p}\,\mathrm{d}x} +\color{#00A000}{\lim_{p\to0^+}\int_{X\setminus E}\frac{|f(x)|^p-1}{p}\,\mathrm{d}x}\\ &=\color{#C00000}{\int_{E}\log|f(x)|\,\mathrm{d}x} +\color{#00A000}{\int_{X\setminus E}\log|f(x)|\,\mathrm{d}x}\\ &=\int_{X}\log|f(x)|\,\mathrm{d}x\tag{2f} \end{align} $$ The left limit, in red, is by Dominated Convergence, since on $E$ we have $0\le\frac{|f(x)|^p-1}{p}\le\frac{|f(x)|^{p_0}}{p_0}$ for $0<p\le p_0$; the right limit, in green, is by Monotone Convergence, applied to the non-negative integrands $-\frac{|f(x)|^p-1}{p}$ on $X\setminus E$, which increase to $-\log|f(x)|$ as $p$ decreases to $0$. Exponentiate to get $$ \lim_{p\to0^+}\|f\|_p=e^{\int_{X}\log|f(x)|\,\mathrm{d}x}\tag{2g} $$
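Here is a small numerical sketch of the red/green decomposition in $(2\mathrm{f})$ on a toy probability space (weights and values chosen arbitrarily):

```python
import numpy as np

mu = np.array([0.3, 0.3, 0.4])
f  = np.array([0.25, 1.0, 6.0])
E  = np.abs(f) > 1                        # E = {|f| > 1}

for p in [0.5, 0.1, 0.01, 0.001]:
    gp = (np.abs(f) ** p - 1) / p
    red   = np.sum(gp[E]  * mu[E])        # -> int_E     log|f|   (Dominated Convergence)
    green = np.sum(gp[~E] * mu[~E])       # -> int_{X\E} log|f|   (Monotone Convergence)
    print(p, red, green, red + green)

print(np.sum(np.log(f) * mu))             # int_X log|f| dmu
```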

  • 0
    *We can lift the restriction...* Hmmm... :-) If you can, then do it, don't you think? (2012-06-15)
  • 0
    @did: I completely reworked answer 2 using properties of $g_p(t)=\frac{t^p-1}{p}$ and $\|f\|_p$. (2012-07-19)
  • 0
    To show that $h_p(t)\geqslant0$, note that $h_p(t)=(t^p/p^2)\cdot(s-\log(1+s))$ with $s=t^{-p}-1$ and that $\log(1+s)\leqslant s$ for every $s\gt-1$, by concavity of log (or by some other argument). (2012-07-19)
  • 0
    @did: thanks! That looks a bit simpler. (2012-07-19)
  • 0
    Still simpler, $g_p(t)=\int\limits_1^ts^{p-1}\mathrm ds$ and $p\mapsto s^{p-1}$ is nonincreasing on $s\leqslant1$ and nondecreasing on $s\geqslant1$. (2012-07-19)
  • 0
    @did: I was just considering that while staring at $g_p^\prime(t)=t^{p-1}$. I hope you don't mind if I use it. (2012-07-19)
  • 0
    Of course not. (2012-07-19)
  • 0
    @did: I have added the limiting argument that was most of the point of checking that $g_p(t)$ and $h(p)$ are non-decreasing. (2012-07-19)
  • 0
    Hi robjohn, do we really need to split the integral into two parts in (2f)? It looks to me that we can apply dominated convergence to the whole integral, using $g_r(|f|)$. (2015-09-13)
  • 0
    @pppqqq: which $r$ should we use to apply [Dominated Convergence](https://en.wikipedia.org/wiki/Dominated_convergence_theorem)? (2015-09-13)
  • 0
    @pppqqq: actually, they both converge by Dominated Convergence, but they still need to be broken up since the red integral is negative and needs to be negated before and after the limit, while the green integral is positive. (2015-09-13)
  • 0
    Sorry, I meant the $p_0$ of the OP. (2015-09-13)
  • 0
    @pppqqq: since $p\to0^+$, we can use Dominated Convergence, but we should break things up to be able to use it properly. (2015-09-13)
  • 0
    Hi robjohn, thank you. I understand the point: on $|f|\leq 1$ the pointwise limit of $g_p(f)$ isn't necessarily a complex measurable function (if $f(x)=0$). (2015-09-14)
  • 0
    However, I think that the first version of the post (before I started to confuse things) was the right one: the integral on $|f|\leq 1$ must be evaluated via the monotone convergence theorem (negating before and after the integral, and defining $-\log 0=\infty$). This is so because the hypothesis of the DCT, that is, that the pointwise limit is a complex measurable function, is not met where $f(x)=0$, and this set could have positive measure. (2015-09-14)
  • 0
    Yes. When you first brought my attention to this, I forgot that the integral requiring Monotone Convergence was negative and thought that since it was a decreasing function, Dominated Convergence was what to use. However, as we both noticed, when a negative function is decreasing, we need to use Monotone Convergence. I have reverted the answer to where it was before. (2015-09-14)