
The question is: if $f_i$ is a sequence of functions in $L^p$ converging to $f$ and $g_i$ a sequence in $L^q$ converging to $g$ show that $f_ig_i$ converges to $fg$ in $L^1$ for $p,q$ finite and $\frac{1}{p}+\frac{1}{q}=1$. Does this result hold if $p=1, q=\infty$?

So I think I showed the first part:

$$\begin{aligned}
\|f_ig_i-fg\|_1 &= \|f_ig_i-f_ig+f_ig-fg\|_1 \\
&\le \|f_ig_i-f_ig\|_1+\|f_ig-fg\|_1 \\
&= \|f_i(g_i-g)\|_1+\|g(f_i-f)\|_1 \\
&\le \|f_i\|_p\,\|g_i-g\|_q+\|g\|_q\,\|f_i-f\|_p
\end{aligned}$$

(by Hölder's inequality), which goes to $0$ as $i\rightarrow\infty$.
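One small gap worth filling: the last bound only goes to $0$ if $\|f_i\|_p$ stays bounded, which is not assumed but does follow from convergence. A sketch of that step:

```latex
% Convergent sequences in a normed space are bounded: since
% \|f_i - f\|_p \to 0, there is an N with \|f_i - f\|_p \le 1 for all i \ge N, so
\[
  \|f_i\|_p \le \|f_i - f\|_p + \|f\|_p \le 1 + \|f\|_p \qquad (i \ge N),
\]
% and hence
\[
  \sup_i \|f_i\|_p \le \max\bigl\{\|f_1\|_p,\dots,\|f_{N-1}\|_p,\; 1+\|f\|_p\bigr\} < \infty.
\]
```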

However, the second part of the question throws me off, because I don't see why this proof doesn't work just as well for $p=1,q=\infty$. Am I missing something? Does the second part also hold?

  • You are right, Hölder's inequality holds for $q=\infty$. To complete the argument, you have to say that $\sup_i\lVert f_i\rVert_p$ is finite. (2012-12-05)
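As the comment says, the endpoint case works the same way; a sketch of the Hölder step for $p=1$, $q=\infty$:

```latex
% Hölder's inequality at the endpoint p = 1, q = \infty:
\[
  \|fg\|_1 = \int |f|\,|g| \le \|g\|_\infty \int |f| = \|f\|_1\,\|g\|_\infty,
\]
% so the same chain of estimates gives
\[
  \|f_ig_i - fg\|_1 \le \|f_i\|_1\,\|g_i-g\|_\infty + \|g\|_\infty\,\|f_i-f\|_1 \to 0,
\]
% again using that \sup_i \|f_i\|_1 < \infty (convergent sequences are bounded).
```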

1 Answer