The question is: if $f_i$ is a sequence of functions in $L^p$ converging to $f$ and $g_i$ is a sequence in $L^q$ converging to $g$, show that $f_ig_i$ converges to $fg$ in $L^1$, where $p,q$ are finite and $\frac{1}{p}+\frac{1}{q}=1$. Does this result still hold if $p=1$, $q=\infty$?
So I think I showed the first part. Adding and subtracting $f_ig$, then applying the triangle inequality followed by Hölder's inequality,
$$\|f_ig_i-fg\|_1 = \|f_i(g_i-g)+g(f_i-f)\|_1 \leq \|f_i(g_i-g)\|_1+\|g(f_i-f)\|_1 \leq \|f_i\|_p\|g_i-g\|_q+\|g\|_q\|f_i-f\|_p,$$
which goes to $0$ as $i\rightarrow\infty$.
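(One detail I should probably make explicit: the first term only goes to $0$ because $\|f_i\|_p$ stays bounded, which I believe follows from the convergence $f_i\to f$ in $L^p$ and the triangle inequality:
$$\|f_i\|_p \leq \|f_i-f\|_p + \|f\|_p \leq \|f\|_p + 1 \quad\text{for all sufficiently large } i.)$$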
However, the second part of the question throws me off, because I don't see why this proof doesn't work just as well for $p=1$, $q=\infty$. Am I missing something? Does the result also hold in that case?