Let $a,b\in[0,\infty)$ and let $p\in[1,\infty)$. How can I prove $$a^p+b^p\le(a^2+b^2)^{p/2}\,?$$
How to prove an $L^p$-type inequality
real-analysis
inequality
To expand on the comment by @DavideGiraudo, take the $p$'th root and consider $(a^p+b^p)^{1/p}$ as a function of $p$. – 2012-05-27
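A quick numerical sketch of this comment's idea (added for illustration; the sample values `a`, `b` and the helper `g` are arbitrary choices, not from the thread): for fixed $a,b\ge 0$, the map $p\mapsto(a^p+b^p)^{1/p}$ is non-increasing, so for $p\ge 2$ one gets $(a^p+b^p)^{1/p}\le(a^2+b^2)^{1/2}$, which is the stated inequality after raising both sides to the $p$-th power.

```python
# Sketch: g(p) = (a^p + b^p)^(1/p) should be non-increasing in p,
# so g(p) <= g(2) for p >= 2, i.e. a^p + b^p <= (a^2 + b^2)^(p/2).
a, b = 0.7, 1.3  # arbitrary sample values

def g(p):
    return (a**p + b**p) ** (1.0 / p)

ps = [1.0, 1.5, 2.0, 3.0, 5.0, 10.0]
values = [g(p) for p in ps]
print(values)
# Each value should be <= the previous one (up to floating-point noise).
assert all(values[i] >= values[i + 1] - 1e-12 for i in range(len(values) - 1))
```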
1 Answer
Some hints:
- By homogeneity, we may assume that $b=1$: if $b=0$ the inequality is immediate, and otherwise divide both sides by $b^p$ and set $t=a/b$.
- Let $f(t):=(t^2+1)^{p/2}-t^p-1$ for $t\geq 0$. Then $f'(t)=pt\left((t^2+1)^{p/2-1}-(t^2)^{p/2-1}\right)$. Since $t^2+1\geq t^2$, the derivative is non-negative when $p\geq 2$ and non-positive when $1\leq p\leq 2$.
- Since $f(0)=0$, deduce the wanted inequality: it holds as stated when $p\geq 2$ and is reversed when $1\leq p\leq 2$ (a quick numerical sanity check follows below).
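As a sanity check on the hints (not a proof), the following sketch evaluates $f$ on a few sample points; the grids `ts` and the chosen values of `p` are arbitrary illustrations. We expect $f\geq 0$ on $[0,\infty)$ when $p\geq 2$, $f\leq 0$ when $1\leq p\leq 2$, and $f\equiv 0$ when $p=2$.

```python
# f(t) = (t^2 + 1)^(p/2) - t^p - 1, the answer's auxiliary function with b = 1.
def f(t, p):
    return (t * t + 1) ** (p / 2) - t**p - 1

ts = [0.0, 0.1, 0.5, 1.0, 2.0, 10.0]   # sample points in [0, inf)
for p in (2.0, 3.0, 7.5):              # p >= 2: inequality as stated
    assert all(f(t, p) >= -1e-12 for t in ts)
for p in (1.0, 1.3, 1.9):              # 1 <= p <= 2: inequality reversed
    assert all(f(t, p) <= 1e-12 for t in ts)
print("sign pattern matches the hints")
```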