14
$\begingroup$

Stuck; help, please, I am really new to this. I expanded the 2-norm and multiplied by $n$; now I am thinking of squaring both sides. The problem is that I do not know how to expand $(x_1 + x_2 + \dots + x_n)^2$.

$\|x\|_1 \leqslant \sqrt n\cdot \|x\|_2$

  • 1
You wanted to know how to expand it, I did? =) (2012-10-07)

4 Answers

15

Hint: This is the Cauchy-Schwarz inequality applied to the vectors $x=(|x_k|)_{1\leqslant k\leqslant n}$ and $y=(y_k)_{1\leqslant k\leqslant n}$ defined by $y_k=1$ for every $k$.

  • 0
@blue Yes, this is probably even simpler using these. Thanks. (2014-08-18)
15

Using Cauchy-Schwarz, we get:

$ \Vert x\Vert_1= \sum\limits_{i=1}^n|x_i|= \sum\limits_{i=1}^n|x_i|\cdot 1\leq \left(\sum\limits_{i=1}^n|x_i|^2\right)^{1/2}\left(\sum\limits_{i=1}^n 1^2\right)^{1/2}= \sqrt{n}\Vert x\Vert_2 $
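As a quick numerical sanity check of this bound (a sketch, not part of the proof; the helper names `norm1` and `norm2` are my own), one can compare both sides on random vectors and confirm the equality case:

```python
import math
import random

def norm1(x):
    # l1 norm: sum of absolute values
    return sum(abs(t) for t in x)

def norm2(x):
    # l2 (Euclidean) norm: square root of the sum of squares
    return math.sqrt(sum(t * t for t in x))

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    x = [random.uniform(-5.0, 5.0) for _ in range(n)]
    # ||x||_1 <= sqrt(n) * ||x||_2, with a small tolerance for float error
    assert norm1(x) <= math.sqrt(n) * norm2(x) + 1e-9

# equality case: all |x_i| equal, e.g. the all-ones vector
x = [1.0] * 7
assert abs(norm1(x) - math.sqrt(7) * norm2(x)) < 1e-9
```

Equality in Cauchy-Schwarz occurs exactly when $(|x_1|, \dots, |x_n|)$ is proportional to $(1, \dots, 1)$, which the last assertion illustrates.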

5

We will expand and then complete squares. Squaring both sides and expanding, it suffices to show $ \sum_{i=1}^{n} \sum_{j=1}^{n} |x_i|\,|x_j| \leq n\sum_{i=1}^{n} x_i^2. $

$\sum_{i=1}^{n} \sum_{j=1}^{n} |x_i|\,|x_j| = \sum_{i=1}^{n} x_i^2 + 2\sum_{1\leq i<j\leq n} |x_i|\,|x_j|$

So it suffices to prove $2\sum_{1\leq i<j\leq n} |x_i|\,|x_j| \leq (n-1)\sum_{i=1}^{n} x_i^2$.

But $2|x_i|\,|x_j| \leq x_i^2 + x_j^2$ for every pair $i<j$, hence $2\sum_{1\leq i<j\leq n} |x_i|\,|x_j| \leq \sum_{1\leq i<j\leq n} (x_i^2 + x_j^2) = (n-1)\sum_{i=1}^{n} x_i^2,$ since each index appears in exactly $n-1$ pairs. And we are done.
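The key estimate here, $2\sum_{1\leq i<j\leq n} |x_i|\,|x_j| \leq (n-1)\sum_{i=1}^{n} x_i^2$, can be sanity-checked numerically (a sketch with my own helper names, not part of the proof):

```python
import random
from itertools import combinations

def cross_sum(x):
    # 2 * sum over pairs i < j of |x_i| * |x_j|
    return 2.0 * sum(abs(a) * abs(b) for a, b in combinations(x, 2))

def sum_sq(x):
    # sum of squares of the coordinates
    return sum(t * t for t in x)

random.seed(1)
for _ in range(500):
    n = random.randint(2, 10)
    x = [random.uniform(-5.0, 5.0) for _ in range(n)]
    # each x_i^2 shows up in exactly n-1 of the bounds 2|x_i||x_j| <= x_i^2 + x_j^2
    assert cross_sum(x) <= (n - 1) * sum_sq(x) + 1e-9
```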

1

Here's a different approach if you don't know Cauchy-Schwarz.

It suffices to show the inequality in the case where all the coordinates are nonnegative, since neither norm changes when we swap signs of the coordinates. So consider $D = \{ x = (x_1, \dots, x_n) \, | \, x_i \ge 0 , \| (x_1, \dots, x_n) \|_1 = C \}$ for some $C > 0$ and the function $ f(x_1, \dots, x_n) = n \sum_{i=1}^n x_i^2 - \left( \sum_{i=1}^n x_i \right) \left( \sum_{j=1}^n x_j \right). $ The set $D$ is compact and $f$ is a polynomial, hence continuous, so $f$ attains a minimum over $D$. We will show this minimum is zero.

Using Lagrange multipliers, one can locate the minimum of $f$. We compute $ \frac{\partial f}{\partial x_k} = 2n x_k - 2 \left( \sum_{i=1}^n x_i \right), $ hence $ \nabla f(x_1, \dots, x_n) = 2n(x_1, \dots, x_n) - 2 \left( \sum_{i=1}^n x_i, \sum_{i=1}^n x_i, \dots, \sum_{i=1}^n x_i \right). $

One can write the constraint as $g(x_1, \dots, x_n) = x_1 + \dots + x_n = C$, so the method of Lagrange multipliers gives the following condition at the minimum: there exists $\lambda \in \mathbb R$ such that $ \lambda (1,\dots,1) = \lambda \nabla g(x_1, \dots, x_n) = \nabla f(x_1, \dots, x_n) = 2n (x_1, \dots, x_n) - 2\left( \sum_{i=1}^n x_i, \dots, \sum_{i=1}^n x_i \right). $ This means all coordinates are equal, i.e. $x_1 = \dots = x_n = \frac{\lambda + 2\sum_{i=1}^n x_i}{2n}$. Plugging this into $f$ gives $f(x_1, \dots, x_1) = 0$.

The minimum of $f$ does not lie on the boundary of $D$, because on the boundary one of the coordinates of $x$ is zero, so that case is handled by induction on $n$ (the case $n=1$ is trivial because then $\|x\|_1 = \|x\|_2$ for all $x$). Therefore the minimum of $f$ is zero, hence $f$ is nonnegative, which gives $ n \sum_{i=1}^n x_i^2 \ge \left( \sum_{i=1}^n x_i \right) \left( \sum_{j=1}^n x_j \right) \quad \Longrightarrow \quad \sqrt n \| x \|_2 \ge \| x \|_1. $ Hope that helps!
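The conclusion of this argument, that $f$ is nonnegative on the positive orthant and vanishes exactly when all coordinates coincide, can be spot-checked numerically (a sketch; the function name `f` matches the answer, the rest is my own scaffolding):

```python
import random

def f(x):
    # f(x) = n * sum(x_i^2) - (sum(x_i))^2, the function minimized in the answer
    n = len(x)
    return n * sum(t * t for t in x) - sum(x) ** 2

random.seed(2)
for _ in range(500):
    n = random.randint(1, 10)
    x = [random.uniform(0.0, 5.0) for _ in range(n)]
    assert f(x) >= -1e-9  # f stays nonnegative (tolerance for float error)

# the minimum value 0 is attained when all coordinates are equal
assert abs(f([3.14] * 6)) < 1e-9
```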

  • 0
@did: I would like to know who downvotes, because I am ready to accept criticism but not hatred. For me, a downvote is a kind of warning: "respect others and stop posting bad things, otherwise you'll be downvoted," or something like that. (2012-10-07)