13

Stuck, please help; I am really new to this. I wrote out the 2-norm and multiplied by $n$, and now I am thinking of squaring both sides. The problem is that I do not know how to expand $(x_1 + x_2 + \dots + x_n)^2$.

$\|x\|_1 \leqslant \sqrt n\cdot \|x\|_2$

  • 0
    $(|x_1| + \dots + |x_n|)^2 = \sum_{i=1}^n \sum_{j=1}^n |x_i||x_j|$. You want absolute values in the expansion of $\| x \|_1$. (2012-10-07)
  • 0
    This is known, but how is it helping me answer the main question? (2012-10-07)
  • 1
    You wanted to know how to expand it, and I did. =) (2012-10-07)

4 Answers

14

Hint: This is Cauchy-Schwarz inequality applied to the vectors $x=(|x_k|)_{1\leqslant k\leqslant n}$ and $y=(y_k)_{1\leqslant k\leqslant n}$ defined by $y_k=1$ for every $k$.

  • 0
    OK, thanks, got it. (2012-10-07)
  • 0
    The most elegant answer suggested. Thanks. (2014-07-06)
  • 0
    Should it be $(|x_1|,\cdots,|x_n|)$? (2014-08-18)
  • 0
    @blue Yes, this is probably even simpler using these. Thanks. (2014-08-18)
10

Using Cauchy-Schwarz, we get:

$$ \Vert x\Vert_1= \sum\limits_{i=1}^n|x_i|= \sum\limits_{i=1}^n|x_i|\cdot 1\leq \left(\sum\limits_{i=1}^n|x_i|^2\right)^{1/2}\left(\sum\limits_{i=1}^n 1^2\right)^{1/2}= \sqrt{n}\Vert x\Vert_2 $$
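As a quick numeric sanity check of this chain of inequalities (a Python sketch on a random vector; the vector length and tolerance are arbitrary choices, not part of the proof):

```python
import math
import random

# Arbitrary sample vector; any real vector works.
x = [random.uniform(-10, 10) for _ in range(7)]
n = len(x)

norm1 = sum(abs(xi) for xi in x)             # ||x||_1
norm2 = math.sqrt(sum(xi * xi for xi in x))  # ||x||_2

# Cauchy-Schwarz step: sum |x_i| * 1 <= sqrt(sum x_i^2) * sqrt(n)
assert norm1 <= math.sqrt(n) * norm2 + 1e-9
```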

5

We will expand and then compare term by term. Squaring both sides, the inequality to prove becomes $$ \sum_{i=1}^{n} \sum_{j=1}^{n} |x_i||x_j| \leq n \sum_{i=1}^{n} x_i^2. $$

Splitting the double sum into diagonal and off-diagonal terms, $$ \sum_{i=1}^{n} \sum_{j=1}^{n} |x_i||x_j| = \sum_{i=1}^{n} x_i^2 + 2\sum_{1\leq i<j\leq n} |x_i||x_j|, $$

so it suffices to prove $$ 2\sum_{1\leq i<j\leq n} |x_i||x_j| \leq (n-1)\sum_{i=1}^{n} x_i^2. $$

But by AM-GM, $2|x_i||x_j| \leq x_i^2 + x_j^2$, hence $$ 2\sum_{1\leq i<j\leq n} |x_i||x_j| \leq \sum_{1\leq i<j\leq n} \left( x_i^2 + x_j^2 \right) = (n-1)\sum_{i=1}^{n} x_i^2, $$ since each index appears in exactly $n-1$ of the pairs. And we are done.
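The two steps above (the expansion of the square and the AM-GM bound on the cross terms) can be spot-checked numerically; this Python sketch uses an arbitrary random vector:

```python
import random

# Arbitrary sample vector.
x = [random.uniform(-5, 5) for _ in range(6)]
n = len(x)

sum_sq = sum(xi * xi for xi in x)
cross = sum(abs(x[i]) * abs(x[j]) for i in range(n) for j in range(i + 1, n))

# Expansion: (sum |x_i|)^2 = sum x_i^2 + 2 * sum_{i<j} |x_i||x_j|
lhs = sum(abs(xi) for xi in x) ** 2
assert abs(lhs - (sum_sq + 2 * cross)) < 1e-6

# AM-GM bound: 2 * sum_{i<j} |x_i||x_j| <= (n-1) * sum x_i^2
assert 2 * cross <= (n - 1) * sum_sq + 1e-6
```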

1

Here's a different approach if you don't know Cauchy-Schwarz.

It suffices to show the inequality in the case where all the coordinates are nonnegative, since neither norm changes when we flip the signs of coordinates. So consider $D = \{ x = (x_1, \dots, x_n) \, | \, x_i \ge 0 , \| (x_1, \dots, x_n) \|_1 = C \}$ for some $C > 0$ and the function $$ f(x_1, \dots, x_n) = n \sum_{i=1}^n x_i^2 - \left( \sum_{i=1}^n x_i \right) \left( \sum_{j=1}^n x_j \right). $$ The set $D$ is compact and $f$ is a polynomial, hence continuous, so $f$ attains a minimum over $D$. We will show this minimum is zero.

Using Lagrange multipliers, one can find where the minimum of $f$ lies. We compute: $$ \frac{\partial f}{\partial x_k} = 2n x_k - 2 \sum_{i=1}^n x_i $$ hence $$ \nabla f(x_1, \dots, x_n) = 2n(x_1, \dots, x_n) - 2 \left( \sum_{i=1}^n x_i, \sum_{i=1}^n x_i, \dots, \sum_{i=1}^n x_i \right). $$ One can write the constraint as $g(x_1, \dots, x_n) = x_1 + \dots + x_n$, so $\nabla g = (1, \dots, 1)$, and Lagrange multipliers give the following condition at an interior minimum: there exists $\lambda \in \mathbb R$ such that $$ \lambda (1,\dots,1) = \lambda \nabla g(x_1, \dots, x_n) = \nabla f(x_1, \dots, x_n) = 2n (x_1, \dots, x_n) - 2 \left( \sum_{i=1}^n x_i, \dots, \sum_{i=1}^n x_i \right). $$ This means all coordinates are equal, i.e. $x_1 = \dots = x_n = \frac{\lambda + 2\sum_{i=1}^n x_i}{2n}$. Plugging equal coordinates into $f$ gives $f(x_1, \dots, x_1) = n \cdot n x_1^2 - (n x_1)^2 = 0$. We know that the minimum of $f$ does not lie on the boundary of $D$, because on the boundary of $D$ one of the coordinates of $x$ is zero, so we can use induction on $n$ (the case $n=1$ is trivial because then $\|x\|_1 = \|x\|_2$ for all $x$). Therefore the minimum of $f$ is zero, hence $f$ is always nonnegative, which gives $$ n \sum_{i=1}^n x_i^2 \ge \left( \sum_{i=1}^n x_i \right) \left( \sum_{j=1}^n x_j \right) \quad \Longrightarrow \quad \sqrt n \| x \|_2 \ge \| x \|_1. $$ Hope that helps!
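The claim that $f(x) = n\sum_i x_i^2 - \left(\sum_i x_i\right)^2$ is nonnegative and vanishes on constant vectors can be spot-checked numerically (a Python sketch; the dimension and sample count are arbitrary choices):

```python
import random

n = 5  # arbitrary dimension

def f(x):
    # f(x) = n * sum x_i^2 - (sum x_i)^2
    return len(x) * sum(xi * xi for xi in x) - sum(x) ** 2

# f is nonnegative on random nonnegative vectors...
for _ in range(1000):
    x = [random.uniform(0, 10) for _ in range(n)]
    assert f(x) >= -1e-6

# ...and vanishes when all coordinates are equal.
assert abs(f([3.0] * n)) < 1e-12
```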

  • 0
    Lagrange multipliers exist provided the extremum is reached **in the interior of the domain**. Hence, to complete this approach, one should show that the extremum is not reached at one or several boundary points. (2012-10-07)
  • 0
    @did : I did, by induction. On the boundary of the domain, one of the coordinates is zero. It's possible you read my answer while I was still editing. Please read again if you are still in doubt. (2012-10-07)
  • 0
    I think I need to go with Schwarz since I know that one :) (2012-10-07)
  • 0
    Indeed you were still editing, sorry about that. You write *We know that the minimum of $f$ does not lie on the boundary of $D$*: sorry but I am not sure I follow the argument you provide in support (especially since this is not true for $n=1$ and you mention an induction over $n$...). (2012-10-07)
  • 0
    For $n=1$ the inequality $\| (x_1) \|_1 \le \| (x_1)\|_2$ holds trivially, so in particular $\| (x_1,0) \|_1 < \sqrt 2 \| (x_1,0) \|_2$ on the boundary of the case $n=2$. So the case $n=2$ works, hence on the boundary of the case $n=3$ you can use a similar trick, and so on. Am I clear enough, or do you wish for me to work it out explicitly?... (I'm still wondering if you're the downvoter, by the way...) (2012-10-07)
  • 1
    This argument looks fine to me; you definitely could include it in your post. (By the way, I am wondering why you are wondering who the downvoter is, since to wonder who is voting what on this site is to engage oneself in a recognizably futile endeavour--and no, I did not downvote, but I resent having been forced to declare this.) (2012-10-07)
  • 0
    @did : I like to know who downvotes because I am ready to accept criticism but not hatred. Downvoting for me is some kind of warning: "respect others and stop posting bad things, otherwise you'll be downvoted" or something. (2012-10-07)