Let $x\in\mathbb{R}^{n}$. Then $x=x_{1}e_{1}+\dots+x_{n}e_{n}$. If $||x||$ is any norm in $\mathbb{R}^{n}$, is it true that $$ \lVert x_{j}e_{j}\rVert\le\left\lVert\sum_{k=1}^{n}x_{k}e_{k}\right\rVert=\lVert x\rVert $$ where $1\le j\le n$?
Inequality involving norms in $\mathbb{R}^{n}$.
-
No. Take the Euclidean norm, $n = 2$, and basis vectors $e_1, e_2$ that are very close to parallel. – 2012-04-02
-
@QiaochuYuan What do you mean? $e_{1},\dots,e_{n}$ are the standard basis in $\mathbb{R}^{n}$. – 2012-04-02
-
If you're willing to take an arbitrary norm, there's no reason not to take an arbitrary basis too (if the change of basis matrix is $B$ you can consider $||Bv||$ instead of $||v||$). So an equivalent way of saying what I said above is to take the Euclidean norm with respect to, say, $e_1$ and $e_1 + 0.0001e_2$. – 2012-04-02
-
@Qiaochu: I don't follow. With the Euclidean norm, $\|a\|\le \|a+b\|$ for any orthogonal $a, b$. – 2012-04-02
-
What if $e_{1},\dots,e_{n}$ must be the standard basis in $\mathbb{R}^{n}$? – 2012-04-02
-
@Qiaochu: I think the question pertains exactly to some orthonormal basis $\{e_i\}$ of $\mathbb{R}^n$ with the norm recovered from the inner product as follows. With $v = \sum_{i}a_ie_i$, $\|v\| = \left(\sum_{i,j}a_i a_j\langle e_i, e_j\rangle\right)^{1/2} = \left(\sum_i a_i^2\right)^{1/2}$. Then the inequality that the OP is proposing is of course true. – 2012-04-02
-
@MathMajor: See my comment above. In particular, notice that $\|v\| = \left(\sum_i a_i^2\right)^{1/2}$. Can you conclude from this your proposed inequality? – 2012-04-02
-
@William: the question does not specify that the norm has to come from an inner product. – 2012-04-02
-
@Qiaochu: though the OP implicitly assumes an orthonormal basis; I think that's what was really meant. I am not disagreeing with you, only saying that I "feel" the OP meant that basic case. So maybe you could edit your answer to include this case. – 2012-04-02
-
@William: I think you are wrong to assume the OP intended for the norm to derive from an inner product. – 2012-04-02
2 Answers
My comment seems to be engendering some confusion, so I'll spell out explicitly what I mean. Pick any real $t \neq 0$ and define a norm by measuring coordinates with respect to the basis $e_1,\ e_1 + t e_2$: $$\lVert y_1 e_1 + y_2 (e_1 + t e_2)\rVert = \left(y_1^2 + y_2^2\right)^{1/2}.$$
Then, since $t e_2 = -e_1 + (e_1 + t e_2)$, we find that $$\lVert t e_2 \rVert = \sqrt{2} > \lVert e_1 + t e_2 \rVert = 1.$$ So for $x = e_1 + t e_2$ (standard coordinates $x_1 = 1$, $x_2 = t$) the proposed inequality fails at $j = 2$.
If you're willing to change norms, there's no reason you shouldn't also be willing to change bases. For any norm $\lVert v\rVert$ and any invertible linear transformation $B : \mathbb{R}^n \to \mathbb{R}^n$, considering the new norm $\lVert Bv\rVert$ (as a numerical function of the coefficients $x_i$ with respect to the standard basis $e_1, \dots, e_n$) is equivalent to working with the old norm but as a function of the coefficients with respect to the new basis $B e_1, B e_2, \dots, B e_n$.
So let's work with the ordinary Euclidean norm on $\mathbb{R}^2$ but use arbitrary basis vectors $f_1, f_2$. Then the inequality $\lVert x_1 f_1\rVert \le \lVert x_1 f_1 + x_2 f_2\rVert$ is clearly false if, say, $x_1 = 1, x_2 = -1$ and $f_1, f_2$ are unit vectors that are close to parallel: then $x_1 f_1 + x_2 f_2 = f_1 - f_2$ is nearly zero while $\lVert x_1 f_1\rVert = 1$. Translating this counterexample into a changed norm instead of a changed basis gives the above.
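A quick numerical sketch of this counterexample (the value $t = 0.5$ and the function names here are my own choices, not from the answer): the new norm of a vector $x$ is the Euclidean norm of $x$'s coordinates in the basis $\{e_1,\ e_1 + t e_2\}$.

```python
import numpy as np

t = 0.5  # any nonzero t works

# Change-of-basis matrix: columns are e1 and e1 + t*e2.
P = np.array([[1.0, 1.0],
              [0.0, t]])

def new_norm(x):
    # New norm of x = Euclidean norm of x's coordinates
    # in the basis {e1, e1 + t*e2}, i.e. ||P^{-1} x||_2.
    return np.linalg.norm(np.linalg.solve(P, x))

x = np.array([1.0, t])     # x = e1 + t*e2, standard coordinates (1, t)
x2e2 = np.array([0.0, t])  # the coordinate projection x_2 e_2 = t*e2

# new_norm(x2e2) = sqrt(2) exceeds new_norm(x) = 1, so the inequality fails.
print(new_norm(x2e2), new_norm(x))
```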
-
This is of course fine. However, speaking in more general terms, the OP is asking this: say we have defined an inner product on $\mathbb{R}^n$, and with respect to this inner product we have an orthonormal basis $\{e_i\}$. Define the Euclidean norm (as in my comment above) based on this inner product. Is the proposed inequality correct? Of course. – 2012-04-02
-
@William: that is not how I interpret the OP's question, which says "any norm" rather than "any norm induced from an inner product" and makes no mention of orthogonality. – 2012-04-02
-
I agree. Please see my comment above. – 2012-04-02
To expand on (what I think is) Qiaochu's answer: for any norm $\lVert \cdot \rVert$ and any injective linear transformation $A$, we can define a new norm
$$\lVert x\rVert_A := \lVert Ax\rVert.$$
I think what Qiaochu is suggesting is taking the standard basis $e_{1}$ and $e_{2}$ for $\mathbb{R}^2$ with the Euclidean norm. Let $A$ be the injective transformation
$$A(e_{1}) = e_{1}$$ $$A(e_{2}) = e_{1} + 0.0001e_{2}$$
Then, considering the norm $\lVert Ax\rVert$, the inequality fails for $x = e_{1} - e_{2}$: we have $\lVert A e_1\rVert = \lVert e_1\rVert = 1$, while $\lVert A(e_1 - e_2)\rVert = \lVert -0.0001 e_2\rVert = 0.0001$, so $\lVert x_1 e_1\rVert_A > \lVert x\rVert_A$.
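This counterexample can also be checked numerically (a sketch; the name `norm_A` and the use of `eps` for the constant $0.0001$ are mine):

```python
import numpy as np

eps = 1e-4
# A e1 = e1, A e2 = e1 + eps*e2, written as a matrix acting on columns.
A = np.array([[1.0, 1.0],
              [0.0, eps]])

def norm_A(x):
    # The new norm ||x||_A = ||A x||_2.
    return np.linalg.norm(A @ x)

x = np.array([1.0, -1.0])  # x = e1 - e2, so x1 = 1

# ||x1*e1||_A = 1 but ||x||_A = eps = 0.0001: the inequality fails.
print(norm_A(np.array([1.0, 0.0])), norm_A(x))
```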