I have a nagging doubt about the least squares regression problem, and I think it comes down to the notation of the norm. Is the least squares formulation $||b - \mathbf{A}x||^2$ or is it $||b - \mathbf{A}x||_2$? Or are they the same?
Where is the square in the least squares regression method?
linear-algebra
regression
norm
2 Answers
Usually it is $||b - \mathbf Ax||_2$, but the problem of minimizing $||b - \mathbf Ax||_2$ is the same as minimizing $||b - \mathbf Ax||_2^2$. The notation $||b - \mathbf Ax||$ is short for $||b - \mathbf Ax||_2$ when it is understood that the norm being used is the $2$-norm.
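As a concrete check, here is a minimal numerical sketch (using NumPy on a small $3 \times 2$ system chosen purely for illustration, not taken from the question) showing that the least squares solution minimizes both $||b - \mathbf Ax||_2$ and $||b - \mathbf Ax||_2^2$:

    import numpy as np

    # Illustrative overdetermined system (assumed values, chosen only for this sketch).
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0, 2.0])

    # np.linalg.lstsq returns the x minimizing ||b - Ax||_2 (equivalently its square).
    x_star, *_ = np.linalg.lstsq(A, b, rcond=None)

    def resid_norm(x):
        return np.linalg.norm(b - A @ x)   # ||b - Ax||_2

    # Perturbing x_star can only increase the residual, under either objective,
    # so the same point minimizes both the norm and the squared norm.
    rng = np.random.default_rng(0)
    for _ in range(5):
        x = x_star + 0.1 * rng.standard_normal(2)
        assert resid_norm(x) >= resid_norm(x_star)
        assert resid_norm(x) ** 2 >= resid_norm(x_star) ** 2

    print("x* =", x_star, " ||b - Ax*||_2 =", resid_norm(x_star))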
- Can you explain or point me to some article which explains why the problem of minimizing $||b - \mathbf Ax||_2$ is the same as minimizing $||b - \mathbf Ax||_2^2$? Will they always result in the same answer? – 2012-08-31
- karthik, do you not see that if you have some function $f$ that takes non-negative real values then minimizing $f$ is the same as minimizing $f^2$? – 2012-08-31
- So I can choose either as a minimizing function and end up with the same answer? – 2012-08-31
- The function $x \mapsto x^2$ is a strictly increasing function when $x \ge 0$. That means if $x, y \ge 0$, then $x \le y$ if and only if $x^2 \le y^2$. – 2012-08-31
Suppose $v=(a,b,c)$. Then $\|v\|$ and $\|v\|_2$ are both common notations for $\sqrt{a^2+b^2+c^2}$. If you minimize $\|v\|_2$ then you also minimize $\|v\|^2$, and vice versa.
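To spell out the step both answers rely on (this is just the monotonicity argument from the comments above restated): for $s, t \ge 0$ we have $s \le t$ if and only if $s^2 \le t^2$, so the two problems have exactly the same minimizers,
$$\operatorname{arg\,min}_x \, \|b - \mathbf Ax\|_2 \;=\; \operatorname{arg\,min}_x \, \|b - \mathbf Ax\|_2^2 .$$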