
I need to find $x$ that minimizes the cost function $\|y-Ax\|_p$ when $p$ is close to $0$, subject to the constraint $\|x\|_2=1$ where $x$ and $y$ are vectors in $\mathbb{R}^n$ and $A$ is an $n\times n$ matrix of full rank such that $A^TA$ is diagonal with strictly positive entries and $\operatorname{Tr}(A^TA)=N$. The matrix $A$ and the vector $y$ are both known. The elements of $A$ are in $\mathbb{R}$. Do note that $A$ itself is not diagonal. Specifically (if it helps), $A=\begin{pmatrix} B& C\\ -C& B \end{pmatrix}$ where $B$ and $C$ are of size $n/2 \times n/2$ and are both symmetric matrices.

The problem is non-convex. Any thoughts on solving the problem with $A$ as an identity or diagonal matrix are more than welcome as well.

The notation $\|z\|_p$ denotes the $L_p$ cost $\|z\|_p=\sum_i|z_i|^p$, where $z_i$ are the elements of $z$. For $p<1$, $\|\cdot\|_p$ is not a norm in the strict sense of the word: with this definition it fails absolute homogeneity (since $\|cz\|_p=|c|^p\|z\|_p$), and the conventional definition with the $p$-th root fails the triangle inequality.
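A quick numerical check of this definition (illustrative only; the function name `lp_p` is my own):

```python
import numpy as np

def lp_p(z, p):
    """The cost defined above: ||z||_p = sum_i |z_i|^p (no p-th root)."""
    return np.sum(np.abs(z) ** p)

z = np.array([1.0, -2.0, 3.0])
p = 0.5
# Scaling behaves as ||c z||_p = |c|^p ||z||_p, not |c| ||z||_p:
assert np.isclose(lp_p(2 * z, p), 2 ** p * lp_p(z, p))
```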

2 Answers


I am not sure if we can comment about the limiting behaviour. But we can certainly say something about what happens at the limit $p=0$.

At $p=0$, you are trying to minimize the number of non-zero entries of the vector $e=y-Ax$. Thus the objective function takes values in the set $\{0,1,2,\dots,n\}$. The ideal minimum is zero, i.e. when $y=Ax$, and the maximum is $n$, which happens when none of the entries of $e$ is zero, i.e. $y_i\neq [Ax]_i$ for all $i$ (the $i$-th entry). I will try to approach it case by case.
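The counting objective is easy to evaluate directly; a toy check with $A=I$ and a unit-norm $y$ (the data below are illustrative):

```python
import numpy as np

# Toy check of the l_0 objective with A = I: if y itself has unit
# norm, x = y attains the ideal minimum ||e||_0 = 0.
A = np.eye(3)
y = np.array([0.6, 0.8, 0.0])          # ||y||_2 = 1
x = y.copy()
e = y - A @ x
assert np.count_nonzero(e) == 0        # ideal minimum

# Any other feasible x does worse here, e.g. x' = (0, 1, 0):
assert np.count_nonzero(y - A @ np.array([0.0, 1.0, 0.0])) == 2
```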

Consider the case $A=I$. At the limit $p=0$, you are trying to minimize the $\ell_0$ norm (i.e. the number of non-zero entries) of the difference vector $e=y-x$, subject to the additional constraint $\|x\|_2^2=1$. If $y$ has unit norm, then $x=y$ is the optimal solution. If it does not, you should make $e$ zero in as many entries as possible, i.e. match $x$ to $y$ on as many coordinates as possible. The obvious solution (though a rigorous proof escapes me) is to match $x$ to $y$ on the set $S$ of smallest entries (in absolute value) of $y$ whose squared sum is at most $1$, and to place the leftover norm $\sqrt{1-\sum_{i\in S}y_i^2}$ in a single remaining coordinate so that $\|x\|_2=1$ holds exactly. This zeroes out the maximum number of entries of $e$.
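The greedy rule above can be sketched in code (a heuristic sketch only, assuming the greedy choice is indeed optimal; the name `min_l0_identity` is my own):

```python
import numpy as np

def min_l0_identity(y):
    """Greedy sketch for: min ||y - x||_0  s.t.  ||x||_2 = 1, with A = I.
    Match x to y on the smallest-magnitude entries whose squared sum
    stays <= 1, then absorb the leftover norm in one extra coordinate."""
    y = np.asarray(y, dtype=float)
    n = y.size
    if np.isclose(np.linalg.norm(y), 1.0):
        return y.copy()                      # x = y gives ||e||_0 = 0
    order = np.argsort(np.abs(y))            # smallest entries first
    csum = np.cumsum(y[order] ** 2)
    k = int(np.searchsorted(csum, 1.0, side="right"))
    k = min(k, n - 1)                        # keep one coordinate free
    x = np.zeros(n)
    S = order[:k]
    x[S] = y[S]                              # e_i = 0 on S
    slack = 1.0 - np.sum(y[S] ** 2)          # leftover squared norm
    x[order[k]] = np.sqrt(slack)             # one extra non-zero of e
    return x
```

For example, `min_l0_identity([2.0, 0.3, 0.4])` matches $y$ on the two small entries and leaves a single non-zero residual in the first coordinate.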

Consider the case $A=D$ where $D$ is some diagonal matrix, and assume all diagonal entries of $D$ are non-zero. Define $z=Dx$; you now have to zero out the entries of the vector $e=y-z$. Here comes an important observation: for any vector $r$, the $\ell_0$ norm of $r$ is the same as that of $Dr$. So $\|y-Dx\|_0=\|D(D^{-1}y-x)\|_0=\|D^{-1}y-x\|_0$. Since $D^{-1}y$ is a known vector, you can now follow the previous method for the identity-matrix case.
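A small numerical sanity check of the invariance $\|y-Dx\|_0=\|D^{-1}y-x\|_0$ on toy data (all values below are arbitrary):

```python
import numpy as np

# Check ||y - Dx||_0 == ||D^{-1}y - x||_0 for a nonsingular diagonal D.
rng = np.random.default_rng(0)
n = 6
d = rng.uniform(0.5, 2.0, n)      # diagonal of D, all entries nonzero
x = rng.normal(size=n)
y = d * x                         # start from an exact fit ...
y[[1, 4]] += 1.0                  # ... then perturb two residual entries

lhs = np.count_nonzero(y - d * x)     # ||y - Dx||_0
rhs = np.count_nonzero(y / d - x)     # ||D^{-1}y - x||_0
assert lhs == rhs == 2
```

The equality holds because $D$ is invertible, so it maps non-zero entries to non-zero entries position by position.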

No need to read past this point; it is still in development :). I just scribbled my thoughts here.

Consider the case where $A$ is such that $A^TA$ is diagonal. Observe that $A$ must then be of the form $A=UD$, where $U$ is an orthogonal matrix and $D$ is a diagonal matrix. Define $v=U^Ty-Dx$ and observe that $\|y-UDx\|_0=\|U(U^Ty-Dx)\|_0=\|Uv\|_0$. When is this minimized (forgetting the constraints for a moment)? One possibility is $Uv=0$, which would give the minimum value $0$; but since $U$ is orthogonal, $Uv=0$ forces $v=0$. What is the next best answer: can $Uv$ be such that only one of its entries is non-zero? One interpretation of $U$ is that, being orthogonal, it acts as a rotation and preserves the norm.
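The claimed factorization $A=UD$ can be checked numerically: taking $D=(A^TA)^{1/2}$ (diagonal with positive entries, as in the question) and $U=AD^{-1}$, the factor $U$ comes out orthogonal. A toy verification (the construction of $A$ below is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# Build an A with A^T A diagonal: A = Q diag(s), Q orthogonal, s > 0.
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
s = rng.uniform(0.5, 2.0, n)
A = Q @ np.diag(s)

# Recover the factors from A alone: D = (A^T A)^{1/2}, U = A D^{-1}.
D = np.sqrt(np.diag(A.T @ A))          # diagonal entries of D
U = A / D                              # divide each column j by D[j]
assert np.allclose(U.T @ U, np.eye(n)) # U is orthogonal
assert np.allclose(A, U @ np.diag(D))  # A = U D
```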

  • I didn't get your comment. But I think it has been observed in the literature that as $p$ tends to zero, sparser and sparser solutions are promoted. (2012-12-19)

The paper by Rick Chartrand (2012), "Nonconvex splitting for regularized low-rank + sparse decomposition", seems to me somehow related to your problem. Some thoughts/guesses: even though your problem is non-convex, it might be solvable iteratively as a sequence of convex problems via splitting methods (as described, e.g., in 'Proximal Splitting Methods in Signal Processing' by Combettes and Pesquet). You might also prefer to transform the hard constraint $\|x\|_2 = 1$ into a penalty term $+\,\lambda\,(\|x\|_2 - 1)^2$.
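As a rough illustration of the smoothing idea (not Chartrand's splitting method), one could run projected gradient descent on a smoothed surrogate $\sum_i(e_i^2+\epsilon)^{p/2}$, renormalizing $x$ onto the unit sphere after every step. This is a heuristic sketch; the function name and all parameter values below are my own arbitrary choices:

```python
import numpy as np

def smoothed_lp_descent(A, y, p=0.5, eps=1e-3, lr=0.01, iters=3000, seed=0):
    """Projected gradient on the smoothed surrogate
        f(x) = sum_i (e_i^2 + eps)^(p/2),   e = y - A x,
    with x renormalized to the unit sphere after each step.
    A heuristic sketch only, not a method with guarantees."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=A.shape[1])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        e = y - A @ x
        # gradient of f wrt x: -A^T [ p * e * (e^2 + eps)^(p/2 - 1) ]
        g = -A.T @ (p * e * (e * e + eps) ** (p / 2 - 1.0))
        x = x - lr * g
        x /= np.linalg.norm(x)          # project back onto the sphere
    return x
```

With $A=I$ and a unit-norm $y$, the iterates should drift toward $x\approx y$, which is the exact optimum in that case; for general $A$ one can only hope for a good local solution.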

  • Thank you Richard, I will look into it! (2012-12-24)