
Let $A$ be a real symmetric positive-semidefinite matrix and suppose that $c>0$ is a sufficiently small number. I wonder if it is possible to solve the non-convex optimization $$\arg\max_u\ u^\mathrm{T}Au\\ \mathrm{subject\ to\ }\left\Vert u\right\Vert_2\leq 1\\ \quad\quad\quad\quad\left\Vert u\right\Vert_1\leq c,$$ efficiently.

In trying to solve the optimization, I couldn't get farther than writing the KKT conditions, which do not help much in determining the multipliers.

Given that without the $\ell_1$-norm constraint (i.e. $c\to\infty$) the problem reduces to finding the principal eigenvector of $A$, which can be computed efficiently (e.g., by the power iteration method), we can think of the solution to the optimization above as a "pseudo-eigenvector".
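As a baseline for the unconstrained case ($c\to\infty$), here is a minimal power-iteration sketch in NumPy; the example matrix, seed, and tolerances are illustrative assumptions, not part of the question:

```python
import numpy as np

def power_iteration(A, num_iters=1000, tol=1e-12):
    """Approximate the principal eigenvector of a symmetric PSD matrix A."""
    rng = np.random.default_rng(0)
    u = rng.standard_normal(A.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(num_iters):
        v = A @ u
        v /= np.linalg.norm(v)
        # For PSD A the iterates settle to a fixed sign pattern,
        # so a plain difference test suffices as a stopping rule.
        if np.linalg.norm(v - u) < tol:
            u = v
            break
        u = v
    return u

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
u = power_iteration(A)
# u maximizes u^T A u over the unit ball, attaining the top eigenvalue of A
```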

  • Don't know if this helps, but this sort of combination of the $\ell^2$ and $\ell^1$ norms is known in statistics as the lasso, so you can google for that and see if some of the algorithms developed for the lasso can be applied. (2012-10-05)
  • @kjetilbhalvorsen: I'm aware of the LASSO and the related subjects in statistics, but I don't think they really apply here, as randomness is central to their arguments, while my problem is completely deterministic. Namely, the matrix $A$ is not drawn from some random ensemble of matrices. (2012-10-05)
  • ... but their algorithms could still apply ... the algorithms themselves do not care about randomness; they only see the data, not the model. (2012-10-05)
  • @kjetilbhalvorsen: It's true that their algorithms may apply, but we cannot guarantee they actually produce an accurate solution to the original problem; the randomness is what allows them to provide accuracy guarantees. Regardless of this issue, the problem I have is a **maximization** of a convex quadratic, which is a non-convex optimization problem, whereas the LASSO is a convex optimization formulation. (2012-10-06)
  • If $c$ is "sufficiently small" (i.e. smaller than $1$), then you can just drop the $\lVert u\rVert_2\le1$ constraint, can't you (since $\lVert u\rVert_2\le\lVert u\rVert_1\le c<1$)? Then the problem reduces to quadratic programming, and you can maybe even just iterate over the (explicitly known) facets of the feasible polytope. (2013-01-27)
  • @ℝⁿ: The problem is that I'm interested in problems where $c>1$. In fact, the $\ell_1$-norm constraint is a convex relaxation of the sparsity-inducing $\ell_0$ constraint: for $\left\Vert u\right\Vert_0\leq s$ we should choose $c=\sqrt{s}\geq 1$ in the $\ell_1$-constrained form, since by Cauchy–Schwarz an $s$-sparse unit vector has $\ell_1$ norm at most $\sqrt{s}$. (2013-01-29)
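For what it's worth, one common heuristic for this kind of $\ell_1$-constrained eigenvector problem (in the spirit of Witten, Tibshirani and Hastie's penalized matrix decomposition) alternates a power step with soft-thresholding, bisecting the threshold until the renormalized iterate satisfies $\|u\|_1\le c$. Below is a sketch under the assumption $c\ge 1$; the function names and parameters are illustrative, and it comes with no global-optimality guarantee:

```python
import numpy as np

def l1_ball_renormalize(v, c, bisect_iters=60):
    """Return u with ||u||_2 = 1 and ||u||_1 <= c by soft-thresholding v
    and renormalizing; the threshold is found by bisection.
    Assumes c >= 1, so that a feasible threshold exists."""
    u = v / np.linalg.norm(v)
    if np.linalg.norm(u, 1) <= c:
        return u
    lo, hi = 0.0, np.max(np.abs(v))
    best = u  # placeholder; overwritten at the first feasible threshold
    for _ in range(bisect_iters):
        t = 0.5 * (lo + hi)
        w = np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # soft-threshold
        nrm = np.linalg.norm(w)
        if nrm > 0 and np.linalg.norm(w / nrm, 1) <= c:
            best, hi = w / nrm, t  # feasible: try a smaller threshold
        else:
            lo = t                 # infeasible: shrink more aggressively
    return best

def sparse_power_iteration(A, c, num_iters=300, seed=0):
    """Heuristic for  max u^T A u  s.t. ||u||_2 <= 1, ||u||_1 <= c:
    a power step followed by the thresholded renormalization above.
    The problem is nonconvex, so this gives only a local/heuristic answer."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(A.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(num_iters):
        u = l1_ball_renormalize(A @ u, c)
    return u

A = np.diag([3.0, 2.0, 1.0])
u = sparse_power_iteration(A, c=1.3)
# u concentrates on the leading coordinate while keeping ||u||_1 <= 1.3
```

Since the iteration can converge to a sparse vector far from the global optimum, random restarts (or a continuation scheme in $c$) matter in practice.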

1 Answer