
I want to efficiently solve the following optimization problem: \begin{align} \min &\quad \left\|\mathbf{x}-\mathbf{x}_0\right\|_2^2 + \lambda\left\|\mathbf{x}\right\|_1\\ \text{Subject to}& \quad A\mathbf{x} \leq c, \end{align}

where $\mathbf{x}\in \mathbb{R}^{n\times 1}$, for large values of $n$. I tried coordinate descent, but it doesn't work. I also don't want to formulate it as a QP and use interior-point methods, because they are slow. Any ideas?

  • @Taha, when you say "efficiently", what do you mean? I can write MATLAB code to solve it, yet at one step I will need MATLAB's [`quadprog()`](https://www.mathworks.com/help/optim/ug/quadprog.html). Is that OK? (2017-08-23)

3 Answers

2

"The Constrained LASSO" by Gareth M James, Courtney Paulson, and Paat Rusmevichientong which is under review and linked below

http://www-bcf.usc.edu/~gareth/research/CLassoFinal.pdf

seems to have solved this specific problem.

1

Why not use augmented Lagrangian methods such as ADMM? The linear inequality constraints can be appended to your objective as an indicator function. In the subproblem you only have to perform a simple projection (after the splitting $\mathbf{z} = A\mathbf{x}$, it is an elementwise projection onto $\{\mathbf{z} \leq c\}$).
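A minimal sketch of one such ADMM splitting in Python/NumPy, assuming $\mathbf{z}_1 = \mathbf{x}$ (carrying the $\ell_1$ term) and $\mathbf{z}_2 = A\mathbf{x}$ (carrying the indicator of $\{\mathbf{z}_2 \leq c\}$); the function name, the fixed penalty parameter `rho`, the iteration count, and the absence of a stopping test are illustrative assumptions, not tuned choices:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def soft_threshold(v, kappa):
    """Elementwise soft-thresholding, the prox of kappa*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def constrained_lasso_admm(A, c, x0, lam, rho=1.0, n_iter=500):
    """Sketch of ADMM for  min ||x - x0||_2^2 + lam*||x||_1  s.t.  A x <= c.

    Splitting: z1 = x (l1 term), z2 = A x (indicator of {z2 <= c}).
    rho and n_iter are untuned illustrative defaults.
    """
    m, n = A.shape
    # x-update solves (2 I + rho I + rho A^T A) x = 2 x0 + rho (z1 - u1) + rho A^T (z2 - u2);
    # the matrix is fixed across iterations, so factor it once.
    H = (2.0 + rho) * np.eye(n) + rho * A.T @ A
    chol = cho_factor(H)

    z1, z2 = x0.copy(), A @ x0
    u1, u2 = np.zeros(n), np.zeros(m)

    for _ in range(n_iter):
        rhs = 2.0 * x0 + rho * (z1 - u1) + rho * A.T @ (z2 - u2)
        x = cho_solve(chol, rhs)
        Ax = A @ x
        z1 = soft_threshold(x + u1, lam / rho)   # prox of (lam/rho)*||.||_1
        z2 = np.minimum(Ax + u2, c)              # projection onto {z2 <= c}
        u1 += x - z1                             # scaled dual updates
        u2 += Ax - z2
    return x
```

Per iteration this costs one solve with the pre-factorized matrix $(2+\rho)I + \rho A^\top A$ plus elementwise operations, which is why it can scale better than a generic interior-point QP solver for large $n$; in practice you would also add primal/dual residual stopping criteria.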

You could also introduce slack variables to turn the inequality into an equality, and then append the constraint to your objective as a soft term, which you can enforce via the penalty method.
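For concreteness, a hedged sketch of that reformulation (the slack vector $\mathbf{s}$ and the penalty weight $\mu > 0$ are my notation, not the answer's) is
\begin{align}
\min_{\mathbf{x},\;\mathbf{s}\geq 0} &\quad \left\|\mathbf{x}-\mathbf{x}_0\right\|_2^2 + \lambda\left\|\mathbf{x}\right\|_1 + \mu\left\|A\mathbf{x} + \mathbf{s} - c\right\|_2^2,
\end{align}
which enforces $A\mathbf{x} \leq c$ increasingly strictly as $\mu \to \infty$, under the usual penalty-method caveats.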

  • Yep, it seems ADMM would solve this. But I think you will be left with a non-negative least squares problem. (2017-08-23)
0

The paper linked below develops a constrained LASSO for portfolio optimisation:

Sparse and stable Markowitz portfolios (2009). Proceedings of the National Academy of Sciences, Vol. 106, No. 30, pages 12267-12272.

http://www.pnas.org/content/106/30/12267.full