
I've written a C# solver for linear least squares problems with inequality constraints. That is, given $A$, $b$, $G$, $h$

$\min\|Ax-b\|^2\text{ s.t. }Gx\ge h$

I have a few hand crafted test problems that my solver gives the correct answer for, and now I'd like to throw a gauntlet of randomly generated problems of various ranks at it to make sure there aren't any edge cases I'm missing.

So what I need is a way to determine that a given $b$ vector calculated satisfies the constraints $Gx \ge h$ (which is easy to check for) and that the solution vector can't be improved by perturbing it in a given non-constrained direction. The second part is what I'm at a loss for.
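The "can't be improved by a feasible perturbation" test can be done numerically by sampling small random directions and checking that no feasible step decreases the objective. A rough sketch in Python/NumPy (the function name and tolerances are my own choices, and this is a heuristic check, not a proof of optimality):

```python
import numpy as np

def is_locally_optimal(A, b, G, h, x, n_trials=200, eps=1e-6, tol=1e-9, seed=0):
    """Heuristic check: try small random perturbations of x and verify that
    no perturbation that stays feasible (G x >= h) lowers ||A x - b||^2."""
    rng = np.random.default_rng(seed)
    f0 = np.sum((A @ x - b) ** 2)
    for _ in range(n_trials):
        d = rng.standard_normal(x.shape)
        d *= eps / np.linalg.norm(d)           # step of length eps
        xp = x + d
        if np.all(G @ xp >= h - tol):          # perturbation stays feasible
            if np.sum((A @ xp - b) ** 2) < f0 - tol:
                return False                   # found a strictly better point
    return True
```

Since the problem is convex, a point that passes such a test for a dense enough set of directions is (numerically) a global minimizer; the KKT conditions in the answers below give an exact version of the same check.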

Comment (2017-09-04): Note to myself - add full solution.

2 Answers


As written above, the nice thing about the KKT conditions is that they allow you to check any solution as if it were a black box, without a reference solver, just by validating its $ \lambda $ against the KKT conditions.

Yet, while there are many solvers for this, I'm adding another approach: Projected Gradient Descent.

The Problem Statement

$ \begin{align*} \arg \min_{x} & \quad & \frac{1}{2} \left\| A x - b \right\|_{2}^{2} \\ \text{subject to} & \quad & C x \leq d \end{align*} $

The Solution

Now, if we had a closed form projection onto the set defined by the linear inequalities (a convex polytope / convex polyhedron), which is a linear inequality constrained Least Squares problem by itself, using Projected Gradient Descent would be easy:

  1. Gradient Descent Step.
  2. Project Solution onto the Inequality Set.
  3. Unless converged go to (1).

Yet, another way is to use Alternating Projections.
Since the set:

$ \mathcal{S} = \left\{ x \mid C x \leq d \right\} $

Can be decomposed into:

$ \mathcal{S} = \cap_{i = 1}^{l} {\mathcal{S}}_{i} = \cap_{i = 1}^{l} \left\{ x \mid {c}_{i}^{T} x \leq {d}_{i} \right\} $

Where $ {c}_{i} $ is the $ i $ -th row of $ C $.

Projection onto Half Space

In the above, each $ \mathcal{S}_{i} $ is a half space, for which the projection is known in closed form:

$ \begin{align*} \arg \min_{x} & \quad & \frac{1}{2} \left\| x - y \right\|_{2}^{2} \\ \text{subject to} & \quad & {c}^{T} x \leq d \end{align*} $

The solution is given by:

$ \hat{x} = y - \lambda c, \quad \lambda = \max \left\{ \frac{ {c}^{T} y - d }{ \left\| c \right\|_{2}^{2} }, 0 \right\} $
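The closed form above is a one-liner; here is a sketch in Python/NumPy (the function name is my own):

```python
import numpy as np

def project_halfspace(y, c, d):
    """Closed-form projection of y onto the half space {x : c^T x <= d}:
    x_hat = y - lambda * c, lambda = max((c^T y - d) / ||c||^2, 0)."""
    lam = max((c @ y - d) / (c @ c), 0.0)
    return y - lam * c
```

Note that if $ y $ already satisfies the constraint, $ \lambda = 0 $ and the point is returned unchanged.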

Now the solution becomes:

  1. Gradient Descent Step.
  2. Project Solution onto the Inequality Set:

    • Project onto the 1st Half Space.
    • Project onto the 2nd Half Space.
    • ...
    • Project onto the $ l $-th Half Space.
  3. Unless converged go to (1).

The code for the Gradient Descent is given by:

mAA = mA.' * mA;
vAb = mA.' * vB;

vX         = zeros([numCols, 1]);
vObjVal(1) = hObjFun(vX);

for ii = 2:numIterations
    vG = (mAA * vX) - vAb; % Gradient of 0.5 * || mA * vX - vB ||^2
    vX = vX - (stepSize * vG); % Gradient Descent step
    vX = ProjectOntoLinearInequality(vX, mC, vD, stopThr); % Projection step
    vObjVal(ii) = hObjFun(vX);
end

The Projection (Alternating Projection):

function [ vX ] = ProjectOntoLinearInequality( vY, mC, vD, stopThr )

numConst = size(mC, 1);
vCNorm   = sum(mC .^ 2, 2);

vX     = vY;
vRes   = (mC * vX) - vD;
maxRes = max(vRes);

while(maxRes > stopThr)
    for ii = 1:numConst
        paramLambda = max(((mC(ii, :) * vX) - vD(ii)) / vCNorm(ii), 0);
        vX = vX - (paramLambda * mC(ii, :).');
    end
    vRes   = (mC * vX) - vD;
    maxRes = max(vRes);
end

end
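For readers without MATLAB, here is a rough Python/NumPy sketch of the same scheme (the function names and the $ 1 / \left\| A \right\|_{2}^{2} $ step size are my own choices, not taken from the original code):

```python
import numpy as np

def project_polyhedron(y, C, d, stop_thr=1e-8, max_sweeps=1000):
    """Approximate projection onto {x : C x <= d} by cyclically projecting
    onto each half space in turn (alternating projections)."""
    x = y.copy()
    norms = np.sum(C ** 2, axis=1)             # ||c_i||^2 for each row
    for _ in range(max_sweeps):
        for i in range(C.shape[0]):
            lam = max((C[i] @ x - d[i]) / norms[i], 0.0)
            x -= lam * C[i]
        if np.max(C @ x - d) <= stop_thr:      # all constraints (nearly) met
            break
    return x

def pgd_lsq(A, b, C, d, n_iter=500):
    """Projected Gradient Descent for min 0.5 ||A x - b||^2 s.t. C x <= d."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L, L = Lipschitz constant
    AtA, Atb = A.T @ A, A.T @ b
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - step * (AtA @ x - Atb)         # gradient step
        x = project_polyhedron(x, C, d)        # projection step
    return x
```

Note the alternating-projections step only converges to a point in the intersection, not necessarily to the exact Euclidean projection, so this is a pragmatic variant rather than exact Projected Gradient Descent.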

The result:

[Figure omitted: plot of the result.]

The full code (Including validation against CVX) is available on my StackExchange Math Q73712 GitHub Repository.


You want the KKT conditions of the problem; since it is convex, a given $x$ is a minimizer if and only if there exist Lagrange multipliers $\lambda$ such that $\begin{align*}A^TAx - A^T b - G^T\lambda &= 0\\ Gx-h &\geq 0\\ \lambda &\geq 0\\ \lambda^T (Gx-h) &= 0.\end{align*}$

I'm not completely sure of the best way of finding a $\lambda$ certifying the above or of proving one doesn't exist; you can start by using the last equation to split the constraints into an active set ($G_ax-h_a=0$, $\lambda_a \geq 0$) and inactive set $(G_i x - h_i > 0, \lambda_i = 0)$, and deleting the inactive constraints from the above equations. If $A^TA$ is invertible you can then directly solve for $\lambda_a$ and check if all entries are nonnegative. If $A^TA$ is singular I'm not sure how best to proceed; maybe another answer will elaborate.
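The active-set recipe above can be sketched in Python/NumPy. A least squares solve for $\lambda$ on the active constraints sidesteps the invertibility question in the simple cases; the function name and tolerances are my own, and this is a sketch, not a complete certificate when $A^TA$ is badly conditioned:

```python
import numpy as np

def kkt_certificate(A, b, G, h, x, tol=1e-8):
    """Try to certify optimality of x for min ||Ax-b||^2 s.t. Gx >= h
    by solving for multipliers on the active constraints."""
    res = G @ x - h
    if np.min(res) < -tol:
        return False                           # x is infeasible
    active = res <= tol                        # split into active/inactive set
    grad = A.T @ (A @ x - b)                   # gradient of 0.5 ||Ax - b||^2
    if not np.any(active):
        return np.linalg.norm(grad) <= tol     # interior point: gradient vanishes
    Ga = G[active]
    # Stationarity: G_a^T lambda_a = A^T A x - A^T b; solve in least squares sense
    lam, *_ = np.linalg.lstsq(Ga.T, grad, rcond=None)
    stationary = np.linalg.norm(Ga.T @ lam - grad) <= tol
    return stationary and np.all(lam >= -tol)  # dual feasibility
```

If the least squares residual of the stationarity equation is nonzero, or some entry of $\lambda_a$ is negative, the candidate $x$ fails the certificate.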

Comment (2011-10-29): What method are you using to solve the problem? As for testing for optimality, the KKT conditions are the way to go.