Consider the equation $\mathbf{v} = \mathbf{v}_0 + A\mathbf{x}$, where $A$ is an $m\times n$ matrix ($m > n$) whose entries are restricted to $-1$, $0$, and $1$. Moreover, each row of $A$ contains exactly one $1$ and exactly one $-1$, so every row sums to zero.
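For concreteness, one hypothetical $4\times 3$ matrix of this form is
$$A=\begin{pmatrix}1&-1&0\\0&1&-1\\1&0&-1\\-1&1&0\end{pmatrix},$$
where every row has exactly one $+1$ and one $-1$.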
The starting point is $\mathbf{x}_0 = \mathbf{0}$, so initially $\mathbf{v} = \mathbf{v}_0$.
The problem is to find an $\mathbf{x}$ that minimizes the $L_\infty$ norm of $\mathbf{v}$, i.e., $\|\mathbf{v}_0 + A\mathbf{x}\|_\infty = \max_i |(\mathbf{v}_0 + A\mathbf{x})_i|$.
What method should I use? Thanks.
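For reference, one standard way to pose an $L_\infty$ minimization is as a linear program: introduce a scalar $t$ and minimize $t$ subject to $-t \le (\mathbf{v}_0 + A\mathbf{x})_i \le t$ for all $i$. Below is a minimal sketch of that reformulation in Python using `scipy.optimize.linprog`; the small $A$ and $\mathbf{v}_0$ are hypothetical test data, not part of the original problem.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical small instance: each row of A has exactly one +1 and
# one -1 (so every row sums to zero), and m > n.
A = np.array([[ 1, -1,  0],
              [ 0,  1, -1],
              [ 1,  0, -1],
              [-1,  1,  0]])
v0 = np.array([2.0, -1.0, 0.5, 3.0])
m, n = A.shape

# Reformulate  min_x ||v0 + A x||_inf  as a linear program over
# z = [x, t]:  minimize t  subject to  -t <= (v0 + A x)_i <= t.
c = np.zeros(n + 1)
c[-1] = 1.0  # objective: minimize t

ones = np.ones((m, 1))
A_ub = np.vstack([np.hstack([ A, -ones]),    #  A x - t <= -v0
                  np.hstack([-A, -ones])])   # -A x - t <=  v0
b_ub = np.concatenate([-v0, v0])

bounds = [(None, None)] * n + [(0, None)]    # x free, t >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

x_opt = res.x[:n]
print("x* =", x_opt, " ||v||_inf =", res.x[-1])
```

The optimal $t$ returned by the solver equals the minimized $\|\mathbf{v}\|_\infty$.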