
I'm looking for a general method to find at least one solution to the system below

$a_0 = a_1x_1 + \cdots + a_nx_n$

$b_0 = b_1x_1 + \cdots + b_nx_n$

$c_0 = c_1x_1 + \cdots + c_nx_n$

$d_0 = d_1x_1 + \cdots + d_nx_n$

$x_1 + \cdots + x_n = 1$

$0 \le x_1, \dots, x_n \le 1$

where $a_i, b_i, c_i, d_i$ are known real numbers, the $x_i$ are unknown, and $n > 20$.

Thank you in advance.

  • The most general method would be to use the [Moore-Penrose inverse](http://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_pseudoinverse) (which can be computed with a singular value decomposition; see the linked article). Say $A$ is the $5\times n$ matrix formed by the row vectors $\vec{a},\vec{b},\vec{c},\vec{d},\vec{1}$. The least-squares solution is then $\vec{x}=A^+ (a_0,b_0,c_0,d_0,1)^T$, and you'd have to check the inequalities $\vec{x}\in[0,1]^n$ afterwards. I'm not sure if this reasoning is fleshed out enough for OP to be posted as an answer as-is. – anon, 2011-08-10
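A minimal sketch of the pseudoinverse approach described in the comment above, assuming NumPy; the matrix `A` and right-hand side `rhs` below are placeholder data invented for illustration:

```python
import numpy as np

# Placeholder problem: 5 equations, n = 25 unknowns.
# Rows of A are the coefficient vectors a, b, c, d and the all-ones row.
n = 25
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(size=(4, n)), np.ones((1, n))])
rhs = np.array([0.3, -1.2, 0.5, 0.4, 1.0])   # (a_0, b_0, c_0, d_0, 1)

# Minimum-norm least-squares solution via the Moore-Penrose pseudoinverse.
x = np.linalg.pinv(A) @ rhs

# The pseudoinverse does not enforce the box constraints,
# so check 0 <= x_i <= 1 afterwards, as the comment suggests.
print(np.allclose(A @ x, rhs), np.all((x >= 0) & (x <= 1)))
```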

1 Answer


You have, of course, an underdetermined system; however, you did not mention whether you'd like a least-squares solution or a solution minimizing some other norm. anon's suggestion of using the SVD is a good one, but you might be able to get away with the cheaper QR decomposition if the underlying matrix is "well-conditioned" enough, and reserve the SVD for the cases where QR doesn't cut it.
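As a sketch of the QR route (my own assumption about how one would implement it, not spelled out in the answer): for a full-row-rank $5\times n$ matrix, the minimum 2-norm solution follows from a reduced QR factorization of $A^T$.

```python
import numpy as np

def min_norm_qr(A, b):
    """Minimum 2-norm solution of the underdetermined system A x = b,
    assuming A has full row rank, via a reduced QR factorization of A^T."""
    Q, R = np.linalg.qr(A.T)       # A^T = Q R, hence A = R^T Q^T
    y = np.linalg.solve(R.T, b)    # solve the small system R^T y = b (R^T is lower triangular)
    return Q @ y                   # x = Q y lies in range(A^T): the minimum-norm solution

# Placeholder data, as in the sketch under the comment above.
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(size=(4, 25)), np.ones((1, 25))])
b = np.array([0.3, -1.2, 0.5, 0.4, 1.0])
x = min_norm_qr(A, b)
print(np.allclose(A @ x, b))       # equalities hold; the bounds still need a separate check
```

As a rough heuristic, the spread of the diagonal entries of $R$ gives some indication of whether the cheaper QR route is safe or whether falling back to the SVD is warranted.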

Alternatively, if you are looking for solutions that are optimal in either the $\|\cdot\|_1$ (Manhattan) or the $\|\cdot\|_\infty$ (Chebyshev) norm, there are methods based on linear programming for finding them. (Now that I think about it, the constraints you placed on your variables do make this look like an LP problem.) You might want to look at this book for instance.
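Since the question only asks for at least one solution, the LP route can be reduced to a pure feasibility problem. A minimal sketch using SciPy's `linprog` with a zero objective; the data below are placeholders, constructed so that a feasible point is known to exist:

```python
import numpy as np
from scipy.optimize import linprog

# Placeholder data: rows a, b, c, d plus the all-ones row enforcing sum(x) = 1.
n = 25
rng = np.random.default_rng(0)
A_eq = np.vstack([rng.normal(size=(4, n)), np.ones((1, n))])
x_true = rng.dirichlet(np.ones(n))   # a known point in the simplex ...
b_eq = A_eq @ x_true                 # ... so this toy system is feasible by construction

# Zero objective: we only ask the solver for *some* x with
# A_eq x = b_eq and 0 <= x_i <= 1.
res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * n, method="highs")
print(res.x if res.success else "no x in [0,1]^n satisfies the equalities")
```

Replacing the zero objective with a suitable cost (auxiliary variables for the absolute deviations in the $\|\cdot\|_1$ case, or a single bound variable in the $\|\cdot\|_\infty$ case) gives the norm-minimizing LP formulations alluded to above.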