Given a matrix $A$, I want to find a vector $\vec{x}$ such that every element of $A\vec{x}$ is strictly positive. The columns of $A$ do not span the full space, so if I naively picked some $\vec{y}$ with all positive entries, I could not in general solve $A\vec{x} = \vec{y}$. Is there a method guaranteed to find a feasible point if one exists? Bonus points if it's matrix-free, so that I only have to evaluate matrix-vector products, with no pseudoinverses or factorizations. I suspect this can be set up as a convex program, but I'm not sure how.
Find a vector such that its matrix product is positive in every element
linear-algebra
convex-analysis
convex-optimization
-
Where does $A$ come from? Does it have any particular structure? – 2012-07-25
-
It's a system identification problem. Really $\vec{x}$ is a matrix $X$, which I right-multiply by another matrix $U$, and then apply the adjoint of the linear operator that constructs a block-Hankel matrix. If I can find $\vec{x}$ such that $XU$ is positive, then it should work. – 2012-07-25
-
Maybe try solving the linear program $\min_{\alpha,x} \{ \alpha \mid -\sum_{j=1}^n A_{i,j} x_j \leq \alpha,\ i=1,\dots,n \}$? If the minimum is $<0$ then you have an answer; otherwise you know no such $x$ exists. – 2012-07-25
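A minimal sketch of that LP in Python with `scipy.optimize.linprog` (the helper name `find_positive_image` and the box constraint $|x_j| \le 1$ are my additions, not part of the comment; the box only keeps the LP bounded and costs nothing, since $A\vec{x} > 0$ is invariant under positive scaling of $\vec{x}$):

```python
# Sketch of the LP from the comment above: minimize alpha subject to
# -(A x)_i <= alpha, with |x_j| <= 1 added so the problem is bounded.
import numpy as np
from scipy.optimize import linprog

def find_positive_image(A, tol=1e-9):
    """Try to find x with (A x)_i > 0 for all i; return x or None."""
    m, n = A.shape
    # Decision variables z = (x_1, ..., x_n, alpha); objective is alpha.
    c = np.zeros(n + 1)
    c[-1] = 1.0
    # Constraints -(A x)_i - alpha <= 0, i.e. (A x)_i >= -alpha.
    A_ub = np.hstack([-A, -np.ones((m, 1))])
    b_ub = np.zeros(m)
    bounds = [(-1.0, 1.0)] * n + [(None, None)]   # box on x, alpha free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    if res.success and res.fun < -tol:
        return res.x[:-1]   # alpha < 0, so every (A x)_i > 0
    return None             # alpha >= 0, so no such x exists

# Example: the columns of A do not span R^3, yet a feasible x exists.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x = find_positive_image(A)
print(x, A @ x if x is not None else "infeasible")
```

Note this uses the full matrix $A$ explicitly, so it is not matrix-free; it is only meant to illustrate the LP formulation in the comment.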
-
Equivalently, you want a vector $x$ that has positive dot product with every *row* of the matrix. So it's a question of whether the intersection of the open halfspaces determined by the row vectors is nonempty. – 2012-07-26