Consider the single equation $f(x) = cx$. Clearly there is some $x$ with $f(x) < 0$ iff $c \not= 0$.
If we write this one-equation system as a matrix, it's just $M = (c)$. So in this case, there is an assignment making $f$ negative iff $\det(M) \not= 0$.
I read something which seemed to indicate that this is true more generally: if you have $n$ homogeneous linear equations $f_1, \dots, f_n$ in $n$ variables with coefficient matrix $M$, then there is an assignment to the variables making all $n$ of the $f_i$ simultaneously negative iff $\det(M) \not= 0$.
I'm having trouble proving this even in the $2 \times 2$ case, much less in general.
- Is there a name for this theorem? (If it's true)
- Any hints on how to [dis]prove it?
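For what it's worth, the "if" direction seems easy to check numerically: when $\det(M) \not= 0$, the matrix is invertible, so we can simply solve $Mx = (-1, \dots, -1)^T$ and every equation evaluates to $-1 < 0$. A minimal sketch with NumPy (the particular matrix here is just a made-up invertible example):

```python
import numpy as np

# A hypothetical invertible coefficient matrix M (any nonsingular
# matrix would do for this sketch).
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Sanity check: M is nonsingular, i.e. det(M) != 0.
assert abs(np.linalg.det(M)) > 1e-12

# Solve M x = (-1, ..., -1).  Since M is invertible this always has
# a solution, and by construction every f_i(x) equals -1 < 0.
target = -np.ones(M.shape[0])
x = np.linalg.solve(M, target)

print(M @ x)  # every entry is -1, i.e. all equations are negative
```

This only handles the forward direction, of course; it says nothing about whether $\det(M) = 0$ forces *some* equation to be nonnegative, which is the part I can't see how to prove.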
EDIT: The paper I'm looking at is *Quantum probabilities as Bayesian probabilities*. On page 2, equation (2), they say "The bookie can choose values $x_A$, $x_B$, and $x_C$ that lead to $R < 0$ in all three cases unless [$\det(M)=0$]". Maybe I have misinterpreted their argument, but it seems like they're saying this is a general property of the determinant.