
Assume all matrices discussed below are $N \times N$ and all vectors conform in dimension. Consider the following set of quadratic inequalities, where the matrices $A_i$ are Hermitian.

\begin{align} x^{H}A_{1}x &\geq 0 \\ x^{H}A_{2}x &\geq 0 \end{align}

I need to find at least one non-zero vector $x$ such that these inequalities are satisfied simultaneously. In fact, the method need not be constructive; I just need to prove that at least one such vector exists. Each $A_i$ is guaranteed to have at least one negative eigenvalue and at least one positive eigenvalue, i.e., they are indefinite matrices. Also, $A_1$ and $A_2$ are strictly complex. (You can find examples of real $A_1$ and $A_2$ for which this won't hold; thanks to user @Marc.)
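As a quick numerical illustration (a sketch only, not a proof): for a pair of random complex Hermitian indefinite matrices, one can simply sample random complex vectors and test both quadratic forms. The helper names below are my own.

```python
import numpy as np

def random_hermitian_indefinite(n, rng):
    """Sample a random complex Hermitian matrix, resampling until it is
    indefinite (at least one negative and one positive eigenvalue)."""
    while True:
        M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        A = (M + M.conj().T) / 2          # Hermitian part
        w = np.linalg.eigvalsh(A)         # real eigenvalues, ascending
        if w[0] < 0 < w[-1]:
            return A

def find_feasible_x(A1, A2, rng, trials=100_000):
    """Random search for a non-zero x with x^H A1 x >= 0 and x^H A2 x >= 0.
    (x^H A x is real for Hermitian A; .real just drops round-off.)"""
    n = A1.shape[0]
    for _ in range(trials):
        x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        if (x.conj() @ A1 @ x).real >= 0 and (x.conj() @ A2 @ x).real >= 0:
            return x
    return None

rng = np.random.default_rng(0)
A1 = random_hermitian_indefinite(4, rng)
A2 = random_hermitian_indefinite(4, rng)
x = find_feasible_x(A1, A2, rng)
print("feasible x found:", x is not None)
```

Of course, random search only exhibits a certificate for one particular pair; it says nothing about existence in general.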

EDIT: MY ATTEMPT

Consider the optimization problem

\begin{align} \max_{x,t}~~~&t \\ \text{subject to}~~ x^{H}A_{1}x & \geq t \\ x^{H}A_{2}x & \geq t \end{align}

which is equivalent to

\begin{align} \min_{x,t}~~~&-t \\ \text{subject to}~~ x^{H}A_{1}x & \geq t \\ x^{H}A_{2}x & \geq t \end{align}

If it is feasible, this problem will give me an $x$ (in some sense the best one) for the problem posed initially. Consider the Lagrangian

\begin{align} L(x,t,u)= -t + \lambda_{1}(t-x^{H}A_{1}x)+\lambda_{2}(t-x^{H}A_{2}x) \end{align}

where $u$ is the vector containing the Lagrange multipliers $\lambda_1$ and $\lambda_2$. The Lagrangian dual function is given by

\begin{align} h(u)= \min_{x,t}~L(x,t,u) \end{align}

Now, it is known that the Lagrangian dual function $h(u)$, for all $u\geq 0$, is a lower bound for the optimal value of the original problem. Hence, if we can find some $u$ (i.e., $\lambda_1$ and $\lambda_2$) such that $h(u)=0$, then the original objective function would be at least $0$, and hence that particular $x$ will be the one that satisfies all the inequalities. Setting the derivatives of the Lagrangian with respect to $t$ and $x$ to zero, one obtains the following conditions. \begin{align} \lambda_1+\lambda_2 &=1 \\ (\lambda_1A_1+\lambda_2A_2)x &= 0 \end{align}

Hence, I get the condition that for my problem to be solvable, there should exist a convex combination of $A_1$ and $A_2$ with a non-trivial null space, or equivalently with a zero eigenvalue. This looks like a very fundamental question, though not an obvious one. Anyway, I am going to pose it as a new problem.
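This condition can be probed numerically; here is a small sketch (my own illustration). Since $\det(\lambda A_1 + (1-\lambda)A_2)$ is real for Hermitian matrices and continuous in $\lambda$, a sign change on $[0,1]$ certifies that some convex combination is singular (the converse need not hold, since two eigenvalues could cross zero together).

```python
import numpy as np

def has_singular_convex_combination(A1, A2, grid=1001):
    """Return True if det(l*A1 + (1-l)*A2) changes sign on a grid of
    l in [0, 1] -- sufficient (not necessary) evidence that some
    convex combination has a zero eigenvalue."""
    ls = np.linspace(0.0, 1.0, grid)
    dets = np.array([np.linalg.det(l * A1 + (1 - l) * A2).real for l in ls])
    return bool(np.any(dets[:-1] * dets[1:] <= 0))

# Illustration with a real diagonal indefinite pair: here
# l*A1 + (1-l)*A2 = diag(3l - 2, 1 - 3l), singular at l = 1/3 and l = 2/3.
A1 = np.diag([1.0, -2.0])
A2 = np.diag([-2.0, 1.0])
print(has_singular_convex_combination(A1, A2))  # True
```

Note that the condition derived above is only necessary, so passing this check does not by itself guarantee feasibility.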

  • a) Instead of "... which satisfies this equation. Note that equality is not necessarily needed", you could just write "which satisfies this inequality". b) You're probably looking for a non-zero vector $x$? c) You probably mean that the matrices are not negative *semidefinite*? Otherwise they could have just a single zero eigenvector and the other form could be negative at that point. (2012-10-14)
  • Thanks for pointing them out, though I have mentioned they cannot be negative definite. (2012-10-14)
  • Yes, that's why I italicized the word "semidefinite", to emphasize that this isn't what you wrote. (2012-10-14)
  • Yes, I got you; thanks for the edit. :) (2012-10-14)
  • I may be mistaken, but by definition _not negative-definite_ means there's at least one solution to the inequalities. (2012-10-14)
  • @EuYu: I think the intention is to satisfy them both simultaneously :-) (2012-10-14)
  • @dineshdileep: Your last sentence in parentheses is now wrong; you only know that if the matrices are negative definite. If they're negative semidefinite, a solution *might* not exist. (Note the word emphasized in italics ;-) (2012-10-14)
  • Geez, could you please take a bit more care? The question was almost right, just a side comment in parentheses was wrong that you could have just deleted, and now you've changed the whole question and now it makes no sense again. Having at least one negative eigenvalue is *not* equivalent to being indefinite. (2012-10-14)
  • By the way, now that someone else has also commented, I won't be automatically notified of your comments. If you want me to read them, you'll have to ping me (using @joriki). (I don't have to ping you because you automatically get notified of comments under your question.) (2012-10-14)
  • @joriki Ah, silly of me. Thank you for pointing that out. (2012-10-14)
  • @joriki Please look at the reformulated problem. (2012-10-14)

1 Answer


$$ A_1=\begin{pmatrix}1&0\\0&-2\end{pmatrix} \quad\text{and}\quad A_2=\begin{pmatrix}-2&0\\0&1\end{pmatrix}. $$
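A quick numerical sanity check of this counterexample (a sketch): since $A_1 + A_2 = -I$, the two quadratic forms sum to $-\|x\|^2 < 0$ for every non-zero $x$, so they can never be non-negative simultaneously.

```python
import numpy as np

A1 = np.diag([1.0, -2.0])
A2 = np.diag([-2.0, 1.0])
assert np.allclose(A1 + A2, -np.eye(2))   # the key identity

rng = np.random.default_rng(0)
for _ in range(10_000):
    x = rng.standard_normal(2)
    q1, q2 = x @ A1 @ x, x @ A2 @ x
    # q1 + q2 = -||x||^2 < 0, so at least one of the forms is negative
    assert q1 + q2 < 0
    assert q1 < 0 or q2 < 0
print("no sampled non-zero x satisfies both inequalities")
```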

  • My problem is to prove that for ANY given $A_1$ and $A_2$ which are Hermitian indefinite matrices, we can find a non-zero vector $x$ such that the mentioned inequalities are satisfied. (2012-10-14)
  • @dineshdileep You could start to search for such $x$ for the matrices given above. You can't (hint: $A_1+A_2=-I$), which should save you work on your "proof". (2012-10-14)
  • Oh yes, I got it. Thanks a lot. This happened because I didn't include the strict condition that $A_1$ and $A_2$ can't be real. Nevertheless it gives me insight. I know this problem can be connected to something referred to as semidefinite relaxation in convex optimization, and it can be solved. So a solution exists. (2012-10-14)
  • Well, I can't follow exactly what you are up to. But if you pose a question, be sure to include _each and every hypothesis_ that you are making. For now I don't see what "not real" precisely means (a naive interpretation would seem not to be invariant under base change), and therefore how it is going to help you prove anything. But one can only try to answer a question if it is precisely formulated. (2012-10-15)
  • I have a problem of the sort $\min~x^{H}Qx$ such that $x^{H}A_{1}x \geq 0$ and $x^{H}A_{2}x \geq 0$. This is a wireless communication problem where $A_{1}$ and $A_{2}$ denote functions of complex communication channels, and $Q$ is a positive definite covariance matrix of transmitted symbols. So first I need to prove the feasibility of the problem, hence the above question. I apologize for the incomplete question. I have an engineering background, but now I am learning linear algebra. (2012-10-15)