
Assume all matrices discussed here are $N \times N$ and all vectors are of conforming dimension. Consider the following pair of quadratic inequalities, where the matrices $A_i$ are Hermitian:

\begin{align} x^{H}A_{1}x &\geq 0 \\ x^{H}A_{2}x &\geq 0 \end{align}

I need to find at least one non-zero vector $x$ that satisfies both inequalities simultaneously. The method need not be constructive; I only need to prove that at least one such vector exists. Each $A_i$ is guaranteed to have at least one negative eigenvalue and at least one positive eigenvalue, i.e. the matrices are indefinite. Also, $A_1$ and $A_2$ are strictly complex. (There are examples of real $A_1$ and $A_2$ for which this fails, thanks to user @Marc.)
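For concreteness, here is a small numerical sketch with an illustrative pair of complex Hermitian indefinite matrices (the matrices `A1`, `A2` and the helper `quad` are my own assumptions, not from the question) for which a common feasible vector exists, plus a naive random-sampling feasibility check:

```python
import numpy as np

# Illustrative 2x2 complex Hermitian matrices (assumed example):
# each has negative determinant, hence one positive and one negative eigenvalue.
A1 = np.array([[2.0, 1j], [-1j, -1.0]])
A2 = np.array([[1.0, 2j], [-2j, -1.0]])

def quad(A, x):
    """Real value of the Hermitian quadratic form x^H A x."""
    return (x.conj() @ A @ x).real

# Sanity checks: Hermitian and indefinite.
for A in (A1, A2):
    assert np.allclose(A, A.conj().T)
    w = np.linalg.eigvalsh(A)
    assert w.min() < 0 < w.max()

# For this particular pair, x = e1 satisfies both inequalities strictly.
x = np.array([1.0, 0.0])
print(quad(A1, x), quad(A2, x))  # 2.0 1.0

# A naive feasibility search: sample random complex unit vectors.
rng = np.random.default_rng(0)
feasible = []
for _ in range(1000):
    z = rng.normal(size=2) + 1j * rng.normal(size=2)
    z /= np.linalg.norm(z)
    if quad(A1, z) >= 0 and quad(A2, z) >= 0:
        feasible.append(z)
print(f"{len(feasible)} of 1000 random samples were feasible")
```

Random sampling is only an empirical probe, of course; it can suggest feasibility for a specific pair but proves nothing in general.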

EDIT: MY ATTEMPT

Consider the optimization problem

\begin{align} \max_{x,t}~~~&t \\ \text{subject to}\quad x^{H}A_{1}x &\geq t \\ x^{H}A_{2}x &\geq t \end{align}

which is equivalent to

\begin{align} \min_{x,t}~~~&{-t} \\ \text{subject to}\quad x^{H}A_{1}x &\geq t \\ x^{H}A_{2}x &\geq t \end{align}

If it is feasible, this problem will give me (in some sense the best) $x$ for the problem posed initially. Consider the Lagrangian

\begin{align} L(x,t,u)= -t + \lambda_{1}(t-x^{H}A_{1}x)+\lambda_{2}(t-x^{H}A_{2}x) \end{align}

where $u$ is the vector containing the Lagrange multipliers $\lambda_1$ and $\lambda_2$. The Lagrangian dual function is given by

\begin{align} h(u)= \min_{x,t}~L(x,t,u) \end{align}

Now, it is known that the Lagrangian dual function $h(u)$, for any $u\geq 0$, is a lower bound on the optimal value of the original problem. Hence, if we can find some $u$ (i.e. some $\lambda_1$ and $\lambda_2$) such that $h(u)=0$, then the optimal value of the original objective is at least $0$, and the corresponding $x$ will be the one that satisfies all the inequalities. Differentiating the Lagrangian with respect to $t$ and $x$ and setting the derivatives to zero, one obtains the following conditions: \begin{align} \lambda_1+\lambda_2 &=1 \\ (\lambda_1A_1+\lambda_2A_2)x &= 0 \end{align}
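For completeness, the differentiation step can be written out explicitly. Treating the complex variable via the gradient with respect to $x^{H}$ (one common convention for complex differentiation), stationarity of $L$ gives

\begin{align}
\frac{\partial L}{\partial t} &= -1 + \lambda_1 + \lambda_2 = 0
  &&\Longrightarrow& \lambda_1 + \lambda_2 &= 1,\\
\nabla_{x^{H}} L &= -(\lambda_1 A_1 + \lambda_2 A_2)\,x = 0
  &&\Longrightarrow& (\lambda_1 A_1 + \lambda_2 A_2)\,x &= 0.
\end{align}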

Hence, I get the condition that, for my problem to be solvable, there should exist a convex combination of $A_1$ and $A_2$ with a nontrivial null space, or equivalently with a zero eigenvalue. This looks like a very fundamental question, though not an obvious one. Anyway, I am going to pose it as a new problem.
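This condition can be probed numerically for a concrete pair. The sketch below (with an assumed illustrative pair `A1`, `A2`, not taken from the question) scans the convex combinations $\lambda A_1 + (1-\lambda)A_2$ for a near-singular one and extracts an approximate null vector:

```python
import numpy as np

# Assumed illustrative complex Hermitian indefinite pair (each has det < 0,
# hence one positive and one negative eigenvalue).
A1 = np.array([[1.0, 1j], [-1j, -2.0]])
A2 = np.array([[-2.0, -1j], [1j, 1.0]])

def combo(lam):
    """Convex combination lam*A1 + (1-lam)*A2 (Hermitian for lam in [0,1])."""
    return lam * A1 + (1 - lam) * A2

# Scan lambda in [0, 1] for the combination whose smallest-magnitude
# eigenvalue is closest to zero, i.e. a near-singular convex combination.
best_lam, best_gap = None, np.inf
for lam in np.linspace(0.0, 1.0, 20001):
    w = np.linalg.eigvalsh(combo(lam))
    gap = np.abs(w).min()
    if gap < best_gap:
        best_lam, best_gap = lam, gap

print(f"lambda* ~ {best_lam:.4f}, smallest |eigenvalue| ~ {best_gap:.2e}")

# Approximate null vector of the near-singular combination.
w, V = np.linalg.eigh(combo(best_lam))
x = V[:, np.abs(w).argmin()]
print("residual:", np.linalg.norm(combo(best_lam) @ x))
```

Note that a null vector $x$ of a singular convex combination only satisfies $\lambda^{*} x^{H}A_{1}x + (1-\lambda^{*}) x^{H}A_{2}x = 0$; whether both forms are individually non-negative at that $x$ still has to be checked.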

  • @joriki Please look at the reformulated problem. (2012-10-14)

1 Answer


$ A_1=\begin{pmatrix}1&0\\0&-2\end{pmatrix} \quad\text{and}\quad A_2=\begin{pmatrix}-2&0\\0&1\end{pmatrix}. $

Here $x^{H}A_{1}x=|x_1|^{2}-2|x_2|^{2}$ and $x^{H}A_{2}x=-2|x_1|^{2}+|x_2|^{2}$. If both were $\geq 0$, then $x^{H}A_{1}x+2\,x^{H}A_{2}x=-3|x_1|^{2}\geq 0$ and $2\,x^{H}A_{1}x+x^{H}A_{2}x=-3|x_2|^{2}\geq 0$ would force $x=0$, so no nonzero $x$ (real or complex) satisfies both inequalities.
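As a quick numerical sanity check (my own addition, not part of the original answer), random sampling confirms that for this pair at least one of the two quadratic forms is negative at every nonzero vector:

```python
import numpy as np

A1 = np.array([[1.0, 0.0], [0.0, -2.0]])
A2 = np.array([[-2.0, 0.0], [0.0, 1.0]])

def quad(A, x):
    """Real value of the Hermitian quadratic form x^H A x."""
    return (x.conj() @ A @ x).real

# For nonzero x: combining the forms pairwise gives
#   q1 + 2*q2 = -3|x1|^2   and   2*q1 + q2 = -3|x2|^2,
# so q1 >= 0 and q2 >= 0 together would force x = 0.
rng = np.random.default_rng(1)
for _ in range(10000):
    x = rng.normal(size=2) + 1j * rng.normal(size=2)
    assert min(quad(A1, x), quad(A2, x)) < 0
print("no nonzero sample satisfied both inequalities")
```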

  • I have a problem of the form $\min~x^{H}Qx$ subject to $x^{H}A_{1}x\geq 0$ and $x^{H}A_{2}x\geq 0$. This is a wireless communication problem where $A_{1}$ and $A_{2}$ are functions of complex communication channels, and $Q$ is a positive definite covariance matrix of the transmitted symbols. So first I need to prove the feasibility of the problem, hence the question above. I apologize for the incomplete question; I have an engineering background but am now learning linear algebra. (2012-10-15)