Lower bound of $J=\frac{x^TAx}{x^TBx}$

Consider two symmetric positive semi-definite matrices $A, B \in \mathbb{R}^{n\times n}$, and suppose that $A$ and $B$ have the same null space $\mathcal{N}\subset \mathbb{R}^n$. Now consider the objective function $$J=\frac{x^TAx}{x^TBx},$$ where $x\in\mathbb{R}^n$ is an arbitrary unit-norm vector with $x^Tx=1$ and $x\notin \mathcal{N}$. Although $x^TAx\ne 0$ since $x\notin \mathcal{N}$, we can have $x^TAx\rightarrow 0$ as $x$ approaches $\mathcal{N}$, so neither $x^TAx$ nor $x^TBx$ has a positive lower bound. But under what condition does the objective function $J$ have a positive lower bound? That is, what special properties (e.g., eigenvalues) must $A$ and $B$ have for $J$ to have a positive lower bound?
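For concreteness, here is a minimal numerical sketch of the situation (assuming NumPy; the dimensions and the construction of $A$ and $B$ are purely illustrative). Both quadratic forms shrink as the unit vector $x$ slides toward $\mathcal{N}$, which is exactly why neither has a positive lower bound on its own:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 3                                         # dim(N) = n - r = 2

# Both matrices act only on range(Q), so ker A = ker B = range(Q)^perp =: N.
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))    # orthonormal columns
A = Q @ np.diag(rng.uniform(1.0, 5.0, r)) @ Q.T     # PSD, rank r
M = rng.standard_normal((r, r))
B = Q @ (M @ M.T + 0.1 * np.eye(r)) @ Q.T           # PSD, same null space

u = Q @ rng.standard_normal(r)                      # fixed direction outside N
w = (np.eye(n) - Q @ Q.T) @ rng.standard_normal(n)  # fixed direction inside N
for eps in (1.0, 1e-3, 1e-6):
    x = w + eps * u                                 # creeps toward N as eps -> 0
    x /= np.linalg.norm(x)                          # unit norm, x not in N
    print(f"eps={eps:.0e}  x^T A x = {x @ A @ x:.2e}  "
          f"x^T B x = {x @ B @ x:.2e}  J = {(x @ A @ x) / (x @ B @ x):.4f}")
```

Both $x^TAx$ and $x^TBx$ decay like $\varepsilon^2$, yet their ratio $J$ stays constant here, which is the phenomenon the question is after.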
linear-algebra
matrices
optimization
- Are $A,B$ symmetric? – 2012-08-13
- @copper.hat: yes, they are symmetric positive semi-definite. – 2012-08-13
- Looks a bit like a [generalized Rayleigh quotient](http://books.google.com/books?hl=en&id=U6-ubGYgf7QC&pg=PA28)... on the other hand, the link deals with the two matrices being positive *definite*, and I'm drawing a blank on how to adapt things to your case... – 2012-08-13
- @J.M.: Thanks for bringing this reference to my attention. – 2012-08-13
- Are you the author of that book referenced (Shi Yu)? – 2012-08-13
- I wish I were :) – 2012-08-13
1 Answer
I am not exactly sure what you are looking for, as $J(x) \geq 0$ for all $x$ such that $J$ is defined. In fact, both $x^TAx \geq 0$ and $x^TBx \geq 0$, so both of these have lower bounds as well.
However, if $x \notin \mathcal{N}$, you can get a better lower bound.
We have $\mathcal{N} = \ker A = \ker B$. Let $\underline{\lambda}_A$ be the smallest non-zero eigenvalue of $A$, and write $x = x_1 + x_2$, where $x_1 \in \mathcal{N}$ and $x_2 \in \mathcal{N}^\bot$. Then we have $$x^TAx = x_2^T A x_2 \geq \underline{\lambda}_A \| x_2 \|^2, \qquad x^TBx = x_2^TBx_2 \leq \|B\| \|x_2\|^2.$$ If $x \notin \mathcal{N}$, then $x_2 \neq 0$, so $x_2 \notin \ker B$ and hence $x_2^TBx_2 > 0$. Combining the two estimates gives $$J(x) = \frac{x_2^T A x_2}{x_2^TBx_2} \geq \frac{\underline{\lambda}_A}{\|B\|} > 0.$$
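As a quick sanity check, here is a small numerical sketch of this bound (assuming NumPy; the construction of $A$ and $B$ with a shared null space is illustrative). It computes $\underline{\lambda}_A/\|B\|$ and verifies that sampled values of $J$ never dip below it:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 4
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))    # ker A = ker B = range(Q)^perp
A = Q @ np.diag(rng.uniform(1.0, 5.0, r)) @ Q.T
M = rng.standard_normal((r, r))
B = Q @ (M @ M.T + 0.1 * np.eye(r)) @ Q.T

eigA = np.linalg.eigvalsh(A)
lam_min_plus = eigA[eigA > 1e-10].min()             # smallest non-zero eigenvalue of A
bound = lam_min_plus / np.linalg.norm(B, 2)         # ||B|| = spectral norm

worst = np.inf
for _ in range(10_000):
    x = rng.standard_normal(n)                      # almost surely x is not in N
    x /= np.linalg.norm(x)
    worst = min(worst, (x @ A @ x) / (x @ B @ x))

print(f"bound = {bound:.4f}, smallest sampled J = {worst:.4f}")
```

Every sampled $J$ stays above the bound, as the estimate predicts.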
- Thanks. That's what I meant: the largest positive lower bound of $J$. Perhaps replacing $\|B\|$ with $\lambda_{\max}$ is better; the lower bound is then larger. – 2012-08-13
- Well, $\lambda_{\max}(B) = \|B\|$, so not much improvement there. You would need to know more about the structure of $A,B$ to improve on this bound, I think. The bound will only be hit if $x_2$ lies along a simultaneous eigenvector of $A$ and $B$ whose eigenvalues happen to be $\underline{\lambda}_A$ and $\lambda_{\max}(B)$, respectively (see the sketch after these comments). – 2012-08-13
- I thought $\|B\|$ was the Frobenius norm of $B$. Thanks for the answer and the suggestion! – 2012-08-13
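To illustrate the comment above about when the bound is attained: a minimal sketch (again assuming NumPy; the eigenvalues are chosen by hand) in which $A$ and $B$ share eigenvectors, and one common eigenvector carries both the smallest non-zero eigenvalue of $A$ and the largest eigenvalue of $B$, so $J$ hits $\underline{\lambda}_A/\lambda_{\max}(B)$ exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 4, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, r)))  # orthonormal columns, dim(N) = 1

# A and B share the eigenvectors Q[:, i].  The first one carries the smallest
# non-zero eigenvalue of A (1.0) AND the largest eigenvalue of B (5.0).
A = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
B = Q @ np.diag([5.0, 1.0, 1.0]) @ Q.T

x = Q[:, 0]                                       # unit-norm common eigenvector
print((x @ A @ x) / (x @ B @ x), 1.0 / 5.0)       # both print 0.2: bound attained
```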