
Let $X$ be an $n\times n$ symmetric positive definite matrix.

Show that, $\forall a,b\in\mathbb{R}^n$:

$(a^TXb)^2 \leq (a^TXa)(b^TXb)$ with equality holding iff $a$ and $b$ are linearly dependent.

I'm struggling with this one! Please help.

For equality:

$a$, $b$ linearly dependent $\Leftrightarrow a = kb$ for some scalar $k$ (taking $b\neq 0$).

$\therefore$ on LHS, $(a^TXb)^2 = (a^TXb)(a^TXb)=((kb)^TXb)((kb)^TXb)=k^2(b^TXb)(b^TXb)$

and on RHS: $(a^TXa)(b^TXb)=((kb)^TX(kb))(b^TXb)=k^2(b^TXb)(b^TXb)$ = LHS.

++++++

Question: for a positive definite matrix $X$, is it true that $a^TXa = \|X\|\,\|a\|^2$, where $\|X\|$ is the 2-norm?

If so, then I would like to do the following:

$(a^TXb)^2 = |a\cdot Xb|^2 \leq \|a\|^2\,\|Xb\|^2 = \|a\|^2\,\|X\|^2\,\|b\|^2 = |a^TXa|\,|b^TXb| = (a^TXa)(b^TXb)$

with equality iff $a$ and $Xb$ are linearly dependent.

My only problem then would be to connect this somehow to the fact that $a$ and $b$ are linearly dependent. Can someone please tell me if this is completely wrong?
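
As a quick numerical sanity check of the statement and of the equality case (an illustrative sketch only; the matrix $X$ and the vectors here are arbitrary random choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random symmetric positive definite matrix: M^T M + I is SPD.
M = rng.standard_normal((n, n))
X = M.T @ M + np.eye(n)

a = rng.standard_normal(n)
b = rng.standard_normal(n)

# The claimed inequality (a^T X b)^2 <= (a^T X a)(b^T X b).
lhs = (a @ X @ b) ** 2
rhs = (a @ X @ a) * (b @ X @ b)
print(lhs <= rhs)                    # expected: True

# Equality case a = k*b: both sides become k^2 (b^T X b)^2.
k = 3.0
a_dep = k * b
lhs_eq = (a_dep @ X @ b) ** 2
rhs_eq = (a_dep @ X @ a_dep) * (b @ X @ b)
print(np.isclose(lhs_eq, rhs_eq))    # expected: True
```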

  • So... when you *do not really understand* something which is written explicitly for you, you just... keep silent? This is my turn to *not really understand*. (2012-10-08)

2 Answers


The map $(x,y)\mapsto x^TXy$ is an inner product: bilinearity and symmetry are clear, and positive definiteness follows from the assumption on $X$. We can therefore apply the Cauchy-Schwarz inequality: if $B(\cdot,\cdot)$ is a positive definite symmetric bilinear form, we have $$|B(x,y)|^2\leq B(x,x)B(y,y),$$ with equality if and only if $x$ and $y$ are linearly dependent.

Indeed, we write $$0\leq B\bigl(B(x,x)y-B(y,y)x,\;B(x,x)y-B(y,y)x\bigr)$$ and we expand. This gives $$\mathrm{RHS}=B(x,x)^2B(y,y)-B(x,x)B(y,y)B(y,x)-B(y,y)B(x,x)B(x,y)+B(y,y)^2B(x,x),$$ hence $$\mathrm{RHS}=B(x,x)B(y,y)\bigl(B(x,x)-2B(x,y)+B(y,y)\bigr).$$ If $x$ and $y$ are nonzero, this gives $2B(x,y)\leq B(x,x)+B(y,y)$. Now apply the latter inequality to $\frac 1{\sqrt{B(x,x)}}x$ and $\frac 1{\sqrt{B(y,y)}}y$ instead of $x$ and $y$ respectively: it yields $B(x,y)\leq\sqrt{B(x,x)B(y,y)}$, and applying it to $-x$ and $y$ gives the same bound for $-B(x,y)$, hence $|B(x,y)|^2\leq B(x,x)B(y,y)$.
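
To see the normalization step concretely, here is a small numerical sketch (illustrative only; the SPD matrix defining $B$ and the vectors are arbitrary random choices): it checks the intermediate bound $2B(x,y)\leq B(x,x)+B(y,y)$ and then applies the same bound to the rescaled vectors.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# B(x, y) = x^T X y for an SPD matrix X (arbitrary illustrative choice).
M = rng.standard_normal((n, n))
X = M.T @ M + np.eye(n)
B = lambda x, y: x @ X @ y

x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Intermediate inequality: 2 B(x, y) <= B(x, x) + B(y, y).
print(2 * B(x, y) <= B(x, x) + B(y, y))          # expected: True

# Apply it to the normalized vectors x/sqrt(B(x,x)) and y/sqrt(B(y,y)).
u = x / np.sqrt(B(x, x))
v = y / np.sqrt(B(y, y))
# B(u, u) = B(v, v) = 1, so 2 B(u, v) <= 2, i.e. B(x, y) <= sqrt(B(x,x) B(y,y)).
print(2 * B(u, v) <= B(u, u) + B(v, v))          # expected: True
print(B(x, y) <= np.sqrt(B(x, x) * B(y, y)))     # expected: True
```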

  • Wow! @DavideGiraudo Can you explain how you came up with such a proof? Or even thought to take those square roots? I did initially reduce to the last inequality that you used, but I would never have imagined taking the reciprocal square-root terms. (2012-10-08)

One could copy a classical proof of the Cauchy-Schwarz inequality: consider the function $P:\mathbb R\to\mathbb R$ defined by $P(t)=(a+tb)^TX(a+tb)$. Then $P(t)\geqslant0$ for every $t$ (why?), $P$ is a second-degree polynomial in $t$ (why?), hence its discriminant is nonpositive, that is... (to be completed). Furthermore $P(t)=0$ for some $t$ iff $a$ and $b$ are linearly dependent (why?), hence... (to be completed).

If some steps need more explanations, just yell.
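
As a concrete illustration of this approach (a rough numerical sketch; the SPD matrix and vectors below are arbitrary stand-ins, not from the problem): the coefficients of $P(t)=(b^TXb)t^2+2(a^TXb)t+a^TXa$ can be computed directly, and the discriminant is exactly $4\bigl[(a^TXb)^2-(a^TXa)(b^TXb)\bigr]$, which the sketch checks is nonpositive.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Arbitrary SPD matrix X and vectors a, b for illustration.
M = rng.standard_normal((n, n))
X = M.T @ M + np.eye(n)
a = rng.standard_normal(n)
b = rng.standard_normal(n)

# P(t) = (a + t b)^T X (a + t b) = (b^T X b) t^2 + 2 (a^T X b) t + a^T X a
A  = b @ X @ b          # coefficient of t^2
Bc = 2 * (a @ X @ b)    # coefficient of t (uses symmetry of X)
C  = a @ X @ a          # constant term

# P(t) >= 0 for all t, so the discriminant must be nonpositive.
disc = Bc**2 - 4 * A * C    # = 4[(a^T X b)^2 - (a^T X a)(b^T X b)]
print(disc <= 0)            # expected: True

# Sanity check: P(t) evaluated directly agrees with the quadratic form.
t = 0.7
print(np.isclose((a + t*b) @ X @ (a + t*b), A*t**2 + Bc*t + C))  # expected: True
```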

  • Ah. OK. You know, this thing with the solutions of $ax^2+bx+c=0$ being, like, minus $b$ plus or minus the square root of blablabla. (2012-10-08)