
What is the solution of $\max_{\textbf{y}} \textbf{y}^T \Sigma \textbf{y}$ in terms of the maximum eigenvalue and the associated eigenvectors?

  • @Yuval: I think you should consider changing your comment to an answer so that this isn't 'unanswered' anymore. It's sufficient, I think. I suppose you could suggest a proof as well. (2011-06-02)

2 Answers

3

Replace $\Sigma$ by $(\Sigma + \Sigma')/2$ to make it symmetric. You're asking for an unconstrained optimization of a quadratic form. If the quadratic form is negative semidefinite, then the answer is $0$. Otherwise the answer is $\infty$.
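
To see why the symmetrization is harmless: a scalar equals its own transpose, so $\textbf{y}^T\Sigma\textbf{y} = (\textbf{y}^T\Sigma\textbf{y})^T = \textbf{y}^T\Sigma'\textbf{y}$, and therefore

$$\textbf{y}^T\Sigma\textbf{y} = \textbf{y}^T\,\frac{\Sigma+\Sigma'}{2}\,\textbf{y}.$$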

See Sylvester's law of inertia.

3

$\max(\textbf{y}^T\Sigma \textbf{y})$ is of course either $\infty$ or $0$. If at least one eigenvalue is positive (in particular, whenever $\Sigma$ is positive semidefinite and nonzero), the quadratic form grows without bound. If all the eigenvalues are nonpositive ($\lambda_i \leq 0$, i.e. $\Sigma$ is negative semidefinite), the maximum is $0$.
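
To make the dichotomy explicit, write the (symmetric) matrix as $\Sigma = \sum_i \lambda_i \textbf{v}_i\textbf{v}_i^T$ with orthonormal eigenvectors $\textbf{v}_i$ and expand $\textbf{y} = \sum_i z_i \textbf{v}_i$, so that

$$\textbf{y}^T\Sigma\textbf{y} = \sum_i \lambda_i z_i^2.$$

If some $\lambda_i > 0$, taking $\textbf{y} = t\,\textbf{v}_i$ gives the value $t^2\lambda_i \to \infty$; if every $\lambda_i \leq 0$, the sum is never positive and equals $0$ at $\textbf{y} = \textbf{0}$.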

But most of these sorts of optimization problems come with constraints. If we constrain ourselves to unit vectors $\{\textbf{y}:\|\textbf{y}\|=1\}$, the maximum becomes finite. We can also use a change of variables to find maxima over ellipsoidal regions, etc. This constrained maximum occurs at a unit eigenvector corresponding to the largest eigenvalue $\lambda_1$ (largest when the eigenvalues are sorted as $\lambda_1 \geq \lambda_2 \geq \cdots$, not largest in magnitude). At this unit eigenvector, $\textbf{y}^T\Sigma \textbf{y}$ attains its constrained maximum value $\lambda_1.$
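
With the same expansion as above, the constraint $\|\textbf{y}\|=1$ means $\sum_i z_i^2 = 1$, so

$$\textbf{y}^T\Sigma\textbf{y} = \sum_i \lambda_i z_i^2 \leq \lambda_1 \sum_i z_i^2 = \lambda_1,$$

with equality exactly when all the weight sits on the top eigenvalue, i.e. when $\textbf{y}$ is a unit eigenvector for $\lambda_1$.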

This kind of trickery is great when you don't want to do an ugly calculus optimization or use numerical means.

Edit: this all assumes $\Sigma$ is symmetric; if not, replace it by $\frac{1}{2}(\Sigma + \Sigma^T)$, as Yuval has suggested.
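
As a quick numerical sanity check of the constrained result, here is a small sketch (assuming NumPy is available; the matrix below is just an arbitrary illustrative example):

```python
import numpy as np

# Arbitrary square matrix used purely as an illustration.
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 2.0],
                  [0.0, 2.0, 2.0]])

S = (Sigma + Sigma.T) / 2              # symmetrize (a no-op here, since Sigma is already symmetric)
eigvals, eigvecs = np.linalg.eigh(S)   # eigh returns eigenvalues in ascending order

lam_1 = eigvals[-1]                    # largest eigenvalue
v_1 = eigvecs[:, -1]                   # associated unit eigenvector

# The quadratic form at the top unit eigenvector equals lambda_1.
print(lam_1, v_1 @ S @ v_1)

# No other unit vector exceeds lambda_1.
rng = np.random.default_rng(0)
for _ in range(5):
    y = rng.standard_normal(3)
    y /= np.linalg.norm(y)
    assert y @ S @ y <= lam_1 + 1e-12
```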