9

Supposing $V$ is a finite dimensional vector space (over $\mathbb{R}$) of dimension $n$, and $A,B$ are symmetric positive definite linear mappings from $V$ to $V$, how can I show that in any orthonormal basis $\mathrm{tr}(AB) \geq 0$?

I noticed that since they are symmetric we have that $$\mathrm{tr}(AB) = \sum_{i=1}^n\sum_{j=1}^nA_{ij}B_{ji} = \sum_{i=1}^n\sum_{j=1}^nA_{ij}B_{ij}$$ which is the sum of the elements of the element-wise product of $A,B$. I don't know if this is helpful.
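For what it's worth, a quick numerical sanity check of that identity (a NumPy sketch, not a proof) seems to bear it out:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Two random symmetric positive definite matrices (M M^T + I is SPD).
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A = M1 @ M1.T + np.eye(n)
B = M2 @ M2.T + np.eye(n)

trace_AB = np.trace(A @ B)
elementwise_sum = np.sum(A * B)   # sum of the element-wise (Hadamard) product

print(trace_AB, elementwise_sum)  # these agree, and both are positive
assert np.isclose(trace_AB, elementwise_sum) and trace_AB > 0
```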

  • 0
    I think you may as well assume $V = \mathbb{R}^n$ and that $A$ and $B$ are matrices. Symmetric positive-definite linear maps don't make much sense except in the presence of a chosen isomorphism $V \cong V^*$. – 2012-02-25
  • 1
    If this is homework, can you assume the spectral theorem? That is, can you assume the nice properties that hold for the eigenvalues of symmetric, positive-definite matrices? – 2012-02-25
  • 0
    Yes, we may assume $V=\mathbb{R}^n$, and yes, you may use the spectral theorem. But remember the result must hold for all orthonormal bases, not just the one where $A$ is diagonal. – 2012-02-25

3 Answers

0

Here's another derivation (7 years later):

Let $A,B\succeq0$. Then the eigendecomposition of symmetric $B$ gives $B=\sum_{i=1}^n \lambda_i v_i v_i^T$. Therefore,

$$\begin{align} \operatorname{Tr}[AB]&=\operatorname{Tr}[A\sum_{i=1}^n \lambda_i v_i v_i^T]\\ &=\sum_{i=1}^n \lambda_i \operatorname{Tr}[Av_i v_i^T]\\ &=\sum_{i=1}^n \underbrace{\lambda_i}_{\geq0} \underbrace{v_i^TAv_i}_{\geq0} \\ &\geq 0 \end{align}$$

where the third equality uses the cyclic property of the trace, $\operatorname{Tr}[Av_i v_i^T]=\operatorname{Tr}[v_i^TAv_i]=v_i^TAv_i$, and the final inequality holds because each $\lambda_i\geq 0$ (as $B\succeq0$) and each $v_i^TAv_i\geq 0$ (as $A\succeq0$).
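If it helps, here is a small numerical illustration of the term-by-term argument (a sketch with NumPy, where `np.linalg.eigh` plays the role of the eigendecomposition):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A, B = M1 @ M1.T, M2 @ M2.T          # both positive semidefinite

# Eigendecomposition of the symmetric matrix B: columns of V are the eigenvectors v_i.
eigvals, V = np.linalg.eigh(B)

# Each term lambda_i * v_i^T A v_i is nonnegative, and the terms sum to tr(AB).
terms = np.array([lam * v @ A @ v for lam, v in zip(eigvals, V.T)])
assert np.all(terms >= -1e-9)
assert np.isclose(terms.sum(), np.trace(A @ B))
```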

Feel free to ask for any clarifications needed.

Edit: Here's the explanation of the eigendecomposition.

In matrix form, the eigen-equation is $BV=V\Lambda$, where $V$ is a matrix whose columns are the eigenvectors $\{v_i\}$ of $B$, and $\Lambda=\operatorname{diag}(\lambda_1,\ldots,\lambda_n)$ is a diagonal matrix with the eigenvalues of $B$ along the diagonal. Because $B$ is symmetric, $V$ can be chosen orthogonal, meaning its columns are orthonormal, so $VV^T=V^TV=I_n$. Multiplying $BV=V\Lambda$ on the right by $V^T$ then gives $B=BVV^T=V\Lambda V^T$.

$$\begin{align} \rightarrow B &= \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \begin{bmatrix} \lambda_1 & \cdots & 0\\ \vdots & \ddots & \vdots\\ 0 & \cdots & \lambda_n \end{bmatrix} \begin{bmatrix} v_1^T\\ \vdots\\ v_n^T \end{bmatrix}\\ &= \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \begin{bmatrix} \lambda_1v_1^T\\ \vdots\\ \lambda_nv_n^T \end{bmatrix}\\ &= \sum_{i=1}^n \lambda_i v_i v_i^T \end{align}$$

where this is a sum of outer products of the eigenvectors. Note that I wrote the matrices above in block form, as vectors whose entries are themselves vectors; this shorthand is valid and very convenient, but feel free to write it out and check me!
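Again just as a sanity check (a NumPy sketch, assuming nothing beyond `np.linalg.eigh`), you can verify the orthogonality of $V$ and the outer-product reconstruction numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))
B = M @ M.T                          # a symmetric matrix

eigvals, V = np.linalg.eigh(B)       # B V = V Lambda, with V orthogonal

# V V^T = V^T V = I, and B is rebuilt as a sum of outer products lambda_i v_i v_i^T.
assert np.allclose(V @ V.T, np.eye(n))
B_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, V.T))
assert np.allclose(B, B_rebuilt)
```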

13

As others have remarked, you might as well suppose that $A$ and $B$ are positive semidefinite matrices. By the spectral theorem we may write $A = X^{t}X$ and $B = Y^{t}Y$, where $X$ and $Y$ are $n \times n$ real matrices (for instance, the symmetric square roots $A^{1/2}$ and $B^{1/2}$). Then, by the cyclic property of the trace, ${\rm tr}(AB) = {\rm tr}(X^{t}XY^{t}Y) = {\rm tr}((YX^{t})(XY^{t})).$ The latter trace has the form ${\rm tr}(UU^{t})$ for the real $n \times n$ matrix $U = YX^{t}$, and such a trace is always non-negative, since ${\rm tr}(UU^{t}) = \sum_{i,j}U_{ij}^{2}$.
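As a concrete numerical sketch of this argument (using NumPy; here I take $X$ and $Y$ from Cholesky factorizations, which assumes strict positive definiteness, while the semidefinite case would use the symmetric square root instead):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A = M1 @ M1.T + np.eye(n)            # strictly positive definite
B = M2 @ M2.T + np.eye(n)

# Cholesky gives A = L_A L_A^T, so X = L_A^T satisfies A = X^T X (likewise for B).
X = np.linalg.cholesky(A).T
Y = np.linalg.cholesky(B).T

U = Y @ X.T                          # then tr(AB) = tr(U U^T) = ||U||_F^2 >= 0
assert np.isclose(np.trace(A @ B), np.trace(U @ U.T))
assert np.isclose(np.trace(U @ U.T), np.linalg.norm(U, 'fro') ** 2)
```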

8

Since this may be homework, I will only give hints.

  1. Without loss of generality you may assume that $V=\mathbb{R}^n$.

  2. Trace is independent of the basis you use. Thus it suffices to show this in the basis where $A$ is diagonal.

  3. A positive semi-definite matrix has nonnegative diagonal. Why?

  4. Putting 1-3 together, one needs to show that $\mathrm{tr}(AB)\geq 0$ where $A$ is a diagonal matrix with nonnegative entries and $B$ has nonnegative diagonal entries. (A numerical sketch following these hints appears below.)
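If it is useful, here is a small numerical sketch that follows the hints (NumPy, where `np.linalg.eigh` supplies the orthonormal basis in which $A$ is diagonal):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
M1, M2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A, B = M1 @ M1.T, M2 @ M2.T          # both positive semidefinite

# Hint 2: pass to an orthonormal basis (columns of Q) in which A is diagonal.
d, Q = np.linalg.eigh(A)             # Q^T A Q = diag(d), with d >= 0
B_new = Q.T @ B @ Q                  # still PSD, so its diagonal is nonnegative (hint 3)

# Hint 4: in this basis tr(AB) = sum_i d_i * (B_new)_ii, a sum of nonnegative terms.
terms = d * np.diag(B_new)
assert np.all(terms >= -1e-9)
assert np.isclose(terms.sum(), np.trace(A @ B))   # the trace is basis-independent
```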