Let $u_1,...,u_k$ be unit vectors in $\mathbb R^n$, with $k \geq n$, such that there exist scalars $c_1,...,c_k$ so that, for all $x \in \mathbb R^n$, $$\sum\limits_i c_i \langle x , u_i \rangle^2 = |x|^2,$$ or equivalently $$\sum\limits_i c_i \langle x , u_i \rangle u_i = x$$ for all $x \in \mathbb R^n$. This is the same as saying that the vectors $\sqrt{c_i}u_i$ form a Parseval frame. In these notes by Ball, it is asserted that this implies that if $T$ is a linear map of determinant $1$, then for at least one $i$, $|T u_i| \geq 1$. Is there an easy way to see this?
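The frame identity $\sum_i c_i \langle x, u_i \rangle u_i = x$ is easy to sanity-check numerically. Below is a minimal sketch using NumPy, on a hypothetical example in $\mathbb R^2$ (the standard basis rotated by $45°$, with $c_1 = c_2 = 1$); the vectors and scalars are my own choice, not from the notes.

```python
import numpy as np

# A hypothetical Parseval frame in R^2: the orthonormal basis rotated 45 degrees,
# with scalars c_1 = c_2 = 1.
u = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # rows are unit vectors u_i
c = np.array([1.0, 1.0])                              # scalars c_i

rng = np.random.default_rng(0)
x = rng.standard_normal(2)                            # arbitrary test vector

# Check the reconstruction identity sum_i c_i <x, u_i> u_i = x.
reconstruction = sum(ci * np.dot(x, ui) * ui for ci, ui in zip(c, u))
assert np.allclose(reconstruction, x)
```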
A linear map of determinant $1$ cannot decrease the length of all of the vectors in a Parseval frame
-
The idea is the following: The determinant is the product of the eigenvalues. Since the determinant is $1$, at least one of the eigenvalues has absolute value at least $1$. Let $u$ be the corresponding eigenvector. Write $u$ as a sum of the $u_i$'s and see what happens if they all shrink. – 2017-02-26
-
I was never able to figure this out. In particular, I do not understand the above comment. It seems to me that, if this strategy were to work, we ought to be able to prove this if we relax the requirement that $\det(T)=1$ to $T$ merely having an eigenvalue $\lambda$ with $|\lambda| \geq 1$. But this is clearly false, as seen by taking, e.g., $n=2$, $c_1 = c_2 = 1$, $u_1 = {\sqrt{2} \over 2} (1,1)$, $u_2 = {\sqrt{2} \over 2} (1,-1)$, and $T : (x,y) \mapsto (x,0)$. – 2017-09-30
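The counterexample in this comment can be checked directly. The sketch below confirms that this $T$ has an eigenvalue with $|\lambda| \geq 1$ yet strictly shrinks both frame vectors (and, as noted further down, has determinant $0$, not $1$):

```python
import numpy as np

# The frame and map from the comment above: u_1, u_2 at 45 degrees,
# T the projection (x, y) -> (x, 0).
u1 = np.array([1.0, 1.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0]) / np.sqrt(2)
T = np.array([[1.0, 0.0], [0.0, 0.0]])

eigenvalues = np.linalg.eigvals(T)
assert max(abs(eigenvalues)) >= 1         # T has eigenvalue 1
assert np.linalg.norm(T @ u1) < 1         # |T u1| = sqrt(2)/2 < 1
assert np.linalg.norm(T @ u2) < 1         # |T u2| = sqrt(2)/2 < 1
assert abs(np.linalg.det(T)) == 0         # but det(T) = 0, not 1
```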
-
Where in the notes is this assertion? – 2017-09-30
-
The first full paragraph on page 16. – 2017-09-30
-
Your example has determinant $0$, not $1$. The corresponding matrix is $\begin{bmatrix}1&0\\0&0\end{bmatrix}$. – 2017-09-30
-
Observe also that the scalars are positive in the notes. – 2017-09-30
-
Yes, I agree that the determinant is $0$. What I don't understand about your comment is that it seems to imply that all we need is an eigenvalue $\lambda$ with $|\lambda| \geq 1$, but perhaps I am misunderstanding. – 2017-09-30
1 Answer
The statement $\sum_ic_i\langle x,u_i\rangle u_i=x$ for all $x$ can be expressed in matrix form as $$ \sum_ic_i u_iu_i^t=I_n, $$ where $u^t$ denotes the transpose of $u$ and $I_n$ is the identity matrix. As $uu^t$ has trace $\lVert u\rVert^2$, taking the trace of the above expression gives $$ \sum_ic_i\lVert u_i\rVert^2=n.\qquad{\rm(1)} $$

For an $n\times n$ matrix $T$, $$ \sum_ic_i(Tu_i)(Tu_i)^t=TT^t $$ and, taking the trace again, $$ \sum_ic_i\lVert Tu_i\rVert^2={\rm tr}(TT^t). $$

The symmetric matrix $TT^t$ has nonnegative eigenvalues summing to its trace and whose product is ${\rm det}(T)^2$. By the AM-GM inequality, $$ \sum_ic_i\lVert Tu_i\rVert^2\ge n\,\lvert{\rm det}(T)\rvert^{2/n}.\qquad{\rm(2)} $$

The $c_i$ are assumed nonnegative, so if $T$ has determinant $1$ it cannot strictly reduce the length of every $u_i$: since the $u_i$ are unit vectors, $\lVert Tu_i\rVert<1$ for all $i$ would give $\sum_ic_i\lVert Tu_i\rVert^2<\sum_ic_i=n$ by (1), contradicting the lower bound $n$ from (2).
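The trace identities (1) and (2) can be verified numerically. The following sketch builds a hypothetical random Parseval frame in $\mathbb R^2$ (all names and the construction via a QR factorization are my own choices, not from the answer) and a random map rescaled to have $|\det T| = 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 2, 5

# Build a Parseval frame: an n x k matrix V with V V^t = I_n gives vectors
# v_i (the columns) with sum_i v_i v_i^t = I_n; write v_i = sqrt(c_i) u_i.
Q, _ = np.linalg.qr(rng.standard_normal((k, n)))  # Q has orthonormal columns
V = Q.T                                           # so V V^t = Q^t Q = I_n
c = np.linalg.norm(V, axis=0) ** 2                # c_i = |v_i|^2
U = V / np.linalg.norm(V, axis=0)                 # unit vectors u_i as columns

assert np.allclose(sum(ci * np.outer(ui, ui) for ci, ui in zip(c, U.T)), np.eye(n))
assert np.isclose(c.sum(), n)                     # identity (1): sum_i c_i = n

T = rng.standard_normal((n, n))
T /= abs(np.linalg.det(T)) ** (1 / n)             # rescale so |det(T)| = 1

lhs = sum(ci * np.linalg.norm(T @ ui) ** 2 for ci, ui in zip(c, U.T))
assert np.isclose(lhs, np.trace(T @ T.T))         # trace identity for T
assert lhs >= n - 1e-9                            # bound (2) with |det T| = 1
```

In particular, the last assertion shows that $\sum_i c_i\lVert Tu_i\rVert^2 \geq n$ whenever $|\det T| = 1$, which is exactly why not every $\lVert Tu_i\rVert$ can drop below $1$.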