
Let $x_1, \ldots, x_n$ be linearly independent column vectors in $\mathbb{R}^n$, and define $X_{ij} = x_i x_j^T$. I have to show that $\operatorname{rank}{\{ X_{ij} \}}_{i < j} = \frac{n(n-1)}{2}$. For subsets of pairs sharing a common index it is obvious, but in general I don't know what to do.

2 Answers


We have to show that $\{X_{ij},\,1\leq i<j\leq n\}$ is linearly independent. Let $\{a_{ij},\,1\leq i<j\leq n\}$ be real numbers such that $\sum_{1\leq i<j\leq n} a_{ij}X_{ij}=0$. Since $\{x_i,\,1\leq i\leq n\}$ is linearly independent, there is an invertible matrix $P$ such that for all $i$, $Px_i=e_i$, the $i$-th vector of the canonical basis of $\mathbb R^n$. Multiplying by $P$ on the left and $P^T$ on the right, we get \begin{align*} 0&= P\:\sum_{1\leq i<j\leq n} a_{ij}x_ix_j^T P^T\\ &= \sum_{1\leq i<j\leq n} a_{ij}Px_i(P\:x_j)^T\\ &=\sum_{1\leq i<j\leq n} a_{ij}e_ie_j^T\\ &=\sum_{1\leq i<j\leq n} a_{ij}E_{ij}, \end{align*} where $E_{ij}$ is the matrix whose $(i,j)$ entry is $1$ and all other entries are $0$. Reading off the $(i,j)$ entry for each fixed pair $i<j$, we get $a_{ij}=0$ for all $1\leq i<j\leq n$, which shows that the family $\{X_{ij},\,1\leq i<j\leq n\}$ is linearly independent. Since its cardinality is $\frac{n(n-1)}2$, we get the expected result.
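As a quick numerical sanity check of this argument (not part of the proof; the variable names and the choice $n=5$ are mine), one can stack the flattened matrices $X_{ij}$, $i<j$, as rows of a single matrix and verify with NumPy that the rank is $\binom n2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Columns of A are x_1, ..., x_n; a random Gaussian matrix is invertible
# with probability 1, so its columns are linearly independent.
A = rng.standard_normal((n, n))
xs = [A[:, i] for i in range(n)]

# Flatten each X_ij = x_i x_j^T (i < j) into a row of length n^2;
# the rank of the stacked matrix is the dimension of span{X_ij}.
rows = [np.outer(xs[i], xs[j]).ravel()
        for i in range(n) for j in range(i + 1, n)]
M = np.vstack(rows)

print(np.linalg.matrix_rank(M), n * (n - 1) // 2)  # both 10 for n = 5
```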

In fact, the sum can be taken over an arbitrary subset $S$ of $\left\{1,\ldots,n\right\}^2$, which proves that $\operatorname{rank}\{X_{ij},(i,j)\in S\}=\operatorname{card}(S)$.
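To illustrate the generalization, the same check can be run for an arbitrary index set $S$ (the pairs below, including a diagonal one, are chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
xs = [A[:, i] for i in range(n)]

# An arbitrary subset S of {0, ..., n-1}^2, diagonal pairs allowed.
S = [(0, 2), (1, 1), (3, 0), (2, 3)]
M = np.vstack([np.outer(xs[i], xs[j]).ravel() for (i, j) in S])

print(np.linalg.matrix_rank(M), len(S))  # both 4: rank equals card(S)
```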


Your question is a bit confused in that you have a set of $\binom{n}2$ matrices, all of rank $1$, and you're interested in knowing whether the dimension of the linear subspace of matrices they span is $\binom{n}2$, in other words whether they are linearly independent. That question is the same as whether the corresponding linear maps $\mathbb R^n\to\mathbb R^n$ are linearly independent. But for that one can use bases other than the canonical basis of $\mathbb R^n$ in which to express the matrices; you don't even need to use the same basis at departure and at arrival.

Now since $x_1,\ldots,x_n$ is a basis of $\mathbb R^n$, it has a "dual" basis $y_1,\ldots,y_n$ for which the linear maps $x_i^T:\mathbb R^n\to\mathbb R$ are the coordinate functions (so $x_i^Ty_j=\delta_{i,j}$ for all $i,j$). Then using $y_1,\ldots,y_n$ as basis at departure and $x_1,\ldots,x_n$ at arrival, the linear map that has $X_{i,j}$ as its matrix on the standard basis gets the matrix $E_{i,j}$ (as in Davide Giraudo's answer): the basis vector $y_j$ is mapped to $x_i$ and all other vectors of the basis $y_1,\ldots,y_n$ are mapped to $\vec0$. In this representation it is obvious that the linear maps $X_{i,j}$ are linearly independent (they even remain independent if one lets $i$ and $j$ range through all $n^2$ possible combinations).
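Here is a short NumPy sketch of this change of basis, under my own naming: with $A$ the matrix whose columns are $x_1,\ldots,x_n$, the dual basis vectors $y_j$ are the columns of $(A^T)^{-1}$, and the matrix of $X_{ij}$ in the bases $(y_k)$ at departure and $(x_k)$ at arrival is $A^{-1}X_{ij}(A^T)^{-1}$, which comes out as $E_{ij}$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))     # columns are x_1, ..., x_n
Y = np.linalg.inv(A).T              # columns y_j satisfy x_i^T y_j = delta_{ij}

i, j = 1, 3
X_ij = np.outer(A[:, i], A[:, j])   # X_ij = x_i x_j^T on the standard basis

# Matrix of the same linear map with basis (y_k) at departure and (x_k)
# at arrival: its columns are the x-coordinates of X_ij @ y_k, i.e. A^{-1} X_ij Y.
B = np.linalg.inv(A) @ X_ij @ Y
print(np.round(B, 10))              # E_ij: 1 in entry (i, j), zeros elsewhere
```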

This answer is basically the same as the one cited above, but it explains where the multiplications by the matrix $P$ on the left and $P^T$ on the right come from: they are basis transformations, to the $x$'s at arrival and to the $y$'s at departure, respectively.