
Let $\mathbf{v}_{1},\mathbf{v}_{2},\cdots,\mathbf{v}_{m}$ be $m$ vectors in $n$-dimensional space. Their Gram determinant is defined by:

$\Gamma=\left|\begin{array}{cccc} \mathbf{v}_{1}^{2} & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{2}\right) & \cdots & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{m}\right)\\ \left(\mathbf{v}_{2}\cdot\mathbf{v}_{1}\right) & \mathbf{v}_{2}^{2} & \cdots & \left(\mathbf{v}_{2}\cdot\mathbf{v}_{m}\right)\\ \cdots & \cdots & \cdots & \cdots\\ \left(\mathbf{v}_{m}\cdot\mathbf{v}_{1}\right) & \left(\mathbf{v}_{m}\cdot\mathbf{v}_{2}\right) & \cdots & \mathbf{v}_{m}^{2} \end{array}\right|$

If $v_{ij}$ is the $j$th component of $\mathbf{v}_{i}$, prove that

$\Gamma=\sum\left|\begin{array}{cccc} v_{1s_{1}} & v_{1s_{2}} & \cdots & v_{1s_{m}}\\ v_{2s_{1}} & v_{2s_{2}} & \cdots & v_{2s_{m}}\\ \cdots & \cdots & \cdots & \cdots\\ v_{ms_{1}} & v_{ms_{2}} & \cdots & v_{ms_{m}} \end{array}\right|^{2}$

where the summation is extended over all integers $s_{1},s_{2},\cdots,s_{m}$ from $1$ to $n$ with $s_{1}<s_{2}<\cdots<s_{m}$.
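(For instance, in the smallest non-trivial case $m=2$, $n=3$, this is Lagrange's identity
$$\mathbf{v}_{1}^{2}\,\mathbf{v}_{2}^{2}-\left(\mathbf{v}_{1}\cdot\mathbf{v}_{2}\right)^{2}=\sum_{1\le s_{1}<s_{2}\le 3}\left(v_{1s_{1}}v_{2s_{2}}-v_{1s_{2}}v_{2s_{1}}\right)^{2},$$
the right-hand side being the squared length of $\mathbf{v}_{1}\times\mathbf{v}_{2}$.)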

  • Perhaps I should have said it explicitly in the question. To avoid future confusion: I assume that the basis vectors are orthonormal. (2011-12-21)

2 Answers


Here's how to begin:

Entry $(i,j)$ equals $\mathbf{v}_{i}\cdot\mathbf{v}_{j} = \sum_{s=1}^n v_{is} v_{js}$. Here $s$ is just a dummy variable, so we don't need to call it $s$ in every entry; let's call it $s_j$ in each of the sums in column $j$. Then, since the determinant is linear in each column separately, we can pull the sums and the factors $v_{j s_{j}}$ outside. Like this for the first column:

$$ \Gamma=\left|\begin{array}{cccc} \sum_{s_1} v_{1 s_{1}} v_{1 s_{1}} & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{2}\right) & \cdots & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{m}\right)\\ \sum_{s_1} v_{2 s_{1}} v_{1 s_{1}} & \mathbf{v}_{2}^{2} & \cdots & \left(\mathbf{v}_{2}\cdot\mathbf{v}_{m}\right)\\ \cdots & \cdots & \cdots & \cdots\\ \sum_{s_1} v_{m s_{1}} v_{1 s_{1}} & \left(\mathbf{v}_{m}\cdot\mathbf{v}_{2}\right) & \cdots & \mathbf{v}_{m}^{2} \end{array}\right| = \sum_{s_1} v_{1 s_{1}} \left|\begin{array}{cccc} v_{1 s_{1}} & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{2}\right) & \cdots & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{m}\right)\\ v_{2 s_{1}} & \mathbf{v}_{2}^{2} & \cdots & \left(\mathbf{v}_{2}\cdot\mathbf{v}_{m}\right)\\ \cdots & \cdots & \cdots & \cdots\\ v_{m s_{1}} & \left(\mathbf{v}_{m}\cdot\mathbf{v}_{2}\right) & \cdots & \mathbf{v}_{m}^{2} \end{array}\right|. $$

And doing the same for each column:

$$ \Gamma = \sum_{s_1} \sum_{s_2} \dots \sum_{s_m} v_{1 s_{1}} v_{2 s_{2}} \dots v_{m s_{m}} \left|\begin{array}{cccc} v_{1s_{1}} & v_{1s_{2}} & \cdots & v_{1s_{m}}\\ v_{2s_{1}} & v_{2s_{2}} & \cdots & v_{2s_{m}}\\ \cdots & \cdots & \cdots & \cdots\\ v_{ms_{1}} & v_{ms_{2}} & \cdots & v_{ms_{m}} \end{array}\right|. $$

If two of the indices $s_j$ coincide, this last determinant has two equal columns and therefore vanishes, so we need only sum over tuples $(s_1,\dots,s_m)$ with all indices distinct. Can you see how to continue from here?
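A quick numerical sanity check of both the expansion above and the claimed final formula (my own sketch, not part of the answer; the dimensions $m=3$, $n=5$ and the random vectors are arbitrary choices):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5
V = rng.standard_normal((m, n))  # row i holds the components of v_{i+1}

# Gram determinant: det of the m x m matrix of mutual dot products
gamma = np.linalg.det(V @ V.T)

# Expansion over ALL tuples (s_1, ..., s_m): the product
# v_{1 s_1} ... v_{m s_m} times the determinant whose columns are s_1, ..., s_m
total = sum(
    np.prod([V[j, s[j]] for j in range(m)]) * np.linalg.det(V[:, list(s)])
    for s in itertools.product(range(n), repeat=m)
)

# Claimed formula: sum of squared m x m minors over increasing tuples
squares = sum(
    np.linalg.det(V[:, list(cols)]) ** 2
    for cols in itertools.combinations(range(n), m)
)

print(np.isclose(gamma, total), np.isclose(gamma, squares))  # True True
```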

  • Yes, I get it. Thanks. (2011-12-20)

The most elegant solution is probably to extend the given scalar product on your $n$-dimensional vector space $V$ to the exterior power $\Lambda ^m(V)$.
The recipe is that any orthonormal basis $e_1,...,e_n$ of $V$ yields an orthonormal basis $e_{s_1 }\wedge ...\wedge e_{s_m }$ $(s_{1}<s_{2}<\cdots<s_{m})$ of $\Lambda ^m(V)$.

As a consequence, if you write $v_i=\Sigma v_{ij}e_j$ you get $v_1\wedge...\wedge v_m=\Sigma c_{s_1...s_m} e_{s_1 }\wedge ...\wedge e_{s_m } \quad (+)$ with

$c_{s_1...s_m}= \left|\begin{array}{cccc} v_{1s_{1}} & v_{1s_{2}} & \cdots & v_{1s_{m}}\\ v_{2s_{1}} & v_{2s_{2}} & \cdots & v_{2s_{m}}\\ \cdots & \cdots & \cdots & \cdots\\ v_{ms_{1}} & v_{ms_{2}} & \cdots & v_{ms_{m}} \end{array}\right| \quad \quad (*)$

Then, taking the squared length of both sides of the equality $(+)$, you obtain
$||v_1\wedge...\wedge v_m||^2=\Sigma |c_{s_1...s_m}|^2\quad (**)$

Now remember that for $m$ independent vectors $v_1,..., v_m$ (everything is trivial if they are dependent) we have the equality of squared lengths $||v_1\wedge...\wedge v_m||^2=\Gamma =\left|\begin{array}{cccc} \mathbf{v}_{1}^{2} & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{2}\right) & \cdots & \left(\mathbf{v}_{1}\cdot\mathbf{v}_{m}\right)\\ \left(\mathbf{v}_{2}\cdot\mathbf{v}_{1}\right) & \mathbf{v}_{2}^{2} & \cdots & \left(\mathbf{v}_{2}\cdot\mathbf{v}_{m}\right)\\ \cdots & \cdots & \cdots & \cdots\\ \left(\mathbf{v}_{m}\cdot\mathbf{v}_{1}\right) & \left(\mathbf{v}_{m}\cdot\mathbf{v}_{2}\right) & \cdots & \mathbf{v}_{m}^{2} \end{array}\right|\quad \quad (***)$ (the determinant of the Gram matrix).

Replacing both sides of $(**)$ by their values given in $(*)$ and $(***)$ proves the required formula.
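A small numerical illustration of $(**)=(***)$ (a sketch under my own conventions: the rows of `V` hold the $v_i$, and a random orthonormal basis is produced by a QR factorization): the sum of squared coefficients $c_{s_1...s_m}$ comes out the same for every orthonormal basis and equals the Gram determinant.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 4
V = rng.standard_normal((m, n))    # rows are v_1, ..., v_m
gamma = np.linalg.det(V @ V.T)     # (***): ||v_1 ∧ ... ∧ v_m||^2

for _ in range(3):
    # Random orthonormal basis e_1, ..., e_n (the columns of Q)
    Q, _R = np.linalg.qr(rng.standard_normal((n, n)))
    C = V @ Q                      # coordinates v_{ij} = v_i · e_j
    # (*) and (**): sum of squared m x m minors of the coordinate matrix
    total = sum(np.linalg.det(C[:, list(cols)]) ** 2
                for cols in itertools.combinations(range(n), m))
    print(np.isclose(total, gamma))  # True, independently of the basis
```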

Edit
1) I forgot to mention that formula $(**)$ can be thought of as a generalization of Pythagoras' theorem, with $||v_1\wedge...\wedge v_m||^2$ playing the role of the squared length of the hypotenuse $v_1\wedge...\wedge v_m$.

2) I feel that the euclidean structure inherited by $\Lambda ^m V$ from a euclidean structure on $V$ is not as well known as it deserves to be.
One of the rare treatments of this theme in the textbook literature is given in MacLane-Birkhoff's Algebra (cf. especially Theorem 18, page 557).

  • @L.T. You should use $(+)$ and bilinearity to calculate $(v_1\wedge...\wedge v_m|v_1\wedge...\wedge v_m)$. We know that $(e_{s_1 }\wedge ...\wedge e_{s_m }|e_{t_1 }\wedge ...\wedge e_{t_m })$ is the determinant of $(g_{s_i t_j})$. The calculation then reduces to calculating the $c_{s_1...s_m}$'s. I don't know the formula, but it no longer has anything to do with exterior algebra. The problem is reduced to: in a euclidean space, calculate the coefficients of a vector in a fixed basis, given the mutual scalar products of all pairs of basis vectors. (2011-12-20)