
Given a set of vectors $v_1 \cdots v_n$, the $n\times n$ Gram matrix $G$ is defined as $G_{i,j}=v_i \cdot v_j$

Since the dot product is symmetric, $G$ is a symmetric matrix (Hermitian, if the vectors are complex and the standard inner product is used).

I'm trying to remember why $|G|=0$ if and only if the vectors are linearly dependent.

4 Answers


JasonMond's "only if" part is not as general as it could be, because it assumes that $A$ is square. In the following, I complete the proof so that it holds whether $A$ is square or not.

Let $G = A^T A$. If the column vectors of $A$ are linearly dependent, there exists a vector $u \neq 0$ such that $ A u = 0. $ It follows that $ 0 = A^T A u = G u. $ Since $u \neq 0$, $G$ is not invertible.

Conversely, if $G$ is not invertible, there exists a vector $v \neq 0$ such that $ G v = 0. $ It follows that $ 0 = v^T G v = v^T A^T A v = (A v)^T A v = \lVert A v\rVert^2 $ and therefore that $ A v = 0. $ Since $v \neq 0$, the column vectors of $A$ are linearly dependent. QED.
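As a quick numerical sanity check of both directions (a sketch using NumPy; the example matrices are arbitrary choices, not from the proof above):

```python
import numpy as np

# Dependent case: the third column equals the sum of the first two,
# so the columns of A_dep are linearly dependent and det(G) should be 0.
A_dep = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 1.0, 2.0]])
G_dep = A_dep.T @ A_dep            # Gram matrix G = A^T A
print(np.linalg.det(G_dep))        # ~0 (up to floating-point error)

# Independent case with a NON-square A: 3 independent vectors in R^4.
# A has no determinant, but the 3x3 Gram matrix is still invertible.
A_ind = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0]])
G_ind = A_ind.T @ A_ind
print(np.linalg.det(G_ind))        # nonzero
```

Note that the second example is exactly the situation the square-matrix argument misses: $\det A$ is undefined, yet $\det G$ still detects independence.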

  • Beautiful proof! (2017-11-28)

Let $A$ be the matrix whose columns are the vectors $v_1, v_2, \ldots, v_n$ (so $A$ is square when the vectors live in $\mathbb{R}^n$). Then the Gram matrix is $A^T A$, so $\det G = (\det A)^2$.

Edit: Here's an explanation that ignores the dimension of the ambient space. If $\langle \cdot, \cdot \rangle$ denotes the inner product, then the Gram matrix is precisely the matrix describing the inner product

$\langle x_1 v_1 + ... + x_n v_n, y_1 v_1 + ... + y_n v_n \rangle$

on $\mathbb{R}^n$. It's not hard to see that the vectors $v_i$ are linearly independent if and only if the above inner product is positive-definite. But we can write the above as $v^T G v$ where $v \in \mathbb{R}^n$ and $G$ is the Gram matrix, and then we know that the inner product is positive-definite if and only if $G$ is invertible (since $G$ is positive semidefinite, it is invertible if and only if its eigenvalues are all positive).

  • I found this looking for help with a proof: I essentially want to show $\det(G) = \det(A)^2$, but I cannot assume that my linearly independent set of vectors forms a basis for the space they come from; they might, but they could also be a basis for a subspace. However, since the ambient dimension doesn't matter (it must be at least the number of linearly independent vectors, by definition), I believe I can take $n$ to be the number of linearly independent vectors, so that $A^T$ and $A$ are square and the determinant makes sense, right? Because that choice of $n$ makes the product the square matrix $G$. (2013-11-25)
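The identity $\det G = (\det A)^2$ for square $A$, and the contrast with the non-square case raised in the comment, can be checked numerically (a sketch; the random matrices are illustrative, generated from a fixed seed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Square case: n vectors in R^n, so det(G) = det(A)^2 exactly.
A = rng.standard_normal((4, 4))
G = A.T @ A
print(np.linalg.det(G), np.linalg.det(A) ** 2)   # equal

# Non-square case: 2 vectors in R^4 (almost surely independent).
# A has no determinant, but G = A^T A is 2x2 and invertible,
# with det(G) > 0.
B = rng.standard_normal((4, 2))
GB = B.T @ B
print(np.linalg.det(GB))
```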

Here's another way to look at it.

If $A$ is the matrix with columns $v_1,\ldots,v_n$, and the columns are not linearly independent, it means there exists some vector $u \in \mathbb{R}^n$ where $u \neq 0$ such that $A u = 0$. Since $G = A^T A$, this means $G u = A^T A u = A^T 0 = 0$ or that there exists a vector $u \neq 0$ such that $G u = 0$. So $G$ is not of full rank. This proves the "if" part.

The "only if" part -- i.e. if $|G| = 0$, the vectors are not linearly independent -- follows because $|G| = |A^T A| = |A|^2 = 0$ which implies that $|A| = 0$ and so $v_1,\ldots,v_n$ are not linearly independent.


For one thing, the Gram determinant is the square of the $n$-dimensional volume of the parallelepiped spanned by the given vectors. That determinant is $0$ exactly when the parallelepiped is not full-dimensional, that is, when one of the vectors lies in the span of the other $n-1$ vectors, which in turn means linear dependence.
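The volume interpretation is easy to verify in a small case: for two vectors $u, v \in \mathbb{R}^3$, the area of the parallelogram they span is $\lVert u \times v \rVert$, which should equal $\sqrt{\det G}$. A sketch with arbitrarily chosen vectors:

```python
import numpy as np

# Two illustrative vectors in R^3.
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])

A = np.column_stack([u, v])        # 3x2 matrix with columns u, v
G = A.T @ A                        # 2x2 Gram matrix

area_from_gram = np.sqrt(np.linalg.det(G))
area_from_cross = np.linalg.norm(np.cross(u, v))
print(area_from_gram, area_from_cross)   # the two areas agree
```

Replacing $v$ with a multiple of $u$ collapses the parallelogram to a segment, and both quantities drop to $0$.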