
Suppose $\phi_1, \phi_2, \dots, \phi_k \in (\mathbb{R}^n)^*$, and $\mathbf{v}_1, \dots, \mathbf{v}_k \in \mathbb{R}^n$

$(\mathbb{R}^n)^*$ denotes the space of all linear maps from $\mathbb{R}^n$ to $\mathbb{R}$.

Is it true that

$\phi_1\wedge\dots\wedge\phi_k(\mathbf{v}_1, \dots, \mathbf{v}_k) = \det[\phi_i(\mathbf{v}_j)]$?

This is a homework question (from Multivariable Mathematics, Shifrin, Ex. 18 on page 347), and there is a hint: first express each $\phi_i$ in the form

$\phi_i = \sum\limits_{j=1}^n a_{ij}dx_j$

where the $dx_j$ are the standard basis 1-forms,

and the hint also says that it suffices to prove the equality in the case where all the $v_j$ are standard basis vectors,

i.e. $v_1=e_{j_1}, \dots, v_k = e_{j_k}$

From the setup of the hint, it is quite obvious that the right side equals:

$ \begin{vmatrix} a_{1j_1}&\cdots&a_{1j_k}\\ \vdots&\ddots&\vdots\\ a_{kj_1}&\cdots&a_{kj_k}\\ \end{vmatrix} $

However, I cannot understand how to show the left side equals this determinant.

Also, I am not sure why it suffices to show that the equality holds when the $v_j$ are all standard basis vectors. Is it because I can express every $v_j$ as a linear combination of the standard basis vectors?

Thank you very much!


2 Answers


Consider the following alternation operator: $ \operatorname{Alt}(P) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma)\,P(x_{\sigma(1)}, \dots, x_{\sigma(n)}) $ where $P(x_1,\dots,x_n)$ is a polynomial in $n$ variables.

Let us regard a matrix $A=(a_{i j})_{i,j=1,\dots,k}$ as a collection of its columns: $ A = (A_1, \dots, A_k) $

Operators $dx_i$ act on these columns as $ dx_i(A_j) := a_{i j} $

The determinant of $A$ may be defined as $ \det(A) := \operatorname{Alt}(dx_1(A_1)\dots dx_k(A_k)) $ where the alternation is applied to the polynomial in variables $dx_i$ (in other words, the order of $A_j$'s is kept unchanged when alternating).

The wedge product's definition is $ \phi_1 \wedge \dots \wedge \phi_k := \operatorname{Alt}(\phi_1 \otimes \dots \otimes \phi_k) $

Substituting the data from the question shows that $ \phi_1\wedge\dots\wedge\phi_k(\mathbf{v}_1, \dots, \mathbf{v}_k) = \det[\phi_i(\mathbf{v}_j)] $ holds tautologically for $\phi_i = dx_i$; the general case then follows because both sides are linear in each $\phi_i$.
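The tautology above can be sanity-checked numerically. The sketch below (NumPy; the helper names `perm_sign` and `wedge` are my own) evaluates the wedge product via the alternating-sum definition, in the "determinant" convention used here (no $1/k!$ normalization), and compares it with $\det[\phi_i(\mathbf{v}_j)]$ for random covectors and vectors:

```python
import itertools

import numpy as np

def perm_sign(p):
    """Sign of a permutation given as a tuple of 0-based indices."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def wedge(phis, vs):
    """(phi_1 ^ ... ^ phi_k)(v_1, ..., v_k) via Alt(phi_1 x ... x phi_k),
    i.e. sum over sigma of sgn(sigma) * prod_i phi_i(v_{sigma(i)})
    (determinant convention: no 1/k! factor)."""
    k = len(phis)
    return sum(perm_sign(p) * np.prod([phis[i] @ vs[p[i]] for i in range(k)])
               for p in itertools.permutations(range(k)))

rng = np.random.default_rng(42)
k, n = 3, 6
phis = rng.standard_normal((k, n))   # row i represents the covector phi_i
vs = rng.standard_normal((k, n))     # row j is the vector v_j

# The matrix [phi_i(v_j)] and the claimed identity.
M = phis @ vs.T
assert np.isclose(wedge(phis, vs), np.linalg.det(M))
```

Note that with the other common convention, where $\operatorname{Alt}$ carries a $1/k!$ factor, the identity would pick up that factor on one side.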

  1. We have to show that $\phi_1\wedge \dots \wedge \phi_k(v_1, \dots, v_k) = \det[\phi_i(v_j)]$ is equivalent to

    $\phi_1\wedge \dots \wedge \phi_k(e_{j_1}, \dots, e_{j_k}) = \det[\phi_i(e_{j_l})]$, for $i = 1, \dots, k$ and $l = 1, \dots, k$.

    When the $v_j$ are linearly dependent, both sides are equal to zero, so the equality holds trivially.

    When the $v_j$ are linearly independent, these $k$ vectors span a $k$-dimensional subspace of $\mathbb{R}^n$; let $e_{j_1}, \dots, e_{j_k}$ be standard basis vectors that also span this subspace.

    Hence, each $e_{j_l}$ could be expressed as a linear combination of the $v_j$ vectors.

    $e_{j_1} = c_{11}v_1 + \dots + c_{1k}v_k$

    $e_{j_2} = c_{21}v_1 + \dots + c_{2k}v_k$

    $\vdots$

    $e_{j_k} = c_{k1}v_1 + \dots + c_{kk}v_k$

    Since $k$-forms and determinants are both multilinear, we can perform these linear combinations on both sides of the original equation:

    $\phi_1\wedge \dots \wedge \phi_k(c_{11}v_1 + \dots + c_{1k}v_k, \dots, c_{k1}v_1 + \dots + c_{kk}v_k) = \det[\phi_i(c_{l1}v_1 + \dots + c_{lk}v_k)]$

    $\phi_1\wedge \dots \wedge \phi_k(e_{j_1}, \dots, e_{j_k}) = \det[\phi_i(e_{j_l})]$

    Hence, the original equality is equivalent to the equality obtained by substituting the appropriate standard basis vectors (for the converse direction, expand each $v_j$ in the standard basis and use multilinearity in the same way).

  2. Now we have to explain why the equality holds.

    On the right-hand side, we have:

    \begin{vmatrix} \phi_1(e_{j_1})&\dots&\phi_1(e_{j_k})\\ \vdots&\ddots&\vdots\\ \phi_k(e_{j_1})&\dots&\phi_k(e_{j_k}) \end{vmatrix}

    for each $\phi_i(e_{j_l})$, we have $\phi_i = \sum\limits_{j = 1}^n a_{ij}dx_{j}$

    Since each $dx_j$ is a linear 1-form, $\phi_i(e_{j_l})$ is equal to the sum of the terms $a_{ij}\,dx_j(e_{j_l})$.

    Because $e_{j_l}$ is a standard basis vector, only one term is nonzero, namely $a_{ij_l}\,dx_{j_l}(e_{j_l})$.

    Hence, $\phi_i(e_{j_l}) = 0+\dots+0 + a_{ij_l}dx_{j_l}(e_{j_l})+0+\dots+0 = a_{ij_l}$

    Hence, the determinant on the right hand side becomes \begin{vmatrix} a_{1j_1}&\dots&a_{1j_k}\\ \vdots&\ddots&\vdots\\ a_{kj_1}&\dots&a_{kj_k} \end{vmatrix}

    Now let's examine the left hand side:

    Express each $\phi_i = \sum\limits_{j = 1}^n a_{ij}dx_j$

    Hence,

    $\phi_1\wedge \dots \wedge \phi_k = \sum\limits_{\text{all possible }I} C_I\, dx_I$, where $I = (i_1 < \dots < i_k)$ ranges over increasing $k$-tuples of indices from $1$ to $n$, and $dx_I = dx_{i_1}\wedge\dots\wedge dx_{i_k}$.

    If we apply this $k$-form to $e_{j_1}, \dots, e_{j_k}$, exactly one $I$ produces a nonzero number, namely $I = (j_1, j_2, \dots, j_k)$.

    Hence, the only term that produces a nonzero result is

    $C_{j_1, \dots, j_k}\,dx_{j_1}\wedge\dots\wedge dx_{j_k}(e_{j_1},\dots, e_{j_k}) = C_{j_1, \dots, j_k}$

    Now we simply need to figure out what $C_{j_1, \dots, j_k}$ is.

    We realize that $C_{j_1, \dots, j_k}$ collects, over all permutations of $(j_1, \dots, j_k)$, the signed products of the corresponding coefficients. Thus,

    $C_{j_1, \dots, j_k} = \sum\limits_{\sigma \in S_k} \operatorname{sgn}(\sigma)\prod\limits_{i = 1}^k a_{i j_{\sigma(i)}}$,

    which is precisely the determinant \begin{vmatrix} a_{1j_1}&\dots&a_{1j_k}\\ \vdots&\ddots&\vdots\\ a_{kj_1}&\dots&a_{kj_k} \end{vmatrix} and this equals the right-hand side.

    Hence, the original statement is proven.
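The computation in step 2 can be sanity-checked numerically. The sketch below (NumPy; the helper names are my own, and the index set `J` is an arbitrary choice) draws random coefficients $a_{ij}$, evaluates $\phi_1\wedge\dots\wedge\phi_k$ on $e_{j_1},\dots,e_{j_k}$ via the alternating-sum convention, computes $C_{j_1,\dots,j_k}$ from the permutation formula above, and compares both with the determinant of the $k\times k$ submatrix of $(a_{ij})$ with columns $j_1,\dots,j_k$:

```python
import itertools

import numpy as np

def perm_sign(p):
    """Sign of a permutation given as a tuple of 0-based indices."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def wedge(phis, vs):
    """(phi_1 ^ ... ^ phi_k)(v_1, ..., v_k) via the alternating sum
    sum over sigma of sgn(sigma) * prod_i phi_i(v_{sigma(i)})."""
    k = len(phis)
    return sum(perm_sign(p) * np.prod([phis[i] @ vs[p[i]] for i in range(k)])
               for p in itertools.permutations(range(k)))

rng = np.random.default_rng(0)
k, n = 3, 5
a = rng.standard_normal((k, n))   # phi_i = sum_j a[i, j] dx_j, so row i is phi_i
J = [0, 2, 4]                     # an increasing multi-index j_1 < ... < j_k
E = np.eye(n)

# Left-hand side: the wedge product evaluated on e_{j_1}, ..., e_{j_k}.
lhs = wedge(a, E[J])

# C_{j_1,...,j_k} from the permutation sum in the text.
C = sum(perm_sign(s) * np.prod([a[i, J[s[i]]] for i in range(k)])
        for s in itertools.permutations(range(k)))

# Right-hand side: det of the k x k submatrix with columns j_1, ..., j_k.
rhs = np.linalg.det(a[:, J])

assert np.isclose(lhs, C) and np.isclose(lhs, rhs)
```

All three quantities agree because each is the same Leibniz-type sum over $S_k$ of signed products $a_{1 j_{\sigma(1)}} \cdots a_{k j_{\sigma(k)}}$.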