
My question is really simple:

Let $E$ be a vector space and $A_r(E)$ be the vector space of the alternating $r$-linear maps $\varphi:E\times\ldots \times E\to \mathbb R$. If $v_1,\ldots,v_r$ are linearly independent vectors, can we find $\omega\in A_r(E)$ such that $\omega(v_1,\ldots,v_r)\neq 0$? Is the converse true?

  • What's $E$? Some finite-dimensional vector space? Perhaps an arbitrary vector space? – 2017-02-05
  • @JohnHughes an arbitrary vector space – 2017-02-05

2 Answers


Hint: at least in the case where $E = \Bbb R^k$, consider, for nonzero $u \in E$, the map $\phi_u$ with matrix $u^t$ (i.e., it's just a multiple of "project onto $u$"). Then $\phi_u$ is a $1$-linear map on $E$. What does $\phi_{v_1} \wedge \ldots \wedge \phi_{v_r}$ look like when applied to your vectors? If that's too tough, first suppose that the $v_j$ are pairwise orthogonal.
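Making the hint concrete (using the standard inner product on $\Bbb R^k$ and the usual determinant convention for a wedge of $1$-forms):

```latex
(\phi_{v_1}\wedge\cdots\wedge\phi_{v_r})(v_1,\dots,v_r)
  = \det\bigl[\phi_{v_i}(v_j)\bigr]_{i,j=1}^{r}
  = \det\bigl[\langle v_i, v_j\rangle\bigr]_{i,j=1}^{r},
```

which is the Gram determinant of $v_1,\dots,v_r$, and the Gram determinant is nonzero exactly when the vectors are linearly independent.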


If $n=\dim(E)\geqslant r$, then such functionals exist (and when $r=n$, the functional is unique up to a constant), and they can be written down directly using any basis. More precisely, we can extend $\{v_1,\dots,v_r\}$ to a basis $\{v_1,\dots,v_r\}\cup\{v_{r+1},\dots,v_n\}$ of $E$. (Compare with John's answer.)

If $n< r$, then there are no $r$ linearly independent vectors, so the answer is also yes: the antecedent of the implication is false, so it holds vacuously.

If $\dim(E)=\infty$, then the answer to your question depends on whether you assume the axiom of choice (AC) or not.

If you assume AC, then the answer is yes: we can complete $\{v_1,\dots,v_r\}$ to a basis $\{v_i\}_{i\in I}$, define $\omega$ on $r$-tuples of basis vectors by $\omega(v_1,\dots,v_r)=1$ and $0$ on every other $r$-tuple (up to the signs forced by antisymmetry), and extend by multilinearity.
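A compact way to phrase the same construction (a sketch, using the dual coordinate functionals $v_i^*$ of the chosen basis, defined by $v_i^*(v_j)=\delta_{ij}$):

```latex
\omega = v_1^* \wedge \cdots \wedge v_r^*,
\qquad
\omega(v_1,\dots,v_r)
  = \det\bigl[v_i^*(v_j)\bigr]_{i,j=1}^{r}
  = \det I_r = 1 \neq 0.
```

Note that defining the $v_i^*$ on all of $E$ is exactly where the basis (hence AC, in the infinite-dimensional case) is used.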

If you do not assume AC, then in general there is no way to construct such an $\omega$.


The converse is always true. You prove it by assuming that $v_1,\dots,v_r$ are linearly dependent and plugging these vectors into $\omega$: multilinearity and the alternating property force $\omega(v_1,\dots,v_r)=0$ for every $\omega\in A_r(E)$.
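For instance, if $v_r = \sum_{i=1}^{r-1} c_i v_i$, then multilinearity and the alternating property give:

```latex
\omega(v_1,\dots,v_{r-1},v_r)
  = \sum_{i=1}^{r-1} c_i\, \omega(v_1,\dots,v_{r-1},v_i) = 0,
```

since each term on the right has a repeated argument and therefore vanishes.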