
If I have a finite-dimensional complex vector space $V$ with a lattice $\Gamma$ in $V$, I consider the complex linear span in $V\oplus \overline{V}$ of the elements $\gamma \oplus (-\bar{\gamma})$ with $\gamma \in \Gamma$.

How can one describe this span? Is the result the whole $V\oplus \overline{V}$ or smaller?

Remark: here a lattice means a full lattice, i.e. a free subgroup generated by an $\mathbf{R}$-basis of $V$.

Thanks!

1 Answer

I'm not quite sure what $\overline{V}$ stands for. Presumably it is just another copy of $\mathbf{C}^n$, where $n=\dim_{\mathbf{C}} V$, and the bar is there only as a reminder that the second component of the vectors under consideration is the negated complex conjugate of the first. An alternative interpretation would be that scalar multiplication in the bar-version of $V$ is preceded by complex conjugation; in other words, if $y\in\overline{V}$, then the components of $w\cdot y$ are the components of $y$ multiplied by $\overline{w}$.
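To make the distinction concrete, here is a small numpy sketch of the two possible scalar actions on the $\overline{V}$ component (my own illustration, not part of the question; the function names are just placeholders):

```python
import numpy as np

def scale_plain(w, y):
    """First reading: V-bar is just another copy of C^n,
    so a scalar w acts componentwise in the usual way."""
    return w * y

def scale_conjugated(w, y):
    """Second reading: scalar multiplication on V-bar is preceded by
    conjugation, i.e. the components of w.y are conj(w) times those of y."""
    return np.conjugate(w) * y

y = np.array([1 + 2j, -3j])
w = 2 - 1j
print(scale_plain(w, y))        # components multiplied by w = 2 - 1j
print(scale_conjugated(w, y))   # components multiplied by conj(w) = 2 + 1j
```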

The answer to your question depends on what you meant with $\overline{V}$.

First assume that scalar multiplication on $\overline{V}$ is the usual one. Then your set of vectors spans all of $V\oplus \overline{V}$. The key is the following.

Lemma. Assume that the vectors $\gamma_1,\gamma_2,\ldots,\gamma_k\in V$ are linearly independent over $\mathbf{R}$. Then the vectors $(\gamma_1,-\overline{\gamma_1}), (\gamma_2,-\overline{\gamma_2}),\ldots,(\gamma_k,-\overline{\gamma_k})$ are linearly independent over $\mathbf{C}$.

Proof. Consider a possible linear dependence relation $$\sum_j z_j(\gamma_j,-\overline{\gamma_j})=0$$ with unknown complex numbers $z_j$, $j=1,\ldots,k$. This gives two equations: $\sum_j z_j\gamma_j=0$ from the $V$-part and $\sum_j z_j\overline{\gamma_j}=0$ from the $\overline{V}$-part. Conjugating the latter equation componentwise gives $\sum_j \overline{z_j}\gamma_j=0$. Taking the sum and the difference of these two vector equations in $V$ yields $$\sum_j(z_j+\overline{z_j})\gamma_j=0\quad\text{and}\quad \sum_j(z_j-\overline{z_j})\gamma_j=0.$$ Because the vectors $\gamma_j$ are linearly independent over $\mathbf{R}$, the first equation tells us that the real parts of all the coefficients $z_j$ vanish, and the second equation (after cancelling the scalar factor $2i$) says the same about the imaginary parts. Hence all $z_j=0$. Q.E.D.

The claim then follows by applying the Lemma to a $\mathbf{Z}$-basis of $\Gamma$: such a basis consists of $2n$ vectors that are linearly independent over $\mathbf{R}$, so the Lemma produces $2n$ vectors that are linearly independent over $\mathbf{C}$ inside the $2n$-dimensional space $V\oplus\overline{V}$.
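As a quick numerical sanity check of the Lemma (again just a sketch with numpy and a randomly chosen lattice basis, not part of the argument), one can stack the $2n$ vectors $(\gamma_j,-\overline{\gamma_j})$ as rows of a $2n\times 2n$ complex matrix and compute its rank:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)

# 2n random complex vectors; with probability 1 they are an R-basis of C^n,
# i.e. they generate a full lattice.
basis = rng.standard_normal((2 * n, n)) + 1j * rng.standard_normal((2 * n, n))

# Rows are the vectors (gamma_j, -conj(gamma_j)) in V + V-bar, identified with C^{2n}.
M = np.hstack([basis, -np.conjugate(basis)])

# Complex rank of the span; expected 2n = 6, i.e. the span is all of C^{2n}.
print(np.linalg.matrix_rank(M))
```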

On the other hand, if scalars act on the $\overline{V}$ component via complex conjugation, then the span falls far short. For example, if $V=\mathbf{C}^1$ and $\Gamma=\{a+bi\mid a,b\in \mathbf{Z}\}$, then any two vectors $(z_1,-\overline{z_1})\neq0$ and $(z_2,-\overline{z_2})$ are linearly dependent: $$w\cdot(z_1,-\overline{z_1})=(wz_1,-\overline{w}\,\overline{z_1})=(wz_1,-\overline{wz_1})=(z_2,-\overline{z_2})$$ whenever $w=z_2/z_1$.
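The same degeneracy can be checked in code (a sketch under the conjugation-twisted action described above; `twisted_scale` is my own name for that rule):

```python
import numpy as np

def twisted_scale(w, v):
    """w acts as usual on the V component and as conj(w) on the V-bar component."""
    return np.array([w * v[0], np.conjugate(w) * v[1]])

z1, z2 = 1 + 2j, -3 + 1j                  # two nonzero Gaussian integers
v1 = np.array([z1, -np.conjugate(z1)])
v2 = np.array([z2, -np.conjugate(z2)])

w = z2 / z1
print(np.allclose(twisted_scale(w, v1), v2))  # True: v2 is a twisted-scalar multiple of v1
```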

So: How does a scalar $w$ act on the $\overline{V}$ component?