
I found a better proof of the theorem in Serge Lang's *Algebraic Number Theory*, but I have put in bold the parts I don't understand. I'm hoping for explanations of these points.


The trace $Tr : L \to K$ is linear and nondegenerate in the sense that there exists an $x \in L$ such that $Tr(x) \neq 0$. If $\alpha$ is a nonzero element of $L$, then $x \mapsto Tr(\alpha x)$ is an element of the dual space of $L$ (as a $K$-vector space), and $\alpha \mapsto (x \mapsto Tr(\alpha x))$ is a homomorphism from $L$ to the dual space.

Since the kernel is trivial, it follows that $L$ is isomorphic to its dual under the bilinear form $(x,y) \mapsto Tr(xy)$. **I don't understand this part: a trivial kernel only gives injectivity. By rank–nullity I know the image of the map has the same dimension as $L$, but wouldn't I also need to know a priori that the dual space has dimension $n$ to get surjectivity?**

Let $\{w'_1,\ldots,w'_n\}$ be the dual basis of $\{w_1,\ldots,w_n\}$, satisfying $Tr(w'_i w_j) = \delta_{ij}$. **I suppose the $w'_i$ are elements of $L$ which represent elements of the dual space, but I don't really understand what they are or how we can make sure the $\delta_{ij}$ condition is satisfied.** Let $c \neq 0$ be an element of $A$ such that each $cw'_i$ is integral.

  • What are your definitions? And when you say orthonormal, do you mean orthonormal with respect to your bilinear form $t$? (2012-11-06)
  • If you start with a nondegenerate bilinear form on a vector space $V$ (say finite dimensional), then this means that the map $V \to V^*$ coming from the bilinear form is an isomorphism, and then the existence of a dual basis is easy. (2012-11-06)
  • @Sanchez, thank you very much, but I don't understand this. In my case I have a finite separable extension $L$ of a number field $K$, a nondegenerate $t_{L/K} : L \times L \to K$, and a basis for $L/K$ (I think that's a basis for $L$ with coefficients in $K$). (2012-11-06)
  • @Sanchez, I think it's just orthonormal with respect to the trace form. (2012-11-06)
  • Okay, then try to do this exercise. (2012-11-06)
  • @Sanchez, which? (2012-11-06)
  • Let $V$ be a finite dimensional vector space over a field $K$ with a symmetric bilinear form $t: V \times V \to K$ such that the induced map $V \to V^*$ is an isomorphism (where the map is defined by $v \mapsto t_v$, and $t_v(w) = t(v,w)$). Show that if $v_1,\ldots,v_n$ is a basis of $V$, there exists a dual basis $w_1,\ldots,w_n$ of $V$ such that $t(v_i,w_j) = \delta_{ij}$. (2012-11-06)
  • Hint: if $w_j$ exists, what must $t_{w_j}$ be? (2012-11-06)
  • @Sanchez, I had a go at proving it. (2012-11-06)
  • I wonder whether you, @sperner, are missing something fundamental. You understand that $L$ is a vector space over $K$ of dimension $n$, right? And you do know that $L$ and its $K$-dual have the same dimension? And you know that a homomorphism between two vector spaces of the same dimension is one-to-one if and only if it's a surjection? (2012-11-13)

1 Answer


As the commenters say, $L^* = \operatorname{Hom}_K(L,K)$ is a vector space over $K$ of dimension $n$, the same dimension as $L$ over $K$. You have a $K$-linear map $T : L \to L^*$ such that $T(x)$ is the linear form defined by $T(x)(y) = Tr(xy)$.

Since there is a $y \in L$ such that $Tr(y) \neq 0$, $T(1)$ is not the zero map of $L^*$. Since $L$ is a field, for any $\alpha \neq 0$, multiplication by $\alpha$ is a permutation of $L$, so $T(\alpha)$ is again nonzero: $T(\alpha)(\alpha^{-1}y) = Tr(y) = T(1)(y) \neq 0$. This implies that the map $T$ is injective, and since $L$ and $L^*$ have the same dimension, $T$ must also be surjective: it is an isomorphism.
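
To spell out the surjectivity step that the question asks about: rank–nullity gives
$$\dim_K \operatorname{im} T = \dim_K L - \dim_K \ker T = n - 0 = n = \dim_K L^*,$$
and a subspace of $L^*$ of full dimension is all of $L^*$, so $T$ is surjective.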

You have a basis $(w_1, \ldots, w_n)$ of $L$ over $K$. Its dual basis is the basis $(w'_1, \ldots, w'_n)$ of $L^*$ over $K$ such that $w_i'(w_j) = \delta_{i,j}$. This means that if $x \in L$ and you write $x = \sum_j c_j w_j$, then $w_i'(x) = w_i'(\sum_j c_j w_j) = \sum_j c_j \delta_{i,j} = c_i$: the linear map $w_i'$ is simply the map "$i$-th coordinate in the basis $(w_1, \ldots, w_n)$".
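
As a concrete illustration (an example of mine, not from the original answer), take $K = \mathbb{Q}$, $L = \mathbb{Q}(\sqrt{2})$, $w_1 = 1$, $w_2 = \sqrt{2}$. Writing $x = a + b\sqrt{2}$ with $a, b \in \mathbb{Q}$,
$$w_1'(a + b\sqrt{2}) = a, \qquad w_2'(a + b\sqrt{2}) = b,$$
i.e. $w_1'$ and $w_2'$ are literally the two coordinate maps.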

And indeed, the family $(w_1', \ldots, w_n')$ generates $L^*$: if $\phi \in L^*$, look at the scalars $\phi(w_i)$ and define $\psi = \phi - \sum_i \phi(w_i)w_i'$. For all $j$, $\psi(w_j) = \phi(w_j) - \sum_i \phi(w_i)w_i'(w_j) = \phi(w_j) - \phi(w_j) = 0$, and because the $w_j$ generate $L$, we must have $\psi(x) = 0$ for all $x \in L$, i.e. $\phi = \sum_i \phi(w_i)w_i'$. The family is also free: if $\sum_i \lambda_i w_i' = 0$, evaluating at $w_j$ gives $\lambda_j = 0$. So $L^*$ really has dimension $n$.

Now that you have an isomorphism $T : L \to L^*$, you can define $w_i^* = T^{-1}(w_i')$; that is, $w_i^*$ is the element of $L$ such that $T(w_i^*) = w_i'$, which means that $Tr(w_i^* w_j) = \delta_{i,j}$.
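
Continuing the $\mathbb{Q}(\sqrt{2})$ example from above (again mine, not from the original): here $Tr(a + b\sqrt{2}) = 2a$, and one checks directly that
$$w_1^* = \tfrac{1}{2}, \qquad w_2^* = \tfrac{\sqrt{2}}{4},$$
since $Tr(\tfrac{1}{2}\cdot 1) = 1$, $Tr(\tfrac{1}{2}\sqrt{2}) = 0$, $Tr(\tfrac{\sqrt{2}}{4}\cdot 1) = 0$ and $Tr(\tfrac{\sqrt{2}}{4}\cdot\sqrt{2}) = Tr(\tfrac{1}{2}) = 1$.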

If you write $w_i^* = \sum_k c_{i,k} w_k$, this means that the numbers $c_{i,k}$ satisfy the equations $\sum_k c_{i,k} Tr(w_k w_j) = \delta_{i,j}$ for all $j$. In fact, if you denote by $M$ the matrix $(Tr(w_k w_j))$ and by $C$ the matrix $(c_{i,k})$, these equations merely state that $C M = I_n$: you find the coefficients of the $w_i^*$ (in the basis $(w_1, \ldots, w_n)$) in the rows of the inverse of the matrix $M$.
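
As a sanity check (my own sketch, not part of the original answer), here is a short Python/sympy computation for the concrete case $K = \mathbb{Q}$, $L = \mathbb{Q}(\sqrt{2})$ with basis $w_1 = 1$, $w_2 = \sqrt{2}$: it builds $M$, computes $C = M^{-1}$, and verifies $Tr(w_i^* w_j) = \delta_{i,j}$.

```python
from sympy import Matrix

# Elements of L = Q(sqrt(2)) are represented as pairs (a, b) meaning a + b*sqrt(2).

def mul(x, y):
    """(a + b*sqrt(2))(c + d*sqrt(2)) = (ac + 2bd) + (ad + bc)*sqrt(2)."""
    a, b = x
    c, d = y
    return (a*c + 2*b*d, a*d + b*c)

def trace(x):
    """Tr(a + b*sqrt(2)) = (a + b*sqrt(2)) + (a - b*sqrt(2)) = 2a."""
    return 2 * x[0]

basis = [(1, 0), (0, 1)]  # w_1 = 1, w_2 = sqrt(2)
n = len(basis)

# M[k, j] = Tr(w_k w_j); here M = [[2, 0], [0, 4]], with det(M) = 8 != 0.
M = Matrix(n, n, lambda k, j: trace(mul(basis[k], basis[j])))
C = M.inv()  # C M = I_n, so row i of C holds the coordinates of w_i^*

# Reassemble w_i^* = sum_k C[i, k] w_k and check Tr(w_i^* w_j) = delta_{ij}.
for i in range(n):
    w_star = (sum(C[i, k] * basis[k][0] for k in range(n)),
              sum(C[i, k] * basis[k][1] for k in range(n)))
    print(w_star, [trace(mul(w_star, basis[j])) for j in range(n)])
# prints (1/2, 0) [1, 0] and (0, 1/4) [0, 1],
# i.e. w_1^* = 1/2 and w_2^* = sqrt(2)/4, as computed by hand above.
```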

Even if you are skeptical of anything having to do with $L^*$, you should still be convinced that $\det(M) \neq 0$ (hence that $M$ is invertible): for any nonzero $a = \sum_i a_i w_i \in L$, remember the element $y$ whose trace is nonzero, and write $a^{-1}y = \sum_j b_j w_j$. Now compute $(a_1 \ \cdots \ a_n) \, M \, (b_1 \ \cdots \ b_n)^\top = \sum_{i,j} a_i b_j Tr(w_i w_j) = Tr\big((\sum_i a_i w_i)(\sum_j b_j w_j)\big) = Tr(a \cdot a^{-1}y) = Tr(y) \neq 0$.

If $M$ weren't invertible, there would be a nonzero $a \in L$ such that $(a_1 \ \cdots \ a_n) \, M = 0$, but this is impossible, because by the computation above this row vector sends $(b_1 \ \cdots \ b_n)^\top$ to something nonzero (this is a direct translation of the previous argument for the injectivity of $T$).

Also, if you change your basis, $\det(M)$ is multiplied by the square of the determinant of the change-of-basis matrix, so the class of $\det(M)$ modulo squares (for $K = \mathbb{Q}$, its squarefree part) is independent of the choice of basis, and is related to the discriminant of $L$ over $K$ (if you pick an integral basis, it is the discriminant), so $\det(M)$ is something important.
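
To make the change-of-basis behavior concrete (once more in $\mathbb{Q}(\sqrt{2})$, an example not in the original): with the integral basis $(1, \sqrt{2})$ one gets $M = \begin{pmatrix} 2 & 0 \\ 0 & 4 \end{pmatrix}$, so $\det(M) = 8$, the discriminant of $\mathbb{Q}(\sqrt{2})$. Replacing the basis by $(1, 1+\sqrt{2})$ (change-of-basis determinant $1$) gives $M' = \begin{pmatrix} 2 & 2 \\ 2 & 6 \end{pmatrix}$ with $\det(M') = 8$ again, while $(1, 2\sqrt{2})$ (determinant $2$) gives $\det = 2^2 \cdot 8 = 32$.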