35

Inner product spaces are defined over a field $\mathbb{F}$ which is either $\mathbb{R}$ or $\mathbb{C}$.

I want to know what happens if we try to define them over some finite field. Here's an example:

Let $\mathbb{F} = \{0,1,a,b\}$ be a finite field with + and * defined by the following Cayley tables:

$$
\begin{array}{c|cccc}
+ & 0 & 1 & a & b \\ \hline
0 & 0 & 1 & a & b \\
1 & 1 & 0 & b & a \\
a & a & b & 0 & 1 \\
b & b & a & 1 & 0
\end{array}
\qquad
\begin{array}{c|cccc}
* & 0 & 1 & a & b \\ \hline
0 & 0 & 0 & 0 & 0 \\
1 & 0 & 1 & a & b \\
a & 0 & a & b & 1 \\
b & 0 & b & 1 & a
\end{array}
$$

Now, define a very simple vector space $\mathcal{V} = \{O, V\}$ over $\mathbb{F}$ as follows:

  1. $\mathcal{V}$ is an Abelian group under addition, with identity $O$. Therefore, $O+O = V+V = O$, and $O+V = V+O = V$.
  2. The scalar multiplication is governed by the following rules: for any $e \in \mathbb{F}$, we have $eO = O$. Define $0V = O$ and $1V = aV = bV = V$.

One can easily check that $\mathcal{V}$ is a vector space over $\mathbb{F}$.

Now, we define an inner product for $\mathcal{V}$:

  1. $\langle O,O \rangle = 0$ and $\langle V,V \rangle = 1$;
  2. $\langle V,O \rangle = \langle O,V \rangle = 0$.
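
For concreteness, here is a small Python transcription of the definitions above, so the axioms can be checked mechanically (the field tables are the standard ones for the four-element field; all names are mine and purely illustrative):

```python
# The four-element field F = {0, 1, a, b}, encoded with its standard Cayley tables.
F = ['0', '1', 'a', 'b']

def table(triples):
    """Build a symmetric operation table from (x, y, result) triples."""
    t = {}
    for x, y, z in triples:
        t[(x, y)] = t[(y, x)] = z
    return t

add = table([('0','0','0'), ('0','1','1'), ('0','a','a'), ('0','b','b'),
             ('1','1','0'), ('1','a','b'), ('1','b','a'),
             ('a','a','0'), ('a','b','1'), ('b','b','0')])
mul = table([('0','0','0'), ('0','1','0'), ('0','a','0'), ('0','b','0'),
             ('1','1','1'), ('1','a','a'), ('1','b','b'),
             ('a','a','b'), ('a','b','1'), ('b','b','a')])

# The two-element set V = {O, V} with the proposed addition, scalar action,
# and 'inner product' exactly as defined above.
vec_add = {('O','O'):'O', ('O','V'):'V', ('V','O'):'V', ('V','V'):'O'}
scalar  = {(e, v): ('O' if v == 'O' or e == '0' else 'V') for e in F for v in ['O', 'V']}
inner   = {('O','O'):'0', ('O','V'):'0', ('V','O'):'0', ('V','V'):'1'}
```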

It seems that the above example demonstrates an inner product space over a finite field.

Is the above notion ever studied? Does it have any applications?

We avoided "conjugate symmetry" in the definition above, by assuming the conjugate of each member of $\mathbb{F}$ is itself. Can we define conjugation for fields other than $\mathbb{C}$? (Well, I heard the name C*-algebra, but I don't know whether it relates to my question.)

For instance, let $\mathbb{Q}[\sqrt 3] = \{a+b\sqrt 3 \mid a,b \in \mathbb{Q} \}$ be $\mathbb{Q}$ adjoined with $\sqrt 3$. For any $e = a+b\sqrt 3$ in $\mathbb{Q}[\sqrt 3]$, can we define the conjugate of $e$ as $e^* = a-b\sqrt 3$? This satisfies the following condition: both the sum $e+e^*$ and the product $e e^*$ are members of the underlying subfield $\mathbb{Q}$.
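
As a quick sanity check (purely illustrative, representing $a + b\sqrt 3$ as a pair of rationals):

```python
from fractions import Fraction

# Represent a + b*sqrt(3) in Q[sqrt(3)] as the pair (a, b) of rationals.
def conj(e):
    a, b = e
    return (a, -b)                      # a + b*sqrt(3)  ->  a - b*sqrt(3)

def add(e, f):
    return (e[0] + f[0], e[1] + f[1])

def mul(e, f):
    # (a + b*sqrt(3))(c + d*sqrt(3)) = (ac + 3bd) + (ad + bc)*sqrt(3)
    a, b = e
    c, d = f
    return (a*c + 3*b*d, a*d + b*c)

e = (Fraction(2), Fraction(5))          # e = 2 + 5*sqrt(3)
print(add(e, conj(e)))                  # (Fraction(4, 1), Fraction(0, 1))   -> 4 in Q
print(mul(e, conj(e)))                  # (Fraction(-71, 1), Fraction(0, 1)) -> -71 in Q
```

In both cases the $\sqrt 3$-coefficient vanishes, so the results lie in $\mathbb{Q}$.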

  • @Sadeq: I don't think that much harm was done. Now that you fixed the group operation my first point becomes valid, so I undelete and expose my answer to criticism. (2011-07-04)

3 Answers

16

1) The action of $\mathbb{F}$ on $\mathcal{V}$ that you defined does not turn $\mathcal{V}$ into a vector space. If it did, we would need to have $ V = bV = (1+a)V = 1V + aV = V + V = O, $ which contradicts $V \neq O$.

2) We usually restrict the notion of an inner product to vector spaces over a field in which the elements (= the scalars) can be ordered. Most notably, we want to have a set of positive elements. You hopefully remember that one of the axioms of an inner product demands that the inner product of a non-zero vector with itself be positive.

Well, what are the positive elements of a finite field? Here we have a problem. Normally we would want to declare $1$ to be a positive element. But we also want the sum of two positives to be positive, so $1+1=2$ should be positive. In the field of four elements, however, $1+1=0$, so $0$ would have to be positive?? By a similar argument, non-zero vectors in a vector space over a finite field often have zero 'inner product' with themselves. Doesn't look too good, does it?

[Edit] As Theo points out, the reason for this is that the inner product is used to define the length of a vector (and a metric on the space), so we need to be able to take square roots. As it happens, in your field all the elements do have square roots (all finite fields of characteristic 2 share this property), but the problem with the positive elements persists. [/Edit]
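
To see the isotropic-vector phenomenon in the smallest possible case, here is a sketch over the two-element field $\{0,1\}$, where $1+1=0$ just as in your field:

```python
# Over GF(2) = {0, 1}, modelled by integer arithmetic mod 2, the standard
# 'dot product' on GF(2)^2 already has a non-zero isotropic vector:
v = (1, 1)
dot = (v[0] * v[0] + v[1] * v[1]) % 2   # 1*1 + 1*1 = 1 + 1 = 0 in GF(2)
print(dot)                              # 0, even though v is not the zero vector
```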

3) The closest thing to an inner product on a vector space over a finite field (or other non-ordered field) is a bilinear symmetric form. It comes together with an associated quadratic form. These are well studied objects in algebra.
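
For illustration, here is a small sketch of such a pair over the three-element field $\{0,1,2\}$, modelled by integer arithmetic mod 3 (the particular form chosen is just an example):

```python
# A symmetric bilinear form B and its associated quadratic form Q on GF(3)^2.
def B(x, y):
    return (x[0] * y[0] + 2 * x[1] * y[1]) % 3

def Q(x):
    return B(x, x)

x, y, z = (1, 2), (2, 2), (0, 1)
xy = ((x[0] + y[0]) % 3, (x[1] + y[1]) % 3)
print(B(x, y) == B(y, x))                      # True: symmetry
print(B(xy, z) == (B(x, z) + B(y, z)) % 3)     # True: additivity in the first slot
print(Q(x))                                    # value of the quadratic form at x
```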

4) The kind of conjugation you seem to be talking about is also well studied in algebra. Many number fields have several such symmetries, and they are called automorphisms of the field. Look up field theory (or Galois theory) to learn more. Even your field of 4 elements has a non-trivial one. The mapping $F: 0\mapsto 0,\ 1\mapsto 1,\ a\mapsto b,\ b\mapsto a$ has the following nice properties:

$ F(x+y)=F(x)+F(y)\qquad F(xy)=F(x)F(y) $

for all $x,y$ in your field.
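
If you want to verify this by brute force, here is a short check using the Cayley tables of the four-element field (a sketch; the encoding is mine):

```python
def table(triples):
    """Build a symmetric operation table from (x, y, result) triples."""
    t = {}
    for x, y, z in triples:
        t[(x, y)] = t[(y, x)] = z
    return t

add = table([('0','0','0'), ('0','1','1'), ('0','a','a'), ('0','b','b'),
             ('1','1','0'), ('1','a','b'), ('1','b','a'),
             ('a','a','0'), ('a','b','1'), ('b','b','0')])
mul = table([('0','0','0'), ('0','1','0'), ('0','a','0'), ('0','b','0'),
             ('1','1','1'), ('1','a','a'), ('1','b','b'),
             ('a','a','b'), ('a','b','1'), ('b','b','a')])

F = {'0': '0', '1': '1', 'a': 'b', 'b': 'a'}   # the map 0->0, 1->1, a->b, b->a

print(all(F[add[x, y]] == add[F[x], F[y]] and
          F[mul[x, y]] == mul[F[x], F[y]]
          for x in F for y in F))              # True: F is a field automorphism
```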

  • @Sadeq: Sorry about not making it clear that (1) was meant to show that your definition leads to a contradiction. (2011-07-04)
7

(I'll not address your concrete example, sorry!)

Actually you need the ring of scalars $K$ to be a $*$-division ring in order to make sense of a generalization of an inner product space. You can always equip every division ring with the trivial conjugation operation $*$ that maps each element to itself. (This is not very interesting, but it shows that the need for a conjugation operation does not exclude any division rings from consideration.)

But the notion of a $*$-division ring $K$ is enough to define a $K$-vector space $H$ as a $K$-module, and also to define what a nondegenerate hermitian form on $H$ should be.
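
For concreteness, one common set of axioms (stated here only as a sketch; conventions vary, especially over noncommutative $K$, and $H$ is taken to be a right $K$-module) asks for a map $h : H \times H \to K$ satisfying

$ h(x + x', y) = h(x, y) + h(x', y), \qquad h(x\lambda, y) = \lambda^{*}\, h(x, y), \qquad h(y, x) = h(x, y)^{*} $

for all $x, x', y \in H$ and $\lambda \in K$; nondegeneracy means that $h(x, y) = 0$ for all $y \in H$ forces $x = 0$.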

I can't tell you if this notion has any applications. I can tell you instead that it does not have any applications in quantum mechanics (one major area of applications of Hilbert spaces), because there is a very strong theorem (Solèr's theorem) stating that, assuming some basic axioms needed for quantum mechanics in infinite dimensions, $K$ has to be the real numbers, the complex numbers, or the quaternions.

For more details please visit this blog post: Soler's theorem, nCafe.

4

There is a certain generalization of inner product spaces, but it's not quite what you want: Hilbert C*-modules are (essentially) Hilbert spaces (i.e. complete inner product spaces) over arbitrary C*-algebras. Finite fields are definitely not C*-algebras, but you can have (among other things) $\mathbb{C}^n$ (and in particular $\mathbb{C}$, the classical case), $\ell^2$, $L^2 [0,1]$, etc. The theory is quite well-developed. You have a Cauchy-Schwarz inequality, an induced norm and metric, and so on. You can read more about it in Wiki:

http://en.wikipedia.org/wiki/Hilbert_C*-module
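
As a toy illustration (my own sketch, with the C*-algebra $A = \mathbb{C}^n$ under pointwise operations, viewed as a Hilbert C*-module over itself):

```python
import numpy as np

# A = C^n with pointwise operations is a (commutative) C*-algebra, and A itself
# is a Hilbert C*-module over A with the A-valued inner product <x, y> = conj(x)*y.
x = np.array([1 + 2j, 0.5j, -1.0])
y = np.array([2.0, 1 - 1j, 3j])

inner_xy = np.conj(x) * y                              # an element of A, not a scalar
norm_x = np.sqrt(np.max(np.abs(np.conj(x) * x)))       # ||x|| = || <x, x> ||^(1/2)

# Module Cauchy-Schwarz: <x,y>* <x,y> <= ||x||^2 <y,y>, an inequality inside A
# (pointwise, since this particular algebra is commutative).
lhs = np.conj(inner_xy) * inner_xy
rhs = norm_x**2 * (np.conj(y) * y)
print(np.all(lhs.real <= rhs.real + 1e-12))            # True
```

Here the 'inner product' takes values in the algebra $A$ rather than in $\mathbb{C}$, and the induced norm is $\|x\| = \|\langle x,x\rangle\|^{1/2}$.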