2

Let $D$ be a Euclidean domain whose Euclidean function $\delta$ satisfies: $$ (1) \qquad \delta(ab)=\delta(a)\delta(b)$$ $$ (2)\qquad \delta(a+b)\leq \max(\delta(a),\delta(b)). $$ Show that either $D$ is a field or $D=F[x]$, where $F$ is a field and $x$ is an indeterminate. Thanks for reading!
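
For concreteness (this model is my own illustration, not part of the problem statement): on $F[x]$ the function $\delta(f)=2^{\deg f}$ satisfies both (1) and (2), since degrees add under multiplication and $\deg(f+g)\leq\max(\deg f,\deg g)$. A quick exhaustive check over the arbitrary choice $\mathbb{F}_5$, polynomials of degree at most 2:

```python
# Sanity check (illustration only): in F_5[x], delta(f) = 2**deg(f)
# satisfies (1) delta(fg) = delta(f)*delta(g) and
# (2) delta(f+g) <= max(delta(f), delta(g)).
# Polynomials are coefficient tuples, lowest degree first.
from itertools import product

P = 5  # working over F_5; any prime would do

def trim(f):
    """Drop trailing zero coefficients."""
    f = list(f)
    while f and f[-1] == 0:
        f.pop()
    return tuple(f)

def add(f, g):
    n = max(len(f), len(g))
    f, g = list(f) + [0] * (n - len(f)), list(g) + [0] * (n - len(g))
    return trim((a + b) % P for a, b in zip(f, g))

def mul(f, g):
    out = [0] * (len(f) + len(g) - 1) if f and g else []
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] = (out[i + j] + a * b) % P
    return trim(out)

def delta(f):
    """delta(f) = 2**deg(f), defined only for nonzero f."""
    assert f, "delta is only defined on nonzero elements"
    return 2 ** (len(f) - 1)

# exhaustively check (1) and (2) on all nonzero polynomials of degree <= 2
polys = [p for p in (trim(c) for c in product(range(P), repeat=3)) if p]
for f in polys:
    for g in polys:
        assert delta(mul(f, g)) == delta(f) * delta(g)      # condition (1)
        s = add(f, g)
        if s:                                               # condition (2), f+g != 0
            assert delta(s) <= max(delta(f), delta(g))
print("conditions (1) and (2) hold for delta = 2**deg on F_5[x]")
```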

  • 0
    @Arturo: To be fair, this is only gottigen's second question and the first wasn't in the imperative, so there was little opportunity in the intervening 8 months to make this observation :-)2011-10-13
  • 0
    @Arturo To be fair, you should say "is not very well-received by *some* members".2011-10-13
  • 2
Fine: @gottigen: You've been a member of this site for 8 months; if you've looked around a bit after you posted your first question in February, you probably noticed that posting in the imperative (as if you were assigning homework) is not well-received by some members. Poor formatting is also a turn-off to some. A good way of encouraging good and useful answers is to indicate what your thoughts are so far, in what context you encountered this problem, and maybe what background you have (so the answer can be stated at the appropriate level). Please try to do all that here.2011-10-13
  • 0
    @Arturo Magidin: I feel quite sorry for that.2011-10-14

1 Answer

3

First, recall that the elements which take on the minimum value of $\delta$ are the units [$u$ a unit iff $\delta(u) \leq \delta(x)$ for all non-zero $x$ in $D$.]

Note that $\delta(1)=0$ implies $\delta(a)=\delta(a)\delta(1)=0$ for every nonzero $a$. Conversely, if $\delta(a)=0$ for some nonzero $a$, then $\delta(1)\leq\delta(a)=0$ since $1$ is a unit. Thus if $\delta$ takes on the value $0$ anywhere, it is identically $0$, so every nonzero element attains the minimum value of $\delta$ and hence is a unit: $D$ is a field. Let's suppose $D$ is not a field, so $\delta(1)>0$. Since $0<\delta(1)=\delta(1\cdot 1)=\delta(1)\delta(1)$, we must have $\delta(1)=1$.

Let $\mathbb{F}$ be the set of units of $D$ together with $0$. This set is closed under multiplication. Suppose $a,b \in \mathbb{F}-\{0\}$. Either $a-b=0 \in \mathbb{F}$, or $a-b \not=0$ and $\delta(a-b) \leq \max\{\delta(a),\delta(-b)\}=\delta(a)=\delta(-b)=\delta(1)=1$ [since $a$, $-b$, and $1$ are units, which share the minimum value of $\delta$], so $a-b$ is again a unit. Thus $\mathbb{F}$ is closed under subtraction, and so it is a field (all of its non-zero elements are units).
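
By way of contrast (an illustration I'm adding, with the usual $\delta(n)=|n|$ on $\mathbb{Z}$): the integers satisfy condition (1) but violate condition (2), and consistently with that, their units together with $0$, namely $\{-1,0,1\}$, fail to be closed under subtraction:

```python
# Contrast: Z with delta(n) = |n| is a Euclidean domain satisfying (1),
# but condition (2) fails, so the argument above does not apply to Z.
# Indeed the units of Z together with 0, i.e. {-1, 0, 1}, are not
# closed under subtraction.
a, b = 1, 1
print(abs(a + b) <= max(abs(a), abs(b)))   # False: delta(1+1) = 2 > 1
print((1 - (-1)) in {-1, 0, 1})            # False: 1 - (-1) = 2 escapes
```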

Let $0 \not= t \in D$ be a non-unit of minimal $\delta$-value (the well-ordering principle says such a $t$ exists). Since $t$ is not a unit, $\delta(t)>\delta(1)=1$. This implies that $\delta(t)<\delta(t)\delta(t)=\delta(t^2)<\delta(t)\delta(t)\delta(t)=\delta(t^3)<\cdots$. Now if $t$ were algebraic over $\mathbb{F}$, it would be the root of some monic polynomial $g(x)\in\mathbb{F}[x]$, say $g(x)=x^m+b_{m-1}x^{m-1}+\cdots+b_0=x^m+h(x)$. By property (2), $\delta(-h(t))$ is at most the maximum of the values $\delta(-b_kt^k)=\delta(-b_k)\delta(t^k)=\delta(t)^k$ over $k\leq m-1$ (recall each nonzero $-b_k$ is a unit, so $\delta(-b_k)=1$), and hence $\delta(-h(t))\leq\delta(t)^{m-1}$. But $t^m=-h(t)$, so $\delta(t)^m=\delta(t^m)=\delta(-h(t))\leq \delta(t)^{m-1}$, a contradiction. Therefore $t$ is transcendental over $\mathbb{F}$, and $\mathbb{F}[t]$ is a polynomial ring over $\mathbb{F}$.
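
The inequality driving the contradiction can be checked numerically in the assumed model $\delta=2^{\deg}$ (so $\delta(t)=2$ for $t=x$; the choice $m=3$ below is arbitrary):

```python
# Numerical illustration of the growth argument, under the assumed model
# delta = 2**deg, where delta(t) = 2 for t = x.
d = 2                # delta(t) > 1 for the non-unit t of minimal delta-value
m = 3                # degree of the hypothetical monic g killing t
# delta(t) < delta(t^2) < delta(t^3) < ... is strictly increasing:
powers = [d**k for k in range(1, m + 1)]
assert powers == sorted(set(powers))
# each term -b_k t^k of h(t) has delta at most delta(t)^(m-1), and by
# condition (2) the same bound holds for delta(-h(t)) itself:
delta_h_bound = max(d**k for k in range(m))   # = delta(t)**(m-1)
assert d**m > delta_h_bound                   # so t^m = -h(t) is impossible
print("delta(t^m) =", d**m, "> bound on delta(h(t)) =", delta_h_bound)
```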

Finally, suppose $0 \not= a \in D$. Dividing by $t$ gives $a=qt+r_0$ where either $r_0=0$ or $\delta(r_0)<\delta(t)$. But $t$ had minimal $\delta$-value among non-units, so either $r_0=0$ or $r_0$ is a unit; in both cases $r_0 \in \mathbb{F}$. If $q=0$ we are done. Otherwise $\delta(q) < \delta(q)\delta(t)=\delta(qt)=\delta(a-r_0)\leq \delta(a)$ (by property (2)). Now divide $q$ by $t$ and continue inductively. Since the $\delta$-values strictly decrease, the process stops after finitely many steps, and we have shown that $a=(\cdots (r_\ell t+r_{\ell-1})t\cdots)t+r_0 \in \mathbb{F}[t]$. Therefore $D=\mathbb{F}[t]$.
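
This last step is exactly repeated division with remainder, which in the concrete model $D=\mathbb{F}_5[x]$, $t=x$ (an assumed illustration) amounts to reading off coefficients one at a time:

```python
# Repeated division by t recovers the "base-t digits" of a, showing a lies
# in F[t].  Concrete assumed model: polynomials over F_5 as coefficient
# lists, lowest degree first, with t = x.
def divmod_by_t(a):
    """Write a = q*t + r0 with t = x: q drops the constant term, r0 keeps it."""
    return a[1:], a[0]

a = [3, 0, 2, 4]           # a(x) = 3 + 2x^2 + 4x^3 over F_5
digits = []
while a:
    a, r = divmod_by_t(a)
    digits.append(r)       # each r is 0 or a unit of F_5, i.e. an element of F
print(digits)              # [3, 0, 2, 4]: the F-coefficients of a, so a is in F[t]
```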

  • 0
    Hello. Would you mind explaining why $D$ is a field if $\delta$ is the zero function? Isn't it implicit in the definition of a Euclidean domain that $\delta$ is not the zero function? (the existence of $q,r$ such that $a = bq + r$ **and** $\delta(r) < \delta(b)$)2017-07-22
  • 0
First, you should be aware that there is more than one accepted definition of the term "Euclidean domain". They are, however, all equivalent.2017-07-22
  • 0
    Now given $a$ and $b$ where $b \not=0$, there exists $q,r$ such that $a=bq+r$ and $\delta(r)<\delta(b)$ **OR** $r=0$. So if $\delta(b)=0$ (always), we can never get "$\delta(r)<\delta(b)$" so we are always forced to have $r=0$. This means that whenever we divide we get $a=bq$.2017-07-22
  • 0
    So take any $b \not=0$ and use $a=1$. Then there exists $q,r$ such that $1=bq+r$ and since $\delta(b)=0$ we are forced to have $r=0$. Thus $1=bq$. In other words, $b^{-1}=q$ exists.2017-07-22
  • 0
    I see. Thanks a lot!2017-07-22