
Suppose $E/F$ is a field extension and $\alpha, \beta \in E$ are algebraic over $F$. Then it is not too hard to see that when $\alpha$ is nonzero, $1/\alpha$ is also algebraic. If $a_0 + a_1\alpha + \cdots + a_n \alpha^n = 0$, then dividing by $\alpha^{n}$ gives $a_0\frac{1}{\alpha^n} + a_1\frac{1}{\alpha^{n-1}} + \cdots + a_n = 0.$
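For concreteness, dividing by $\alpha^n$ just reverses the coefficient list, which is easy to check in a small sympy sketch (the example $\alpha = \sqrt 2$ is only an illustrative choice):

```python
from sympy import sqrt, symbols, simplify

x = symbols('x')

# Illustrative example: alpha = sqrt(2) satisfies
# p(x) = a_0 + a_1*x + a_2*x^2 with (a_0, a_1, a_2) = (-2, 0, 1).
coeffs = [-2, 0, 1]
p = sum(a * x**k for k, a in enumerate(coeffs))

# Dividing p(alpha) = 0 by alpha^2 reverses the coefficient list,
# so 1/alpha is a root of the "reversed" polynomial.
p_rev = sum(a * x**k for k, a in enumerate(reversed(coeffs)))

alpha = sqrt(2)
print(simplify(p.subs(x, alpha)))           # 0
print(simplify(p_rev.subs(x, 1 / alpha)))   # 0
```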

Is there a similar elementary way to show that $\alpha + \beta$ and $\alpha \beta$ are also algebraic (i.e. finding an explicit formula for a polynomial that has $\alpha + \beta$ or $\alpha\beta$ as its root)?

The only proof I know for this fact is the one where you show that $F(\alpha, \beta) / F$ is a finite field extension and thus an algebraic extension.

  • 0
    Related: https://math.stackexchange.com/questions/1277753, https://math.stackexchange.com/questions/331017 2018-11-29

4 Answers

32

The relevant construction is the resultant of two polynomials. If $x$ and $y$ are algebraic with $P(x) = Q(y) = 0$ and $\deg Q = n$, then $z = x + y$ is a root of the resultant of $P(x)$ and $Q(z-x)$, taken with respect to $x$ (that is, we regard both as polynomials in $x$ alone, leaving $z$ as a symbol, so the resultant is a polynomial in $z$), and $t = xy$ is a root of the resultant of $P(x)$ and $x^n Q(t/x)$, again taken with respect to $x$.
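For what it's worth, here is a small sympy sketch of this resultant construction; the concrete choice $\alpha = \sqrt 2$, $\beta = \sqrt 3$ is only illustrative, not part of the answer above:

```python
from sympy import symbols, resultant

x, z, t = symbols('x z t')

# Illustrative example: P(x) = x^2 - 2 (root sqrt(2)),
# Q(y) = y^2 - 3 (root sqrt(3)), so n = deg Q = 2.
P = x**2 - 2
Q = lambda arg: arg**2 - 3
n = 2

# alpha + beta is a root of Res_x(P(x), Q(z - x)), a polynomial in z.
sum_poly = resultant(P, Q(z - x), x)
print(sum_poly.expand())    # z**4 - 10*z**2 + 1

# alpha * beta is a root of Res_x(P(x), x^n * Q(t/x)), a polynomial in t.
prod_poly = resultant(P, (x**n * Q(t / x)).expand(), x)
print(prod_poly.expand())   # t**4 - 12*t**2 + 36
```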

  • 0
    @Patrick da Silva -- and _you_ mean, leaving $t$ or $z$ _as a symbol_ rather than substituting the _value_ of $t$ or $z$? I'm sure it _must_ mean that ... but I do agree that it could have been & ought to have been made clearer in the exposition. 2018-12-14
30

Okay, I'm giving a second answer because this one is clearly distinct from the first one. Recall that finding a polynomial $p(x) \in F[x]$ that has $\alpha+\beta$ or $\alpha \beta$ as a root is equivalent to exhibiting $\alpha+\beta$ or $\alpha\beta$ as an eigenvalue of a square matrix over $F$ (with eigenvector living in some algebraic extension of $F$): to $p(x)$ one associates its companion matrix $C(p(x))$, whose characteristic polynomial is precisely $p(x)$, so the eigenvalues of the companion matrix are exactly the roots of $p(x)$.

If $\alpha$ is an eigenvalue of $A$ with eigenvector $x \in V$ and $\beta$ is an eigenvalue of $B$ with eigenvector $y \in W$, then using the tensor product of $V$ and $W$, namely $V \otimes W$, we can compute $ (A \otimes I + I \otimes B)(x \otimes y) = (Ax \otimes y) + (x \otimes By) = (\alpha x \otimes y) + (x \otimes \beta y) = (\alpha + \beta) (x \otimes y), $ so that $\alpha + \beta$ is an eigenvalue of $A \otimes I + I \otimes B$. Also, $ (A \otimes B)(x \otimes y) = (Ax \otimes By) = (\alpha x \otimes \beta y) = \alpha \beta (x \otimes y), $ hence $\alpha \beta$ is an eigenvalue of the matrix $A \otimes B$. If you want explicit expressions for the polynomials you are looking for, take $A$ and $B$ to be the companion matrices of the minimal polynomials of $\alpha$ and $\beta$ and compute the characteristic polynomials of $A \otimes I + I \otimes B$ and $A \otimes B$.
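Here is a small sympy sketch of this approach; the concrete minimal polynomials $x^2 - 2$ and $x^2 - 3$ and the helper functions are illustrative choices of mine, not part of the answer:

```python
from sympy import Matrix, eye, zeros, symbols

t = symbols('t')

def companion(coeffs):
    """Companion matrix of the monic polynomial x^n + c_{n-1} x^{n-1} + ... + c_0,
    given the list [c_0, ..., c_{n-1}] of its lower coefficients."""
    n = len(coeffs)
    M = zeros(n, n)
    for i in range(1, n):
        M[i, i - 1] = 1           # subdiagonal of ones
    for i in range(n):
        M[i, n - 1] = -coeffs[i]  # last column holds minus the coefficients
    return M

def kron(X, Y):
    """Plain Kronecker (tensor) product of two explicit matrices."""
    m, n = X.shape
    p, q = Y.shape
    K = zeros(m * p, n * q)
    for i in range(m):
        for j in range(n):
            K[i * p:(i + 1) * p, j * q:(j + 1) * q] = X[i, j] * Y
    return K

# Illustrative example: alpha = sqrt(2), beta = sqrt(3).
A = companion([-2, 0])   # companion matrix of x^2 - 2
B = companion([-3, 0])   # companion matrix of x^2 - 3

S = kron(A, eye(2)) + kron(eye(2), B)   # has alpha + beta among its eigenvalues
P = kron(A, B)                          # has alpha * beta among its eigenvalues

print(S.charpoly(t).as_expr())   # t**4 - 10*t**2 + 1
print(P.charpoly(t).as_expr())   # t**4 - 12*t**2 + 36
```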

Hope that helps,

  • 0
    Definitely _clearly_ a distinct answer, that! ¶ So it looks like there is a choice then between two distinct methods: that of finding the determinant of a _dense_ (m+n)×(m+n) matrix, & that of finding the determinant of a _sparse_ mn×mn matrix. Casting in the mind the forms of the two matrices, it seems quite remarkable that they _must_ yield the same result. I wouldn't be surprised either if it should transpire that, using fairly ordinary algorithms for calculation of the determinant, the sheer _amount_ of calculation were pretty much exactly the same. 2018-12-14
22

Let $\alpha$ have minimal polynomial $p(x)$ and let $\beta$ have minimal polynomial $q(x)$. Then $V = F[x, y]/(p(x), q(y))$ is a finite-dimensional vector space over $F$ of dimension $\deg p \deg q$ (not necessarily equal to $[F(\alpha, \beta) : F]$, for example when $\alpha = \beta$); moreover, it has the explicit basis $\{ x^i y^j : 0 \le i < \deg p,\ 0 \le j < \deg q \}$.

$xy$ and $x + y$ act by left multiplication on $V$, and one can write down explicit matrices for these actions in the basis above in terms of the coefficients of $p$ and $q$. Now apply the Cayley-Hamilton theorem: the characteristic polynomial $\chi$ of the multiplication-by-$(x+y)$ matrix is a monic polynomial over $F$ satisfying $\chi(x+y) = 0$ in $V$, and applying the evaluation map $V \to E$, $x \mapsto \alpha$, $y \mapsto \beta$, gives $\chi(\alpha+\beta) = 0$; the same argument works for $xy$.

This argument proves the stronger result that if $F$ is the fraction field of some domain $D$ and $\alpha, \beta$ are integral over $D$ (so we may take $p, q$ to be monic with coefficients in $D$), then $\alpha \beta$ and $\alpha + \beta$ are integral over $D$ as well.
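Here is a small sympy sketch of these multiplication matrices; the example $p = x^2 - 2$, $q = y^2 - 3$ and the helper names are illustrative choices of mine:

```python
from sympy import symbols, Poly, Matrix, reduced

x, y, t = symbols('x y t')

# Illustrative example: p(x) = x^2 - 2, q(y) = y^2 - 3.
p = Poly(x**2 - 2, x)
q = Poly(y**2 - 3, y)
m, n = p.degree(), q.degree()

# Monomial basis x^i y^j of V = F[x, y]/(p(x), q(y)).
basis = [x**i * y**j for i in range(m) for j in range(n)]

def reduce_mod(expr):
    """Normal form of expr modulo the ideal (p(x), q(y))."""
    _, r = reduced(expr.expand(), [p.as_expr(), q.as_expr()], x, y)
    return r.expand()

def mult_matrix(g):
    """Matrix of multiplication by g on V, in the monomial basis above."""
    cols = []
    for b in basis:
        r = Poly(reduce_mod(g * b), x, y)
        cols.append([r.nth(i, j) for i in range(m) for j in range(n)])
    return Matrix(cols).T   # columns are the images of the basis vectors

M_sum = mult_matrix(x + y)
M_prod = mult_matrix(x * y)

# By Cayley-Hamilton, these characteristic polynomials (over F) kill
# alpha + beta and alpha * beta respectively.
print(M_sum.charpoly(t).as_expr())    # t**4 - 10*t**2 + 1
print(M_prod.charpoly(t).as_expr())   # t**4 - 12*t**2 + 36
```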

5

Technically, you could find the automorphisms of the Galois closure $L$ of $F(\alpha,\beta)$ over $F$ (assuming this extension is separable) and compute the polynomial $ \prod_{\sigma \in \mathrm{Gal}(L/F)}(x- \sigma(\alpha+\beta)), $ or the same with $\alpha \beta$; its coefficients are fixed by every $\sigma$, hence lie in $F$. I don't believe this is what you are looking for, though. Since you can define Galois closures without knowing that $\alpha + \beta$ and $\alpha \beta$ are also algebraic, it is a legitimate way of proving it, but neither a practical nor a pedagogical one.
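To make this concrete, a small sympy sketch with a toy example of my own ($F = \mathbb{Q}$, $\alpha = \sqrt 2$, $\beta = \sqrt 3$, where the Galois closure is $\mathbb{Q}(\sqrt 2, \sqrt 3)$ itself):

```python
from sympy import symbols, sqrt, expand, Mul

x = symbols('x')

# Toy example: F = Q, alpha = sqrt(2), beta = sqrt(3).
# The Galois group of Q(sqrt(2), sqrt(3)) / Q acts by independent sign
# changes on sqrt(2) and sqrt(3), so the conjugates of alpha + beta are:
conjugates = [s2 + s3 for s2 in (sqrt(2), -sqrt(2)) for s3 in (sqrt(3), -sqrt(3))]

# The product over the Galois orbit has coefficients fixed by the group,
# hence rational.
poly = expand(Mul(*[x - c for c in conjugates]))
print(poly)   # x**4 - 10*x**2 + 1
```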

Hope that helps,

  • 1
    Hm. I realize that I need the fact that $|\mathrm{Gal}(F(\alpha,\beta)/F)| \; (= [F(\alpha,\beta) : F]) < \infty$ for this construction to make sense, hence it's not really that much worth it, but at least it gives intuition. 2012-06-07