
Suppose that we have a commutative, associative ring $R$, which we use to form the polynomial ring $R[x]$. Then any $p(x) \in R[x]$ is of the form $p(x) = \sum_{i=0}^n a_i x^i$ for some $n \in \mathbb{N}$ and coefficients $a_i \in R$.

Now, I wish to understand why $R[x]$ is itself a commutative, associative ring. I am happy verifying all of the necessary properties except for associativity of multiplication.

I understand that given $p(x), q(x) \in R[x]$, such that

$$p(x) = \sum_{i=0}^n a_i x^i, \ \ q(x) = \sum_{i=0}^m b_i x^i$$

for some $m, n \in \mathbb{N}$, where $a_i, b_i \in R, \forall i$, we may (quite naturally) define their product

$$p(x)q(x) = \left(pq\right)(x) := \sum_{k=0}^{n+m} \left( \sum_{\substack{i+j=k \\ i,j \in \mathbb{N}_0}} a_i b_j \right) x^k.$$
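
For concreteness, here is the definition applied to two linear polynomials (a small example I have added to fix notation; I believe I am using the definition correctly): if $p(x) = a_0 + a_1 x$ and $q(x) = b_0 + b_1 x$, then

$$\left(pq\right)(x) = a_0 b_0 + \left(a_0 b_1 + a_1 b_0\right) x + a_1 b_1 x^2.$$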

However, I get rather lost when I introduce another polynomial, $r(x) \in R[x]$ such that

$$r(x) = \sum_{i=0}^o c_i x^i$$

for some $o \in \mathbb{N}$, where $c_i \in R, \forall i$ and attempt to show that

$$\left(p(x) q(x) \right) r(x) = p(x) \left(q(x) r(x) \right).$$
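
If I have unwound the definitions correctly, comparing coefficients, this amounts to showing that, for each $l$ with $0 \le l \le n + m + o$,

$$\sum_{\substack{k+h=l \\ k,h \in \mathbb{N}_0}} \left( \sum_{\substack{i+j=k \\ i,j \in \mathbb{N}_0}} a_i b_j \right) c_h \;=\; \sum_{\substack{i+g=l \\ i,g \in \mathbb{N}_0}} a_i \left( \sum_{\substack{j+h=g \\ j,h \in \mathbb{N}_0}} b_j c_h \right),$$

and I would expect both sides to reduce to $\sum_{i+j+h=l} a_i b_j c_h$.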

Would any kindly soul be able to show me how?

  • 0
    I think that these proofs are harder to write down than to think through. Suggestion: write down an explicit example and convince yourself that there are no problems. (2011-10-31)
  • 0
    Why do you get lost? With the approach you've taken, it does take a little perseverance, but that's about it. Write explicitly what $(pq)r$ is. Do the same for $p(qr)$. Compare! (2011-10-31)
  • 2
    I think that Valerio's suggestion is a bad one: one should be able to write complete proofs of such basic facts. "Convincing" oneself should mean "convincing oneself that one can actually prove", and until one has considerable experience (I have not gotten there...) that means, in most cases, writing the complete proof. I have encountered uncountably many instances of people who'd convinced themselves of things they could not prove... (2011-11-01)
  • 0
    Can we consider polynomials as functions $R \to R$, in which case the result simply follows from the associativity of multiplication in $R$? (2011-11-01)
  • 0
    I'm one of them, sure :) (2011-11-01)
  • 0
    I have tried, Mariano, I promise! I just get rather confused by the fact that the coefficients of $(pq)(x)$ are indexed by two variables, $i$ and $j$, whereas the definition of polynomial multiplication I've given assumes a single index for the coefficients of the component polynomials. I also have a feeling I will end up with a 'quadruple sum' (a double sum within a double sum), and I'm a little fearful of how to algebraically manipulate such an object. (2011-11-01)
  • 0
    @Harry Williams: The index manipulation is unpleasant. One thing you might do is to let one of the polynomials, say the first, be the monomial $a_p x^p$ (and there is really no need to carry the $a_p$ around); a sketch along these lines appears after these comments. (2011-11-01)
  • 0
    1. Try proving commutativity before associativity -- there's less bookkeeping in the former case. 2. Two polynomials are equal if and only if all their coefficients are equal. What is the $k$-th coefficient of $\sum_{i} a_i x^i \cdot \sum_{j} b_j x^j$? Compare that with the $k$-th coefficient of the product in the other order. (2011-11-01)
  • 0
    @Ali: this isn't always true. For example, the ring $\text{Func}(\mathbb{Z}_3)$ of functions $\mathbb{Z}_3 \to \mathbb{Z}_3$ has only $27$ elements, whereas $\mathbb{Z}_3[x]$ is countably infinite (see the example after these comments). But if $R$ is an infinite integral domain, then $R[x]$ is isomorphic to the ring of polynomial functions on $R$. (2011-11-01)
  • 0
    For what it's worth, a direct computation of the associative law for polynomials over a ring can be found on page 200, as Theorem 22.2, of Fraleigh's _A First Course in Abstract Algebra_. (2011-11-01)
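
Following the monomial suggestion in the comments, here is a sketch of that reduction (my own attempt, so it may need checking). By distributivity, it suffices to verify associativity on monomials, and the definition of the product gives

$$\left(a_p x^p \cdot b_q x^q\right) \cdot c_r x^r = \left((a_p b_q) c_r\right) x^{(p+q)+r} = \left(a_p (b_q c_r)\right) x^{p+(q+r)} = a_p x^p \cdot \left(b_q x^q \cdot c_r x^r\right),$$

where the middle equality uses associativity of multiplication in $R$ and of addition in $\mathbb{N}$. (Distributivity of the polynomial product must itself be checked first, but that is a more direct computation.)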
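
To illustrate the comment about $\text{Func}(\mathbb{Z}_3)$: by Fermat's little theorem, $x^3$ and $x$ define the same function $\mathbb{Z}_3 \to \mathbb{Z}_3$, yet they are distinct elements of $\mathbb{Z}_3[x]$, so the evaluation map $\mathbb{Z}_3[x] \to \text{Func}(\mathbb{Z}_3)$ is not injective.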

3 Answers