
In this document, on page $3$, I found an interesting polynomial: $f_n=\frac{(x+1)^n-(x^n+1)}{x}.$ The question is whether this polynomial is irreducible over $\mathbf{Q}$ for arbitrary $n \geq 1$.

In the document you may find a proof that the polynomial is irreducible whenever $n=2p$, $p \in \mathbf{P}$, and below is my attempt to prove that the polynomial is not irreducible whenever $n$ is an odd number greater than $3$. Assuming that my proof is correct, my question is:

Is this polynomial irreducible over $\mathbf{Q}$ when $n=2k$, $k\in \mathbf{Z^+}\setminus\mathbf{P}$?


Lemma 1: For odd $n$, $a^n + b^n = (a+b)(a^{n-1} - a^{n-2} b + \cdots + b^{n-1})$.
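(For instance, $a^3+b^3=(a+b)(a^2-ab+b^2)$.)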

Binomial expansion: $(a+b)^n = \binom{n}{0} a^n + \binom{n}{1} a^{n-1} b + \cdots + \binom{n}{n} b^n$. Applying Lemma 1 to $x^n+1$ (valid since $n$ is odd): $\begin{align*} f_n(x) &= \frac{(x+1)^n - (x^n+1)}{x} \\ &= \frac{(x+1) \cdot (x+1)^{n-1} - (x+1)(x^{n-1}-x^{n-2}+ \cdots + 1)}{x} \\ &= \frac{(x+1) \cdot \left[ \Bigl(\binom{n-1}{0}x^{n-1}+ \binom{n-1}{1} x^{n-2} + \cdots + 1 \Bigr) - (x^{n-1}-x^{n-2}+ \cdots + 1) \right]}{x} \\ &= \frac{(x+1) \cdot \left[ \Bigl(\binom{n-1}{0}x^{n-1}+ \binom{n-1}{1} x^{n-2} + \cdots + \binom{n-1}{n-2} x \Bigr) - (x^{n-1}-x^{n-2}+ \cdots - x) \right]}{x} \\ &= (x+1) \cdot \left[ \left(\binom{n-1}{0}x^{n-2}+ \binom{n-1}{1} x^{n-3} + \cdots + \binom{n-1}{n-2} \right) - (x^{n-2}-x^{n-3}+ \cdots - 1) \right] \end{align*}$ (In the fourth line the constant terms cancel: both brackets end in $+1$, since $n-1$ is even.)

So, when $n$ is an odd number greater than $1$, $f_n$ has the factor $x+1$. For $n=3$ this is no obstruction, since $f_3 = 3(x+1)$ is linear (and hence irreducible); but for odd $n>3$ the remaining factor has degree $n-2 \geq 2$, so $f_n$ is not irreducible.
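As a concrete instance of the factorization above, for $n=5$:
$$f_5 = \frac{(x+1)^5 - (x^5+1)}{x} = 5x^3 + 10x^2 + 10x + 5 = 5(x+1)(x^2+x+1).$$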

  • The link in the question is dead. According to Sage, $f_n$ is irreducible for all even $n$ from $4$ to $1000$. (2017-05-03)
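The commenter's Sage code isn't shown; here is a minimal sketch of the same check in Python with SymPy (the helper name `f` and the reduced bound are my choices), verifying irreducibility over $\mathbf{Q}$:

```python
# Sketch: check that f_n = ((x+1)^n - (x^n+1))/x is irreducible over Q
# for even n in a range (the comment above ran Sage up to n = 1000).
from sympy import symbols, expand, Poly

x = symbols('x')

def f(n):
    # The numerator (x+1)^n - (x^n + 1) has zero constant term,
    # so dividing by x gives an exact polynomial quotient.
    numerator = Poly(expand((x + 1)**n - x**n - 1), x, domain='QQ')
    return numerator.quo(Poly(x, x, domain='QQ'))

# A smaller bound than 1000 keeps the run quick.
assert all(f(n).is_irreducible for n in range(4, 201, 2))
print("f_n is irreducible over Q for all even n in [4, 200]")
```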

1 Answer


For even $n$ we bound the function from below to show that it has no real roots, and hence no rational roots (in particular, no linear factors over $\mathbf{Q}$).

First, note that for $x\ge 0$, $f_n(x) = \sum_{k=0}^{n-2}{n\choose k+1} x^k = n + \ldots \ge n,$ where the part denoted by $\ldots$ is clearly nonnegative for $x \ge 0$. (We take the sum above to be the definition of $f_n$, as implied by the problem statement, i.e., $f_n(0)=n$.) Thus the function has no real roots for $x\ge 0$.
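This expansion follows directly from the binomial theorem: $(x+1)^n - (x^n+1) = \sum_{j=1}^{n-1} \binom{n}{j} x^j$, so dividing by $x$ and re-indexing with $k=j-1$ gives $f_n(x) = \sum_{k=0}^{n-2} \binom{n}{k+1} x^k$, whose constant term is $\binom{n}{1} = n$.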

One can show that $f_n(x) = \sum_{j=0}^{(n-2)/2} \frac{n(n-j-2)!}{(j+1)!(n-2j-2)!} (-x)^j (x+1)^{n-2j-2}.$ For even $n$ and $x\in (-\infty,-1)\cup(-1,0)$, each term in the sum is strictly positive: $-x>0$ throughout, and each exponent $n-2j-2$ is even, so $(x+1)^{n-2j-2}>0$ even when $x<-1$. In addition, $f_n(-1)=2$. Therefore, for even $n$, $f_n$ has no real roots.
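As a quick check of the identity at $n=4$, the sum has the two terms $j=0,1$:
$$f_4(x) = \frac{4\cdot 2!}{1!\,2!}(x+1)^2 + \frac{4\cdot 1!}{2!\,0!}(-x) = 4(x+1)^2 - 2x = 4x^2 + 6x + 4,$$
which matches expanding $\frac{(x+1)^4-(x^4+1)}{x}$ directly.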