
I'm learning Computer Algebra and came across an exercise asking me to prove that $ \operatorname{res}(fg,h)=\operatorname{res}(f,h)\cdot\operatorname{res}(g,h) $, where $f(x)$, $g(x)$ and $h(x)$ are polynomials and $\operatorname{res}$ stands for the resultant.

I know that if we use the fact $\operatorname{res}(f,g)=\operatorname{lc}(f)^{\operatorname{deg}(g)}\operatorname{lc}(g)^{\operatorname{deg}(f)}\prod_{(x,y):\,f(x)=0,\,g(y)=0} (x-y)$ (the product taken over all roots $x$ of $f$ and $y$ of $g$, counted with multiplicity), the proof becomes obvious.

However, in our book the resultant was defined as the determinant of the Sylvester matrix. So I just want to find a proof using this definition directly. (I don't want to prove the fact above first.)

(Suppose that $\deg f=m$, $\deg g= n$, $\deg h = p$.)
I first tried multiplying matrices, but the Sylvester matrix of $(f,h)$ is $(m+p)\times(m+p)$ while the Sylvester matrix of $(g,h)$ is $(n+p)\times(n+p)$, so they cannot be multiplied. I even tried extending the Sylvester matrices to size $(m+n+p)\times(m+n+p)$, but I still could not get any useful result.
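For what it's worth, a quick numerical experiment (done here in SymPy, with a hand-rolled `sylvester` helper and arbitrary example polynomials of my own choosing) does confirm the identity when the resultant is computed as the Sylvester determinant; I just cannot see how to turn it into a proof:

```python
from sympy import symbols, Poly, Matrix, expand

x = symbols('x')

def sylvester(p, q):
    """Sylvester matrix of p and q, built directly from their coefficients."""
    P, Q = Poly(p, x), Poly(q, x)
    dp, dq = P.degree(), Q.degree()
    a, b = P.all_coeffs(), Q.all_coeffs()                       # highest degree first
    rows = [[0]*i + a + [0]*(dq - 1 - i) for i in range(dq)]    # dq shifted copies of p
    rows += [[0]*i + b + [0]*(dp - 1 - i) for i in range(dp)]   # dp shifted copies of q
    return Matrix(rows)

def res(p, q):
    return sylvester(p, q).det()

f = x**2 + 3*x - 1
g = 2*x**3 - x + 5
h = x**2 + x + 7

print(res(expand(f*g), h) == res(f, h) * res(g, h))   # True
```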

Can you please help? Thank you!

1 Answer


The identity you are asking about is an immediate consequence of the following interpretation of the resultant (since $M_{fg}=M_fM_g$ and $\deg(fg)=\deg(f)+\deg(g)$):

Proposition. For nonzero polynomials $h,f$ in $X$ over a field $K$, one has $ \mathrm{res}(h,f)=\mathrm{lc}(h)^{\deg(f)}\det(M_f), $ where $M_f$ is the endomorphism of the $K$-vector space $K[X]/(h)$ defined by multiplication by (the image modulo $h$ of) $f$.
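Before proving this, here is a quick sanity check of both the proposition and the fact $M_{fg}=M_fM_g$ on a concrete example; this is only an illustration in SymPy, and the `mult_matrix` helper together with the sample polynomials are ad-hoc choices of mine:

```python
from sympy import symbols, Poly, Matrix, rem, resultant, expand

x = symbols('x')

def mult_matrix(f, h):
    """Matrix of 'multiply by f' on K[X]/(h), in the basis (X^(m-1), ..., X^0)."""
    m = Poly(h, x).degree()
    cols = []
    for i in range(m - 1, -1, -1):                     # basis vectors X^(m-1), ..., 1
        c = Poly(rem(f * x**i, h, x), x).all_coeffs()
        cols.append([0]*(m - len(c)) + c)              # pad to length m
    return Matrix(cols).T                              # columns = images of basis vectors

h = 3*x**3 + x**2 - 2*x + 4
f = 2*x**2 + x - 5
g = x**3 - x + 1

Mf, Mg, Mfg = mult_matrix(f, h), mult_matrix(g, h), mult_matrix(expand(f*g), h)

# The Proposition: res(h, f) = lc(h)^deg(f) * det(M_f)
print(resultant(h, f, x) == Poly(h, x).LC()**Poly(f, x).degree() * Mf.det())  # True

# Multiplication operators compose, so M_{fg} = M_f M_g and det is multiplicative
print(Mfg == Mf * Mg, Mfg.det() == Mf.det() * Mg.det())                       # True True
```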

To prove this, fix the polynomial $h=h_mX^m+\cdots+h_0$, fix another $n\in\mathbf{N}$ (later taken to be the degree of $f$), and consider any $m$-tuple $(P_1,\ldots,P_m)$ of polynomials of degree less than $m+n$, say $P_i=\sum_{j=0}^{m+n-1}c_{i,j}X^j$. Let $(R_1,\ldots,R_m)$ be their images in $K[X]/(h)$; expressed on the basis $(X^{m-1},\ldots,X^0)$ of that space, the coefficients of $R_i$ are those of the remainder of $P_i$ after division by $h$. Now I claim that $h_m^n\det\nolimits_{(X^{m-1},\ldots,X^0)}(R_1,\ldots,R_m)$ equals

$ \left| \begin{matrix} h_m&h_{m-1}&\ldots&h_0&0&\ldots&0\\ 0&h_m&h_{m-1}&\ldots&h_0&\ddots&\vdots \\ \vdots&\ddots&\ddots&\ddots&\ddots&\ddots&0\\ 0&\ldots&0&h_m&h_{m-1}&\ldots&h_0\\ c_{1,m+n-1}&\ldots&&&\ldots&c_{1,1}&c_{1,0}\\ c_{2,m+n-1}&\ldots&&&\ldots&c_{2,1}&c_{2,0}\\ \vdots &&&&&&\vdots \\ c_{m,m+n-1}&\ldots&&&\ldots&c_{m,1}&c_{m,0}\\ \end{matrix} \right|. $ It suffices to check that as a function of $(P_1,\ldots,P_m)$ the latter determinant

  • does not change when reducing any of its polynomials modulo $h$, so that it defines a function on $(K[X]/(h))^m$ (this is a consequence of the presence of the first $n$ rows);
  • is $m$-linear over $K$ and alternating;
  • takes the value $h_m^n$ when $(P_1,\ldots,P_m)=(X^{m-1},\ldots,X^0)$.
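(If a numeric illustration of the claim helps: the following SymPy snippet, with $h$, $n$ and the $P_i$ chosen arbitrarily and a small ad-hoc `coeffs` padding helper, compares $h_m^n\det\nolimits_{(X^{m-1},\ldots,X^0)}(R_1,\ldots,R_m)$ with the displayed $(m+n)\times(m+n)$ determinant.)

```python
from sympy import symbols, Poly, Matrix, rem

x = symbols('x')

h = 2*x**3 - x**2 + 4*x + 1            # m = 3, leading coefficient h_m = 2
m, n = 3, 2                            # so the P_i may have degree up to m+n-1 = 4

P = [x**4 + 2*x**3 - x + 3,
     2*x**4 - x**2 + 5,
     x**3 + x**2 + x + 1]              # any m polynomials of degree less than m+n

def coeffs(p, length):
    """Coefficient list of p from degree length-1 down to 0, padded with zeros."""
    c = Poly(p, x).all_coeffs()
    return [0]*(length - len(c)) + c

# the (m+n) x (m+n) determinant displayed above
rows = [[0]*i + coeffs(h, m + 1) + [0]*(n - 1 - i) for i in range(n)]
rows += [coeffs(p, m + n) for p in P]
big_det = Matrix(rows).det()

# h_m^n times the determinant of the remainders R_i = P_i mod h
R = Matrix([coeffs(rem(p, h, x), m) for p in P])
print(Poly(h, x).LC()**n * R.det() == big_det)     # True
```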

Now to obtain the proposition, apply this with $n=\deg(f)$ and $P_i=fX^{m-i}$ for $i=1,\ldots,m$; in this case $\det\nolimits_{(X^{m-1},\ldots,X^0)}(R_1,\ldots,R_m)$ computes $\det(M_f)$ on the basis $(X^{m-1},\ldots,X^0)$.
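To make that specialization concrete (reusing the same illustrative $h$ and $f$ as in the sketch after the proposition): with $P_i=fX^{m-i}$ the rows $R_i$ are precisely the images of the basis vectors under multiplication by $f$, so their determinant is $\det(M_f)$ and the displayed determinant evaluates to $\mathrm{lc}(h)^{\deg(f)}\det(M_f)=\mathrm{res}(h,f)$:

```python
from sympy import symbols, Poly, Matrix, rem, resultant

x = symbols('x')
h = 3*x**3 + x**2 - 2*x + 4
f = 2*x**2 + x - 5
m, n = Poly(h, x).degree(), Poly(f, x).degree()

def coeffs(p, length):
    c = Poly(p, x).all_coeffs()
    return [0]*(length - len(c)) + c

# rows R_i = remainder of f*X^(m-i) mod h, i = 1, ..., m, on the basis (X^(m-1), ..., X^0)
R = Matrix([coeffs(rem(f * x**(m - i), h, x), m) for i in range(1, m + 1)])
print(Poly(h, x).LC()**n * R.det() == resultant(h, f, x))   # True
```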