Factorise the determinant $\det\begin{pmatrix} a^3+a^2 & a & 1 \\ b^3+b^2 & b & 1 \\ c^3+c^2 & c &1\end{pmatrix}$.
My textbook only provides two simple examples.
I really have no idea how to approach this type of question.
$\det\begin{pmatrix} a^3+a^2 & a & 1 \\ b^3+b^2 & b & 1 \\ c^3+c^2 & c &1\end{pmatrix}$
$=\det\begin{pmatrix} a^3+a^2 & a & 1 \\ b^3+b^2-(a^3+a^2) & b-a & 1-1 \\ c^3+c^2-(a^3+a^2) & c-a &1-1\end{pmatrix} $ (applying $R_2'=R_2-R_1$ and $R_3'=R_3-R_1$)
$=(b-a)(c-a) \det\begin{pmatrix} a^3+a^2 & a & 1 \\ b^2+a^2+ab+b+a & 1 & 0 \\ c^2+a^2+ca+c+a & 1 & 0\end{pmatrix}$ (taking the common factor $(b-a)$ out of $R_2'$ and $(c-a)$ out of $R_3'$)
$=(b-a)(c-a)\cdot1\cdot \det\begin{pmatrix} b^2+a^2+ab+b+a & 1 \\ c^2+a^2+ca+c+a & 1\end{pmatrix}$ (expanding along the third column)
$=(b-a)(c-a)(b-c)(a+b+c+1)$
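Spelling out the last step: the $2\times2$ determinant is $$(b^2+a^2+ab+b+a)-(c^2+a^2+ca+c+a)=(b^2-c^2)+a(b-c)+(b-c)=(b-c)(a+b+c+1),$$ which gives the factorisation above.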
Regard $b$ and $c$ as constants, and the determinant as a polynomial in $a$. Then find ways of making the determinant equal to 0, and by the Factor Theorem you'll get a factor of the determinant.
Obvious choices: set $a=b$ or $a=c$ and you'll have two identical rows, so $(a-b)$ and $(a-c)$ are factors.
We can clearly see that permuting the variables is the same as permuting the rows, and hence only changes the determinant by a sign. Hence permuting the variables in any factor gives another factor, so $(b-c)$ is a factor.
What's left? Well, we can see without much difficulty that the determinant is cubic in $a$, and the coefficient of $a^3$ is $(b-c)$. We've already taken out a factor of $(b-c)$ and two monic linear factors, so whatever's left is linear and monic in $a$ and, by similar arguments, must also be so in $b$ and $c$. Hence it pretty much has to be $a + b + c + k$ for some constant $k$.

So we've got as far as the following: $\det\begin{pmatrix} a^3+a^2 & a & 1 \\ b^3+b^2 & b & 1 \\ c^3+c^2 & c &1\end{pmatrix}=(a-b)(a-c)(b-c)(a+b+c+k)$ for some $k$, and all we need to do is find $k$.

I first tried setting $b=c=0$, but that just leaves you with $0k = 0$, so instead let's set $b=1$, $c=0$: $\begin{align} \det\begin{pmatrix} a^3+a^2 & a & 1 \\ 2 & 1 & 1 \\ 0 & 0 &1\end{pmatrix}&=a(a-1)(a+1+k) \\ \det\begin{pmatrix} a^3+a^2 & a \\ 2 & 1\end{pmatrix}&=a(a-1)(a+1+k) \\ a^3 + a^2 - 2a &=a(a-1)(a+1+k) \\ a^2 + a - 2 &= (a-1)(a+1+k)\end{align}$

Comparing constant terms, $-2 = -1-k$, so $k=1$, and we're done!
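If you want to double-check the final factorisation with a CAS, here is a minimal SymPy sketch (assuming SymPy is available; treat it as a sanity check, not part of the argument):

```python
from sympy import symbols, Matrix, factor

a, b, c = symbols('a b c')

# The determinant from the question
M = Matrix([
    [a**3 + a**2, a, 1],
    [b**3 + b**2, b, 1],
    [c**3 + c**2, c, 1],
])

# Prints the factored determinant; up to the ordering of the factors
# this should be (a - b)*(a - c)*(b - c)*(a + b + c + 1)
print(factor(M.det()))
```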
$\begin{align*} \det\begin{pmatrix} a^3+a^2 & a & 1 \\ b^3+b^2 & b & 1 \\ c^3+c^2 & c &1\end{pmatrix} & = \det\begin{pmatrix} a^2 & a & 1 \\ b^2 & b & 1 \\ c^2 & c &1\end{pmatrix}+\det\begin{pmatrix} a^3 & a & 1 \\ b^3 & b & 1 \\ c^3 & c &1\end{pmatrix}\\ & = -(a-b)(b-c)(c-a) -(a-b)(b-c)(c-a)(a+b+c)\\ &= (a-b)(b-c)(a-c)(a+b+c+1) \end{align*}$
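For reference, the first determinant in the splitting is a (column-reversed) Vandermonde determinant; expanding along the top row, $$\det\begin{pmatrix} a^2 & a & 1 \\ b^2 & b & 1 \\ c^2 & c &1\end{pmatrix}=a^2(b-c)-a(b^2-c^2)+bc(b-c)=(b-c)(a-b)(a-c)=-(a-b)(b-c)(c-a).$$ The cubic determinant is handled in the edit below.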
Hey guys, editing my post to address concerns. You're right, I didn't put in all the details because I thought the method was clear once you've seen Vandermonde Determinants. Here it is explicitly: $\begin{align*} \det\begin{pmatrix} a^3 & a & 1 \\ b^3 & b & 1 \\ c^3 & c &1\end{pmatrix} & = \det\begin{pmatrix} a^3+(b+c)a^2 & a & 1 \\ b^3 +(a+c)b^2& b & 1 \\ c^3+(a+b)c^2 & c &1\end{pmatrix}\\ & = \sum_{cyc}\det\begin{pmatrix} a^3 & a & 1 \\ ab^2 & b & 1 \\ ac^2 & c &1\end{pmatrix} & \\ &= \sum_{cyc}a\det\begin{pmatrix} a^2 & a & 1 \\ b^2 & b & 1 \\ c^2 & c &1\end{pmatrix} \end{align*}$ (At this point we have shown the result.)
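Since a cyclic permutation of the three rows is an even permutation, each term of the cyclic sum contains the same Vandermonde determinant, so the sum collapses to $$\det\begin{pmatrix} a^3 & a & 1 \\ b^3 & b & 1 \\ c^3 & c &1\end{pmatrix}=(a+b+c)\det\begin{pmatrix} a^2 & a & 1 \\ b^2 & b & 1 \\ c^2 & c &1\end{pmatrix},$$ which, together with the splitting in the first display, gives the stated factorisation.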
Okay, so I guessed intuitively that $\det\begin{pmatrix} (b+c)a^2 & a & 1 \\ (a+c)b^2& b & 1 \\ (a+b)c^2 & c &1\end{pmatrix}=0$. But it's easy to prove, by exhibiting a vector in the kernel (an eigenvector for the eigenvalue $0$): $\begin{pmatrix}1 \\ -(ab+bc+ca) \\ abc\end{pmatrix}$
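Checking this for the first row (the other two rows are the same computation with the variables permuted): $$(b+c)a^2\cdot1 + a\cdot\bigl(-(ab+bc+ca)\bigr) + 1\cdot abc = a^2b + a^2c - a^2b - abc - a^2c + abc = 0.$$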