
Here is an interesting problem. Perhaps someone would be so kind as to give me a shove in the right direction?

If $ax^{3}+3bx^{2}+3cx+d$ and $ax^{2}+2bx+c$ share a common root, then prove that $$(ac-b^{2})(bd-c^{2})\geq 0$$

I thought about equating coefficients somehow, but that got messy.

I used the quadratic formula on the given quadratic and found the two roots

$\displaystyle x=\frac{-b\pm\sqrt{b^{2}-ac}}{a}.$

So, if the cubic shares one of these roots, then I should be able to substitute it in for $x$ in the cubic.

Upon doing so, I got:

$$\frac{2(ac-b^{2})\sqrt{b^{2}-ac}}{a^{2}}-\frac{3bc}{a}+\frac{2b^{3}}{a^{2}}+d$$

This is where I got hung up. This may not even be a good way to go about it.

I see part of what is to be proven, namely $ac-b^{2}$, in the above expression.

Setting it to 0 does not help much.

I also thought about dividing them. If they share a root, then after cancelling the common factor the fraction should reduce to a quadratic in the numerator over a linear denominator. But then what?

Can anyone give a hint as to the best way to proceed?
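For anyone who wants to check the algebra, here is a quick sympy sketch of the substitution above; the symbol `s` is just my shorthand for $\sqrt{b^{2}-ac}$, and the whole thing is only a sanity check, not part of the proof.

```python
import sympy as sp

# s stands for sqrt(b**2 - a*c), the radical in the quadratic's roots
a, b, c, d, x, s = sp.symbols('a b c d x s')

cubic = a*x**3 + 3*b*x**2 + 3*c*x + d

# plug the root x = (-b + s)/a of a*x**2 + 2*b*x + c into the cubic,
# then use s**2 = b**2 - a*c (and hence s**3 = (b**2 - a*c)*s) to reduce
value = sp.expand(cubic.subs(x, (-b + s)/a))
value = value.subs(s**3, (b**2 - a*c)*s).subs(s**2, b**2 - a*c)

claimed = 2*(a*c - b**2)*s/a**2 - 3*b*c/a + 2*b**3/a**2 + d
print(sp.expand(value - claimed))   # prints 0, so the expression above checks out
```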

  • Heard about resultants? (2011-06-02)

2 Answers

9

We begin by using a single step of the Euclidean algorithm. Let $f(x) = ax^3 + 3bx^2 + 3cx + d$ and $g(x) = ax^2 + 2bx + c$. If $f$ and $g$ share a common root, then the polynomial $$ h(x) \;=\; f(x) - xg(x) \;=\; bx^2 + 2cx + d $$ must share that root as well. Now, the roots of $g(x)$ are real when $ac-b^2 \leq 0$, and complex when $ac-b^2 > 0$. Similarly, the roots of $h(x)$ are real when $bd-c^2 \leq 0$, and complex when $bd-c^2 > 0$. If these two polynomials share a common root, it follows that $ac-b^2$ and $bd-c^2$ are either both positive or both nonpositive, and therefore $(ac-b^2)(bd-c^2)\geq 0$.
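Not needed for the argument, but here is a small sympy sketch checking both the Euclidean step and the two discriminants (the symbols follow the notation above):

```python
import sympy as sp

a, b, c, d, x = sp.symbols('a b c d x')
f = a*x**3 + 3*b*x**2 + 3*c*x + d
g = a*x**2 + 2*b*x + c

# the single Euclidean step: f(x) - x*g(x) collapses to the quadratic h(x)
h = sp.expand(f - x*g)
print(h)                        # b*x**2 + 2*c*x + d

# the discriminants that decide whether the roots of g and h are real
print(sp.discriminant(g, x))    # equals 4*(b**2 - a*c) = -4*(a*c - b**2)
print(sp.discriminant(h, x))    # equals 4*(c**2 - b*d) = -4*(b*d - c**2)
```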

  • Thanks very much. Nice. I played around with the discriminant because I noticed $ac-b^{2}$ was just the negative of $b^{2}-ac$, but I failed to put it together. I certainly did not notice the $f(x)-xg(x)$. I subtracted them, but did not multiply the quadratic by $x$. (2011-06-02)
7

Hint: Notice anything coincidental about the derivative of the cubic?

Suppose that $\phi$ is the common root. Let $f(x)=ax^3+3bx^2+3cx+d$ and let $g(x)=ax^2+2bx+c$. Notice that $f'(x)=3g(x)$, so that $f'(\phi)=0$ as well. Hence $\phi$ is a double root of $f$. Since $f$ is a cubic, and complex roots come in conjugate pairs, it follows that $\phi$ is real, and hence all the roots of these polynomials are real.
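(A quick machine check of that identity, just a sketch with sympy:)

```python
import sympy as sp

a, b, c, d, x = sp.symbols('a b c d x')
f = a*x**3 + 3*b*x**2 + 3*c*x + d
g = a*x**2 + 2*b*x + c

# the coincidence used above: the derivative of the cubic is exactly 3*g(x)
print(sp.expand(f.diff(x) - 3*g))   # prints 0
```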

In particular this implies something a bit stronger, that both $c^2-bd\geq 0$ and $b^2-ac\geq 0$. (To get $b^2-ac\geq 0$, look at the discriminant. I leave showing that $c^2-bd\geq 0$ to you. (I have a solution if you really want.))

Hope that helps,

Edit: Why do we have $c^2-bd\geq 0$? Here is the immediate brute-force way; there is probably a nicer solution. The cubic has $\phi$ as a root with multiplicity $2$, and a third root, call it $\gamma$. Then since $c=a \frac{\phi^2 +\phi\gamma+\phi\gamma}{3}$, $b=-a\frac{\phi+\phi+\gamma}{3}$ and $d=-a\phi^2\gamma$, it follows (after cancelling the common factor $a^2>0$) that $c^2 - bd \geq 0$ is equivalent to $$\phi^2 \gamma \left(\frac{\phi+\phi+\gamma}{3}\right)\leq\left(\frac{\phi^2+\phi \gamma+ \phi\gamma}{3}\right)^2.$$ If $\phi=0$ this is immediate; otherwise, dividing by $\phi^2$ and multiplying by $9$, we get $$3\gamma (2\phi + \gamma) \leq (\phi+2\gamma)^2,$$ which is in turn equivalent to $$0\leq\phi^2 -2\phi\gamma+\gamma^2.$$ This last line clearly holds since it is a square.
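Here is a small sympy sketch of that computation (keeping the $a$'s, which cancel in the inequality); both $c^2-bd$ and $b^2-ac$ come out as $a^2/9$ times a perfect square:

```python
import sympy as sp

a, phi, gamma = sp.symbols('a phi gamma')

# b, c, d read off from a*(x - phi)**2*(x - gamma) = a*x**3 + 3*b*x**2 + 3*c*x + d
b = -a*(2*phi + gamma)/3
c = a*(phi**2 + 2*phi*gamma)/3
d = -a*phi**2*gamma

# both quantities factor as a**2/9 times a square, hence are nonnegative
print(sp.factor(c**2 - b*d))    # a**2 * phi**2 * (phi - gamma)**2 / 9
print(sp.factor(b**2 - a*c))    # a**2 * (phi - gamma)**2 / 9
```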

  • Thanks very much. I suppose the $c^{2}-bd$ comes from the discriminant of $bx^{2}+2cx+d$ that JB mentioned. I played around with the discriminants but failed to put it all together. Thanks much. (2011-06-02)
  • @Cody: I added a solution explaining why $c^2-bd\geq 0$. It is the brute force approach, and just follows from writing things in terms of the roots. (2011-06-02)
  • Wow, very nice Eric. Thanks. May I ask one more question? How did you know what $b$, $c$, and $d$ were in terms of the roots you called $\phi$ and $\gamma$? That's interesting. But how did you arrive at those? Sorry if this is a stupid question. (2011-06-02)
  • @Cody: It is a good question. So we have the root $\phi$ twice, and $\gamma$ once, so we _must_ have $$a(x-\phi)^2(x-\gamma)=ax^3+3bx^2+3cx+d.$$ From here you can just expand, and I get what I wrote above. But this isn't how I did it; I wrote it down immediately by knowing Vieta's relations for the roots of a polynomial. They are useful, and intuitive: http://www.artofproblemsolving.com/Wiki/index.php/Vieta%27s_Formulas (2011-06-02)
  • Wow, I just used what you wrote to try to derive $c$, $b$, and $d$, but I did not get that. I neglected to include the leading $a$ on the LHS. I need to brush up on some polynomial tricks. Thanks much. Fun problem. I learned something. (2011-06-02)
  • @Cody: Since the inequality would have the $a$'s cancel no matter what, I didn't pay too much attention. On a second glance, it appears that the $a$'s should be in the numerators. (2011-06-02)
  • Yes, Eric. I had equated coefficients and derived $b$, $c$, and $d$. It was in front of my nose all along. I also looked up Vieta's formulas, and I see what you mean. I know those for a quadratic, but not for a cubic. Thanks. I learn something every day. By equating coefficients, I got $c=\frac{a({\phi}^{2}+2{\phi}{\gamma})}{3}$ and so on, the same as with Vieta. Thanks for that. (A sketch of this coefficient comparison appears below.) (2011-06-02)
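For completeness, here is a short sympy sketch of the coefficient comparison discussed in the comments above; `phi` and `gamma` name the double and simple root, as in the answer.

```python
import sympy as sp

a, b, c, d, x, phi, gamma = sp.symbols('a b c d x phi gamma')

# expand a*(x - phi)**2*(x - gamma) and equate coefficients with a*x**3 + 3*b*x**2 + 3*c*x + d
delta = sp.expand(a*(x - phi)**2*(x - gamma) - (a*x**3 + 3*b*x**2 + 3*c*x + d))

# the x**3 terms already cancel; solve the remaining coefficient equations for b, c, d
sol = sp.solve([delta.coeff(x, k) for k in range(3)], [b, c, d])
print(sol)   # gives b, c, d in terms of phi and gamma, matching the formulas in the comments
```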