
Most people learn in linear algebra that it's possible to calculate the eigenvalues of a matrix by finding the roots of its characteristic polynomial. However, this method is actually very slow, and while it's easy to remember and a person can carry it out by hand, there are many better techniques available (which do not rely on factoring a polynomial).
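
For instance, here's a quick sketch of the contrast I mean (using NumPy; the matrix is just a random 6x6 example, nothing special):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))   # an arbitrary example matrix

# Textbook route: form the characteristic polynomial, then find its roots.
char_coeffs = np.poly(A)          # coefficients of det(x*I - A)
roots = np.roots(char_coeffs)

# Standard route: a dedicated eigenvalue algorithm (QR iteration).
eigvals = np.linalg.eigvals(A)

print(np.sort_complex(roots))
print(np.sort_complex(eigvals))
```

(As it happens, `np.roots` itself finds the roots as the eigenvalues of a companion matrix, so in floating point the polynomial route is really an eigenvalue computation in disguise, after a detour through the coefficients that can lose accuracy.)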

So I was wondering: why on earth is it actually important to have techniques available for solving polynomial equations? (To be specific, I mean solving over $\mathbb{C}$.)

I used to be fairly interested in how to do it, and I know a lot of the different methods people use. Thinking about it now, though, I'm not sure what sort of applications those techniques actually have.

  • Polynomial systems also arise in Computer Aided Geometric Design. (lhf, 2012-02-13)

2 Answers


One important consequence of being able to explicitly solve polynomial equations is that it permits great simplifications by linearizing what would otherwise be much more complicated nonlinear phenomena. The ability to factor polynomials completely into linear factors over $\mathbb C$ enables widespread linearization simplifications of diverse problems. An example familiar to any calculus student is the fact that integration of rational functions is much simpler over $\mathbb C$ (vs. $\mathbb R$) since partial fraction decompositions involve at most linear (vs. quadratic) polynomials in the denominator. Analogously, one may reduce higher-order constant coefficient differential and difference equations (i.e. recurrences) to linear (first-order) equations by factoring them as linear operators over $\mathbb C$ (i.e. "operator algebra").
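
To spell out a concrete instance of each reduction: over $\mathbb R$ the partial fraction decomposition of $\frac{1}{x^2+1}$ cannot be pushed past a quadratic denominator, while over $\mathbb C$ it splits into linear pieces,
$$\frac{1}{x^2+1} \;=\; \frac{1}{2i}\left(\frac{1}{x-i}-\frac{1}{x+i}\right),$$
each of which integrates to a logarithm. Likewise, the operator in $y'' + y = 0$ factors as $D^2 + 1 = (D - i)(D + i)$, so the second-order equation is solved by solving two first-order equations in succession.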

More generally, such simplification by linearization was at the heart of the development of abstract algebra. Namely, Dedekind, by abstracting out the essential linear structures (ideals and modules) in number theory, greatly simplified the prior nonlinear Gaussian theory (based on quadratic forms). This enabled him to exploit to the hilt the power of linear algebra. Examples abound of the revolutionary breakthroughs that this brought to number theory and algebra - e.g. it provided the methods needed to generalize the law of quadratic reciprocity to higher-order reciprocity laws - a longstanding problem that motivated much of the early work in number theory and algebra.


(This was supposed to be a comment, but it got too long.)

Robert raises a good question in the comments. Of course, general analytical expressions for roots of polynomials of high degree aren't really used much in practice: the expressions themselves are unwieldy, the special functions (theta functions, hypergeometric functions) involved in the closed-form expressions have to be numerically evaluated anyway, and for numerics there are a bunch of more efficient methods for getting a pile of roots than evaluating special functions.
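
(For a taste of what such a numerical method looks like, here is a minimal sketch of the Durand-Kerner/Weierstrass iteration, which refines approximations to all the roots simultaneously; the function name, starting values, and fixed iteration count are illustrative choices of mine, not anything from a particular library.)

```python
import numpy as np

def durand_kerner(coeffs, iters=100):
    """Approximate all complex roots of a polynomial.

    coeffs: coefficients from the highest degree down, leading coefficient nonzero.
    """
    c = np.asarray(coeffs, dtype=complex)
    c = c / c[0]                      # work with the monic polynomial
    n = len(c) - 1
    # Classic starting guesses: powers of a point that is neither real
    # nor a root of unity.
    z = (0.4 + 0.9j) ** np.arange(n)
    for _ in range(iters):
        for i in range(n):
            numer = np.polyval(c, z[i])
            denom = np.prod(z[i] - np.delete(z, i))
            z[i] -= numer / denom
    return z

# Example: the roots of x^3 - 6x^2 + 11x - 6 are 1, 2, 3.
print(durand_kerner([1, -6, 11, -6]))
```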

Now, there are a number of reasons for the interest in solving polynomials numerically: for instance, the behavior of solutions to difference and differential equations can be easily analyzed by looking at the roots of a "characteristic polynomial". In signal processing and a bunch of other applications, one is often interested in where in the complex plane the roots of a certain polynomial lie: whether they are located within a disk, or in the left or right half-plane (on the other hand, if one just wants to check for the existence of roots in such regions, there are computationally less intensive methods). As lhf mentions, CAGD often relies on solving polynomial equations, one application being finding intersections of shapes represented as piecewise polynomials.
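
To make the disk criterion concrete, here's a small illustration (the coefficients are made up purely for the example): a discrete-time filter with denominator polynomial $z^2 - 1.5z + 0.7$ is stable exactly when all the roots of that polynomial lie strictly inside the unit disk, which is a one-liner to check once the roots are in hand:

```python
import numpy as np

# Hypothetical denominator polynomial z^2 - 1.5 z + 0.7 of a discrete-time filter.
a = [1.0, -1.5, 0.7]
poles = np.roots(a)

print(poles)                                        # the two complex-conjugate poles
print("stable:", bool(np.all(np.abs(poles) < 1)))   # all strictly inside the unit disk?
```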

Eigenvalues of matrices, pencils, and matrix polynomials have a wide variety of applications, which I won't go into here; just search around.

So yes, there'll always be a use for a better way to find roots of polynomials than current methods.