25

My linear algebra professor proved that

Diagonalizable matrices with complex entries are dense in the set of $n \times n$ complex matrices.

He defined a metric (I believe) that was somehow related to the usual metric on $\mathbb{R}^{n^2}$.

Then he somehow proved that diagonalizable matrices are dense, using the fact that if $\det(A - \lambda I) = 0$ for $\lambda$ in an open subset, then $\det(A - \lambda I)$ is the zero polynomial.

I Googled around a bit and found some stuff talking about the Zariski topology, and I am not sure this is what I want. Most of what I have found on this topology is much more general than what he was doing.

Does anyone know any resources that would explain how to go about proving this statement?

Edit: I found out how to prove this the way my professor did.

We define the norm of a matrix by $$ |A| = \max\{|Ax| : |x| = 1 \}.$$ This gives a distance $d(A, B) = |A - B|$. One can prove that if $|(A - B)_{ij}| < \epsilon/n$ for all $i, j$, then $|A - B| < \epsilon$.
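This bound follows from $|A| \le \|A\|_F \le n \max_{i,j} |A_{ij}|$, where $\|\cdot\|_F$ is the Frobenius norm. A quick numerical sanity check (a sketch with NumPy; the operator norm above is the spectral norm, `np.linalg.norm(M, 2)`):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))  # plays the role of A - B

entry_max = np.max(np.abs(M))   # max |(A - B)_ij|
op_norm = np.linalg.norm(M, 2)  # |A - B| = max{|(A - B)x| : |x| = 1}

# If every entry is below eps/n, the operator norm is below eps,
# because op_norm <= Frobenius norm <= n * entry_max:
assert op_norm <= n * entry_max
# Conversely each entry is bounded by the operator norm, |M_ij| = |e_i^T M e_j|:
assert entry_max <= op_norm + 1e-12
```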

Let $p(x)$ be the characteristic polynomial of $A$, an $n \times n$ matrix: $$p(x) = \prod_{i=1}^n (x - x_i) = x^n - \sigma_1 x^{n - 1} + \cdots + (-1)^n\sigma_n$$ where $\sigma_1, \ldots, \sigma_n$ are the elementary symmetric polynomials in the eigenvalues $x_1, \ldots, x_n$.

The discriminant of the characteristic polynomial is a symmetric polynomial, therefore it can be written in terms of the elementary symmetric polynomials, which in turn can be written in terms of the entries of the matrix.

But the discriminant is a nonzero polynomial in the entries of the matrix, and a nonzero polynomial cannot vanish on an open set (along any line it has only finitely many roots). Therefore, for any $\epsilon > 0$, by changing the entries of $A$ by less than $\epsilon / n$ we can find a matrix $B$ such that $|B - A| < \epsilon$ and the discriminant of $B$'s characteristic polynomial is nonzero.

A nonzero discriminant means the eigenvalues of $B$ are distinct, so $B$ has a basis of eigenvectors. Therefore it is diagonalizable.

Thus the set of diagonalizable matrices is dense in the set of matrices with respect to that metric.
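As a numerical illustration of this argument (a sketch with NumPy; for $2 \times 2$ matrices the discriminant of the characteristic polynomial is simply $\operatorname{tr}(A)^2 - 4\det(A)$):

```python
import numpy as np

# A Jordan block: repeated eigenvalue 1, only one eigenvector, not diagonalizable.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eps = 1e-3
# Perturb a single entry by eps/2 < eps/n (here n = 2), so |B - A| < eps.
B = A + np.array([[0.0, 0.0],
                  [0.0, eps / 2]])

# Discriminant of the characteristic polynomial x^2 - tr(M) x + det(M).
def char_disc(M):
    return np.trace(M) ** 2 - 4.0 * np.linalg.det(M)

disc_A = char_disc(A)  # zero: repeated eigenvalue
disc_B = char_disc(B)  # nonzero: B has two distinct eigenvalues

eig_B = np.linalg.eigvals(B)
assert abs(disc_A) < 1e-12
assert disc_B > 1e-8
assert abs(eig_B[0] - eig_B[1]) > 1e-4  # distinct, so B is diagonalizable
```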

  • 2
    Theorems 2.6--2.8 at http://www.math.uconn.edu/~kconrad/math316s08/univid.pdf are related to your question. They don't exactly address it, but they give a reason such a theme is useful. (2012-02-10)
  • 0
    Your paragraph starting with "Then somehow..." seems like a pretty clear proof, provided you know some things about the nature of polynomials. Maybe that's what you need to work on? (2012-02-10)
  • 0
    Updated with roughly the solution provided in class. With some errors possibly, but the idea is there. (2012-02-12)
  • 0
    @KCd This file is no longer available. :-( (2012-11-02)
  • 1
    @vesszabo: Look at http://www.math.uconn.edu/~kconrad/blurbs/linmultialg/univid.pdf. (2012-11-03)
  • 0
    @KCd Many thanks! Your idea about identities is interesting and useful. (2012-11-04)
  • 0
    @zrbecker: another proof is sketched here: http://math.stackexchange.com/q/397915 (2016-08-19)

3 Answers

21

I cannot really follow the reasoning you are hinting at in your question, but here's my take:

To talk about density you need a topology. Since $M_n(\mathbb{C})$, the space of complex $n\times n$ matrices, is finite-dimensional, a very natural notion of convergence is entry-wise; so we can consider the metric $$ d(A,B)=\max\{ |A_{kj}-B_{kj}|\ : k,j=1,\ldots,n\}, \ \ \ A,B\in M_n(\mathbb{C}). $$ It is not hard to check that for any matrix $C$, $$ d(CA,CB)\leq d(A,B)\,\sum_{k,j=1}^n |C_{kj}|, $$ and the same inequality holds for multiplication on the right (this will be used in the last inequality below).

Now take any $A\in M_n(\mathbb{C})$. Let $J$ be its Jordan canonical form; then there exists a non-singular matrix $S$ such that $J=SAS^{-1}$. Fix $\varepsilon>0$. Let $$ m=\left(\sum_{k,j=1}^n |S_{kj}|\right)\,\left(\sum_{k,j=1}^n |(S^{-1})_{kj}|\right). $$

Now, the matrix $J$ is upper triangular, so its eigenvalues (which are those of $A$) are the diagonal entries. Let $J'$ be the matrix obtained from $J$ by perturbing the diagonal entries of $J$ by less than $\varepsilon/m$ in such a way that all the diagonal entries of $J'$ are distinct.

But now $J'$ is diagonalizable, since it has $n$ distinct eigenvalues. And $d(J,J')<\varepsilon/m$. Then $S^{-1}J'S$ is diagonalizable and $$ d(S^{-1}J'S,A)=d(S^{-1}J'S,S^{-1}JS)\leq m\,d(J',J)<\varepsilon. $$
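This argument can be checked numerically. Computing a Jordan form in floating point is unstable, so the sketch below instead fixes an explicit similarity $A = S^{-1} J S$ with a known non-diagonalizable $J$ (the particular $S$ and $J$ are made-up examples), perturbs the diagonal of $J$, and verifies the distance estimate:

```python
import numpy as np

# A known similarity A = S^{-1} J S, with J non-diagonalizable (2x2 Jordan block).
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])
S = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0]])
S_inv = np.linalg.inv(S)
A = S_inv @ J @ S

# m = (sum of |S_kj|) * (sum of |(S^{-1})_kj|), as in the answer.
m = np.abs(S).sum() * np.abs(S_inv).sum()

eps = 1e-3
delta = eps / (2 * m)
# Perturb J's diagonal by at most delta so its diagonal entries become distinct.
J_pert = J + (delta / 3) * np.diag([1.0, 2.0, 3.0])

B = S_inv @ J_pert @ S

# d(B, A) <= m * d(J_pert, J) < eps in the entrywise max metric.
d = np.max(np.abs(B - A))
assert d < eps

# B inherits the distinct eigenvalues of J_pert, hence is diagonalizable.
w = np.sort(np.linalg.eigvals(B).real)
assert w[1] - w[0] > 1e-7 and w[2] - w[1] > 1e-7
```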

  • 2
    Just to keep things simple, I will point out that the proof can be broken into two simple to understand distinct facts: Part 1: matrix multiplication and inversion are continuous. Part 2: The set of upper triangular matrices with distinct diagonal entries is dense in the set of upper triangular matrices... (2012-02-11)
  • 1
    +1 for emphasizing "to talk about density you need a topology." I can't really follow the asker's reasoning either, but I'd guess that at least some of the asker's confusion comes from the fact that the metric is not just a tool the asker's instructor used to *prove* the statement, but part of the structure that defined the *meaning* of the statement. (2012-02-11)
  • 1
    Thanks, this clarifies a lot. I am not sure if my professor used Jordan form, but I do remember him mentioning something about it. We haven't proved that we can do it yet, I think, but he may have used it. Your explanation makes sense to me though. Thanks. (2012-02-11)
  • 0
    Thanks, leslie. I guess that the confusion at an elementary level comes from the fact that one usually takes the topology in $\mathbb{R}$ for granted, and so it requires a leap in maturity to recognize that topologies, however "natural" they might be, are arbitrary. (2012-02-11)
  • 0
    @Seatraced: you are welcome. It wouldn't surprise me that there are better proofs of this fact, but this one is the first one that came to mind. (2012-02-11)
  • 0
    In the definition of $m$ there is a typo; if I understand correctly, it should be $(S^{-1})_{kj}$. (2012-11-02)
  • 0
    Yes, I'll write it that way as it is probably clearer. (2012-11-02)
  • 0
    Thanks :-) Your proof is very elegant. (2012-11-02)
15

Let $X$ be the set of diagonalizable matrices in $M_n(\mathbb C)$, and $Y$ the set of those matrices in $M_n(\mathbb C)$ which have $n$ distinct eigenvalues.

As $Y\subset X\subset M_n(\mathbb C)$, it suffices to show that $Y$ is dense in $M_n(\mathbb C)$.

Proof 1. Let $A$ be in $M_n(\mathbb C)$, and let $U$ be a neighborhood of $A$ in $M_n(\mathbb C)$. It suffices to check that $U$ intersects $Y$.

As $A$ is similar to a triangular matrix, we can assume that $A$ is triangular.

The diagonal entries of a triangular matrix being its eigenvalues, there is a diagonal matrix $D$ such that $A+D$ is in $U\cap Y$. QED

Proof 2. If you know the notion of discriminant of a univariate polynomial, you can argue as follows.

The discriminant $d(A)$ of the characteristic polynomial of $A$ being a nonzero polynomial, with complex (in fact integer) coefficients, in the entries of $A$, the set $$ Y=M_n(\mathbb C)\setminus d^{-1}(0) $$ is dense in $M_n(\mathbb C)$. QED
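For $n=2$ this discriminant can be written out explicitly: the characteristic polynomial of $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ is $x^2 - (a+d)x + (ad - bc)$, whose discriminant is $(a-d)^2 + 4bc$, visibly a nonzero polynomial with integer coefficients in the entries. A symbolic sketch, assuming SymPy is available:

```python
from sympy import symbols, Matrix, discriminant, expand

a, b, c, d, x = symbols('a b c d x')
M = Matrix([[a, b], [c, d]])

# Characteristic polynomial x^2 - (a + d) x + (a d - b c).
p = M.charpoly(x).as_expr()

# Its discriminant is a polynomial with integer coefficients in the entries:
# (a - d)^2 + 4 b c, which is not the zero polynomial.
disc = discriminant(p, x)
assert expand(disc - ((a - d)**2 + 4*b*c)) == 0
```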

  • 0
    For Proof 1, why do I know such a $D$ exists? Proof 2 actually seems like what my professor was doing. Although I am not sure why we know $Y$ is dense in $M_n(\mathbb{C})$. (2012-02-11)
  • 0
    Dear @Setraced: For Proof 2: If $f:\mathbb C^n\to\mathbb C$ is a nonzero polynomial function, then the interior of $Z:=f^{-1}(0)$ is empty. Indeed, let $a$ be in $Z$, and let $b$ be in $\mathbb C^n\setminus Z$. The restriction of $f$ to the line through $a$ and $b$, being a nonzero polynomial function, has only finitely many zeros. (2012-02-11)
  • 1
    Dear @Danikar: Having realized that my previous comment about Proof 1 was badly written, I deleted it and replaced it by the following observation. If $A\in M_n(\mathbb C)$ is triangular, then there are only finitely many values of $z\in\mathbb C$ for which $$A+z\ \text{diag}(1,2,\dots,n)$$ has repeated eigenvalues. (2012-02-11)