
What are easy and quick ways to verify determinant, minimal polynomial, characteristic polynomial, eigenvalues, eigenvectors after calculating them?

So if I have calculated the determinant, minimal polynomial, characteristic polynomial, eigenvalues, and eigenvectors, what are ways to be sure that I didn't make a major mistake? I don't want to verify my solutions all the way through; I just want a quick check that makes it highly likely that the calculated determinant etc. is correct.


Let $A \in \operatorname{Mat}(n, \mathbb{C})$ be a matrix,

let $\det(A)$ be the determinant of matrix $A$,

let $v_1, v_2, \ldots, v_k$ be eigenvectors of matrix $A$,

let $\lambda_1, \lambda_2, \ldots, \lambda_n$ be eigenvalues of matrix $A$,

let $\chi_A(t) = t^n + a_{n-1}t^{n-1}+\cdots + a_0 = (t-\lambda_1)\cdots(t-\lambda_n)$ be the characteristic polynomial of matrix $A$,

let $\mu_A(t)$ be the minimal polynomial of matrix $A$.


Verifications suggested so far:

eigenvectors / eigenvalues

  • $\det(A) = \lambda_1^{m_1} \lambda_2^{m_2} \cdots \lambda_l^{m_l}$ where $\lambda_1, \ldots, \lambda_l$ are the distinct eigenvalues and $m_i$ is the algebraic multiplicity of the corresponding eigenvalue
  • $a_0 = (-1)^n\lambda_1\cdots\lambda_n$
  • eigenvectors can be verified by multiplying with the matrix; the eigenvalues can be verified at the same time; i.e. $A v_i = \lambda_i v_i$
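The multiply-back check above is cheap to automate. A minimal sketch with NumPy, on a made-up matrix with hand-computed eigenpairs (the matrix and eigenpairs here are illustrative, not from the question):

```python
import numpy as np

# An upper-triangular example, so the eigenvalues can be read off the diagonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Candidate eigenpairs "computed by hand":
# lambda = 2 with v = (1, 0), and lambda = 3 with v = (1, 1).
pairs = [(2.0, np.array([1.0, 0.0])),
         (3.0, np.array([1.0, 1.0]))]

# Each pair must satisfy A v = lambda v (up to floating-point rounding).
for lam, v in pairs:
    assert np.allclose(A @ v, lam * v)

print("all eigenpairs check out")
```

One matrix-vector product per eigenpair, so this scales to large hand calculations too.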

determinant

  • $\det(A) = \lambda_1^{m_1} \lambda_2^{m_2} \cdots \lambda_l^{m_l}$ where $m_i$ is the multiplicity of the corresponding eigenvalue
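The determinant check can likewise be done numerically. A sketch, again with an illustrative triangular matrix whose eigenvalues and multiplicities are assumed to have been found by hand:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Distinct eigenvalues with their algebraic multiplicities, found by hand:
# lambda_1 = 2 (multiplicity 2), lambda_2 = 5 (multiplicity 1).
eigs = [(2.0, 2), (5.0, 1)]

# det(A) should equal the product of eigenvalues counted with multiplicity.
product = 1.0
for lam, m in eigs:
    product *= lam ** m

assert np.isclose(np.linalg.det(A), product)  # here 2^2 * 5 = 20
```

If the two numbers disagree, either the determinant or an eigenvalue (or a multiplicity) is wrong, which narrows the search considerably.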

characteristic / minimal polynomial

  • $a_0 = (-1)^n\lambda_1\cdots\lambda_n$
  • $\mu_A(A) = 0$ and $\chi_A(A) = 0$
  • $\mu_A(t) \mid \chi_A(t)$
  • The product of the eigenvalues (counting multiplicity) equals the determinant. The minimal polynomial should divide the characteristic polynomial. The constant term of the characteristic polynomial is $(-1)^n$ times the determinant; the coefficient of the term of degree $n-1$ is minus the sum of the eigenvalues (which is also the trace). That sort of thing? (2012-02-03)
  • Interesting question! Here's a paper which investigates the complexity of computing and verifying the characteristic and minimal polynomials for an integer matrix: http://www.sciencedirect.com/science/article/pii/S0304397502004048 (2012-02-03)
  • ... and the matrix satisfies its minimal polynomial (and characteristic polynomial), i.e. if $P(\lambda) = a_0 + a_1 \lambda + \ldots + a_n \lambda^n$ is the minimal or characteristic polynomial of matrix $A$, then $P(A) = a_0 I + a_1 A + \ldots + a_n A^n = 0$. (2012-02-03)
  • Exactly that sort of thing! So multiplying all the eigenvalues gives the determinant? That's a good trick! Since I compute the minimal polynomial in a way that guarantees it divides the characteristic polynomial, that criterion doesn't really help to detect mistakes in my calculation. What do you mean by the "constant term of the characteristic polynomial", i.e. when the polynomial is not written as a product of linear factors? I haven't understood the last one yet. (2012-02-03)
  • @meinzlein: The characteristic polynomial is a polynomial. If you write it out as $x^n + a_{n-1}x^{n-1}+\cdots + a_0$, then the "constant term" is $a_0$. Since the characteristic polynomial equals $(x-\lambda_1)\cdots(x-\lambda_n)$, where $\lambda_1,\ldots,\lambda_n$ are the eigenvalues (in the algebraic closure of the underlying field), we know that $a_0 = (-1)^n\lambda_1\cdots\lambda_n$ and $a_{n-1} = -(\lambda_1+\cdots+\lambda_n)$. The fact that the determinant equals the product of the eigenvalues and the trace equals their sum follows by thinking about, say, the Jordan canonical form. (2012-02-03)
  • Thank you. I tried to summarize some "tricks" in the original post. Not sure if I've got all the indices right. I need to think about $a_{n-1} = -(\lambda_1+\cdots+\lambda_n)$. Any more *tricks* or *verification tips*? (2012-02-03)
  • @meinzlein: Look up [Vieta's formulas](http://en.wikipedia.org/wiki/Vieta%27s_formulas), or just think about when you get a term of degree $n-1$ when expanding $(x-\lambda_1)\cdots(x-\lambda_n)$: you must be selecting the $x$ from $n-1$ factors, and the constant term from only one. (2012-02-04)
  • One more trick: eigenvectors $u, v$ with different eigenvalues $\mu \neq \lambda$ are linearly independent. Note that $u$ and $v$ are linearly independent iff the $n \times 2$ matrix $[u, v]$ has rank 2. (2013-12-22)

1 Answer