9

Let's say I have a $2 \times 2$ matrix (actually the structure tensor of a discrete image $I$): $ \begin{bmatrix} \frac{\partial I}{\partial x}\frac{\partial I}{\partial x} & \frac{\partial I}{\partial x}\frac{\partial I}{\partial y} \\ \frac{\partial I}{\partial y}\frac{\partial I}{\partial x} & \frac{\partial I}{\partial y}\frac{\partial I}{\partial y} \end{bmatrix}$

It has 2 properties:

  1. Symmetric.
  2. Positive Semidefinite.

Given those properties, what would be the easiest method to numerically compute its eigenvectors (orthogonal) and eigenvalues?
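
For concreteness, a minimal Wolfram Language sketch of one way these per-pixel entries could be formed (the test image, the central differences and the clamped borders are illustrative choices, not part of the question):

img = Table[N[Sin[x/5.] Cos[y/7.]], {y, 32}, {x, 32}];  (* stand-in for the discrete image I *)
{h, w} = Dimensions[img];
ix = Table[(img[[r, Min[c + 1, w]]] - img[[r, Max[c - 1, 1]]])/2., {r, h}, {c, w}];  (* dI/dx, central difference *)
iy = Table[(img[[Min[r + 1, h], c]] - img[[Max[r - 1, 1], c]])/2., {r, h}, {c, w}];  (* dI/dy *)
structureTensor[r_, c_] := {{ix[[r, c]]^2,           ix[[r, c]] iy[[r, c]]},
                            {ix[[r, c]] iy[[r, c]],  iy[[r, c]]^2}}  (* symmetric, positive semidefinite *)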


3 Answers

9

For a two-by-two matrix, a single Jacobi rotation in fact gives the answer at once!

It is known that given a two-by-two symmetric matrix $\mathbf A$, one can construct an orthogonal matrix $\mathbf V=\bigl(\begin{smallmatrix}c&s\\-s&c\end{smallmatrix}\bigr)$ such that $\mathbf V^\top \mathbf A\mathbf V$ is diagonal, where the two numbers $c$ and $s$ satisfy $c^2+s^2=1$.

If the off-diagonal elements are not zero (why?), computing $c$ and $s$ can be done like so:

$\begin{align*} \tau&=\frac{a_{22}-a_{11}}{2a_{12}}\\ t&=\frac{\mathrm{sgn}(\tau)}{|\tau|+\sqrt{1+\tau^2}}\\ c&=\frac1{\sqrt{1+t^2}}\\ s&=ct \end{align*}$

from which the eigendecomposition easily follows: the eigenvalues are $\lambda_1 = a_{11} - t\,a_{12}$ and $\lambda_2 = a_{22} + t\,a_{12}$, and the corresponding orthonormal eigenvectors are the columns $(c,-s)^\top$ and $(s,c)^\top$ of $\mathbf V$.

(See Golub and Van Loan's excellent book *Matrix Computations* for further details.)


Here is a Mathematica demonstration of Jacobi's method:

a = N[{{3, -1}, {-1, 2}}, 20];
tau = (a[[2, 2]] - a[[1, 1]])/a[[1, 2]]/2;
t = Sign[tau]/(Abs[tau] + Sqrt[1 + tau^2]);
{c, s} = {1, t}/Sqrt[1 + t^2]
   {0.85065080835203993218, 0.5257311121191336060}
{l1, l2} = {a[[1, 1]] - t a[[1, 2]], a[[2, 2]] + t a[[1, 2]]}
   {3.6180339887498948482, 1.3819660112501051518}
Eigenvalues[a] == {l1, l2}
   True
Eigenvectors[a] == {{c, -s}, {s, c}}
   True
a.{c, -s} == l1 {c, -s}
   True
a.{s, c} == l2 {s, c}
   True
  • 0
    @Drazick, "gets one of the eigenvectors in the wrong direction (multiplied by 1)." - you are aware that if $\mathbf x$ is an eigenvector of $\mathbf A$, then any nonzero multiple of $\mathbf x$ is also an eigenvector?2015-05-01
6

Since you work with a $2\times2$ matrix, the corresponding characteristic polynomial is quadratic, so the eigenvalues can be expressed in closed form in terms of the matrix elements. Once you have the eigenvalues, all that remains is to solve two systems of linear equations in two unknowns (the unknowns being the coordinates of the corresponding eigenvector in the standard basis). Again, everything is nice and explicit; there is no need to apply a fancy numerical method.
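
A minimal Wolfram Language sketch of this closed-form route (the helper name eig2x2, the ordering of the eigenvalues and the handling of the already-diagonal case are illustrative choices, not part of the answer):

eig2x2[{{a_, b_}, {b_, c_}}] := Module[{d, l1, l2, v1, v2},
  d = Sqrt[(a - c)^2 + 4 b^2];                  (* discriminant of l^2 - (a + c) l + (a c - b^2) *)
  {l1, l2} = {(a + c + d)/2, (a + c - d)/2};    (* the two roots, larger first *)
  If[b == 0,
    {v1, v2} = If[a >= c, {{1, 0}, {0, 1}}, {{0, 1}, {1, 0}}],   (* already diagonal *)
    {v1, v2} = {Normalize[{b, l1 - a}], Normalize[{b, l2 - a}]}  (* solve (M - l I).v == 0 *)
  ];
  {{l1, l2}, {v1, v2}}]

eig2x2[N[{{3, -1}, {-1, 2}}]]  (* eigenvalues {3.61803, 1.38197}; the second list holds the unit eigenvectors *)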

  • 3
    @Drazick: solving a two-by-two homogeneous linear system *is* straightforward. (2010-11-02)
5

Despite other answers, I thought it might benefit the impatient to see the explicit answer below.

Let $M = \left( \begin{array}{cc} a & b \\ b & c \end{array} \right)$ be the input matrix.

Define the discriminant: $\Delta = \sqrt{a^2+4 b^2-2 a c+c^2}$

Then, the eigenvalues of $M$ are given by:

$\lambda_1 = 0.5(a+c-\Delta)$ and $\lambda_2 = 0.5(a+c+\Delta)$

Now, you can find a matrix $V$ such that $M = V^{-1} \begin{pmatrix} \lambda_1 & 0\\ 0 & \lambda_2 \end{pmatrix}V.$

Mathematica says that the matrix $V$ is given by $ V = \begin{pmatrix} \frac{a-c-\Delta}{2b} & 1\\ \frac{a-c+\Delta}{2b} & 1 \end{pmatrix} $

If you are looking for an orthogonal $V$, the above calculation needs only a small change: normalize each row of $V$ to unit length (the two rows are already orthogonal to each other whenever $b \neq 0$), as in the sketch below.
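
A short Wolfram Language check of the above for a concrete matrix, under the assumption $b \neq 0$ (the symbol names and the example values are illustrative, not part of the answer):

{a, b, c} = {3., -1., 2.};                       (* example entries of a symmetric M *)
m = {{a, b}, {b, c}};
d = Sqrt[a^2 + 4 b^2 - 2 a c + c^2];             (* the discriminant Delta above *)
{l1, l2} = {(a + c - d)/2, (a + c + d)/2};
v = {{(a - c - d)/(2 b), 1}, {(a - c + d)/(2 b), 1}};
Chop[Inverse[v].DiagonalMatrix[{l1, l2}].v - m]  (* {{0, 0}, {0, 0}}: V^-1 diag(l1, l2) V reproduces M *)
vOrth = Normalize /@ v;                          (* unit-length rows give the orthogonal version of V *)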

  • 0
    @Drazick: You can use Maxima instead of Mathematica for symbolic solving. (2011-04-01)