
I was reading this paper http://www.ersavas.com/bulut/projects/algorithms/smo_ge.pdf, which deals with solving linear systems with sparse matrices.

So let's say we have an equation

$Ax = b$, where $A$ is a sparse matrix; then we can solve it using Gaussian elimination.

But first we can reorder $A$ so that it has a smaller bandwidth. So let's say I have the matrix

$$A = \pmatrix{a_{11} & 0 & a_{13} & 0 \cr 0 & a_{22} & a_{23} & a_{24} \cr a_{31} & a_{32} & a_{33} & 0 \cr 0 & a_{42} & 0 & a_{44} \cr}$$

Then if I apply the symmetric permutation $\pi = \{1, 3, 2, 4\}$ (swapping indices 2 and 3), the coefficient matrix of the linear system becomes

$$\pmatrix{a_{11} & a_{13} & 0 & 0 \cr a_{31} & a_{33} & a_{32} & 0 \cr 0 & a_{23} & a_{22} & a_{24} \cr 0 & 0 & a_{42} & a_{44} \cr}$$

But if we actually change the matrix $A$ to the form above using the permutation, is the equation $Ax = b$ still valid?

The paper also mentions reverse Cuthill-McKee ordering, and states that it is more efficient than plain Cuthill-McKee ordering for Gaussian elimination because of the way the elimination is performed (left to right and down to bottom).

I didn't get what is meant by (left to right and down to bottom). Any insights?
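For reference, here is a small sketch of how such a bandwidth-reducing ordering can be computed and applied symmetrically; it uses SciPy's `reverse_cuthill_mckee` on a made-up symmetric $4\times 4$ matrix rather than the example above.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Hypothetical sparse symmetric matrix (not the one from the paper)
A = csr_matrix(np.array([
    [2., 1., 1., 1.],
    [1., 3., 0., 0.],
    [1., 0., 4., 0.],
    [1., 0., 0., 5.],
]))

def bandwidth(M):
    """Largest |i - j| over all nonzero entries M[i, j]."""
    i, j = M.nonzero()
    return int(np.max(np.abs(i - j)))

perm = reverse_cuthill_mckee(A, symmetric_mode=True)  # RCM ordering
A_rcm = A[perm, :][:, perm]                           # symmetric permutation P A P^T

print(perm)                            # order in which RCM relabels the rows/columns
print(bandwidth(A), bandwidth(A_rcm))  # bandwidth before vs. after (smaller after reordering in this example)
```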

1 Answer


If you do row operations to get from a matrix $A$ to a matrix $B$ (applying the same operations to the right-hand side, as in Gaussian elimination), then the solutions of $Bx=b$ will be exactly the same as the solutions of $Ax=b$. But in your example it appears that you are permuting columns as well. Each column corresponds to an unknown, so if you do column switches to get from a matrix $A$ to a matrix $B$, the solutions of $Bx=b$ will be the solutions of $Ay=c$, where you get $y$ from $x$ and $c$ from $b$ by applying the same permutation to their entries that you applied to the columns (and rows) of $A$.
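To make this concrete, here is a small numerical check (with made-up numbers, using NumPy) that a symmetric permutation $PAP^T$ of the kind in your example leaves the system intact once the right-hand side and the unknowns are reordered by the same permutation:

```python
import numpy as np

# Made-up 4x4 system; the permutation swaps unknowns 2 and 3 (0-based indices 1 and 2)
A = np.array([[4., 0., 1., 0.],
              [0., 3., 1., 1.],
              [1., 1., 5., 0.],
              [0., 1., 0., 2.]])
b = np.array([1., 2., 3., 4.])
perm = np.array([0, 2, 1, 3])

x = np.linalg.solve(A, b)        # solution of the original system A x = b

B = A[np.ix_(perm, perm)]        # symmetrically permuted matrix P A P^T
y = np.linalg.solve(B, b[perm])  # solve with the right-hand side permuted the same way

print(np.allclose(y, x[perm]))   # True: y is just x with its entries reordered
```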

The easiest way to see this is to try it; see how the solutions of, say, $\pmatrix{1&0&0\cr0&2&0\cr0&0&3\cr}\pmatrix{x\cr y\cr z\cr}=\pmatrix{a\cr b\cr c\cr}$ compare to those of $\pmatrix{0&0&1\cr2&0&0\cr0&3&0\cr}\pmatrix{u\cr v\cr w\cr}=\pmatrix{d\cr e\cr f\cr}$.
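For instance, taking $d=a$, $e=b$, $f=c$ so that both systems share the same right-hand side, a quick NumPy check (with an arbitrary right-hand side of my choosing) shows that the solution components simply get shuffled:

```python
import numpy as np

A = np.diag([1., 2., 3.])
B = np.array([[0., 0., 1.],
              [2., 0., 0.],
              [0., 3., 0.]])    # columns of A taken in the order 2, 3, 1

rhs = np.array([6., 10., 12.])  # arbitrary right-hand side, shared by both systems
xyz = np.linalg.solve(A, rhs)   # (x, y, z)
uvw = np.linalg.solve(B, rhs)   # (u, v, w)

print(xyz)                      # [6. 5. 4.]
print(uvw)                      # [5. 4. 6.]  -> (u, v, w) = (y, z, x)
```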

Concerning the left-to-right, down-to-bottom, I haven't a clue, though I wonder whether down-to-bottom isn't supposed to be top-to-bottom.