
While computing $A=LU$ for a matrix, if a zero is encountered in a pivot position, then a row exchange is required.

However, if the $PA=LU$ form is used instead, then no row exchange is needed during elimination, and apparently this also requires less computation.

What I am not able to understand is how finding the correct permutation matrix involves less effort than doing a row exchange during the $A=LU$ process.

Edit:

The matrix $PA$ already has all of its rows in the correct order. What I am not able to understand is why the $PA=LU$ computation is going to be better than the $A=LU$ computation.

  • Maybe you are referring to stability problems in actual computations that are alleviated by a technique called *pivoting*. This technique corresponds to the $PA=LU$ factorization. In short, doing an appropriate row permutation at each step of the factorization algorithm reduces the amount of error due to cancellation. (2011-09-20)

1 Answer


Not sure if I understand your point. The purpose of a permutation matrix is exactly to do the row exchange for you. So consider $\bf PA = LU$ the more general form of $\bf A = LU$, in that it also takes care of row exchanges when they are needed.

Let's recap for my own sake: By performing some row operations on $\bf A$ (a process called elimination), you want to end up with $\bf U$. You can represent each row operation by a separate matrix, so say you need two row operations, $\bf E_1$ followed by $\bf E_2$, before you get $\bf U$. Then you have $\bf E_2E_1A = U \Rightarrow L = (E_2E_1)^{-1} = E_1^{-1}E_2^{-1}$.
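As a concrete sketch (the $3 \times 3$ matrix and the two elimination steps below are my own illustrative numbers, not from the question), NumPy can verify that $\bf L = E_1^{-1}E_2^{-1}$ reassembles $\bf A$:

```python
import numpy as np

# Hypothetical 3x3 example in which exactly two elimination steps suffice.
A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [0., 1., 2.]])

E1 = np.array([[ 1., 0., 0.],
               [-2., 1., 0.],
               [ 0., 0., 1.]])   # row 2 -= 2 * row 1

E2 = np.array([[1.,  0., 0.],
               [0.,  1., 0.],
               [0., -1., 1.]])   # row 3 -= 1 * row 2

U = E2 @ E1 @ A                  # upper triangular
L = np.linalg.inv(E1) @ np.linalg.inv(E2)

print(np.allclose(L @ U, A))     # True
```

Note the order: inverting $\bf E_2E_1$ reverses the factors, so multiplying `inv(E1) @ inv(E2)` in that order is what reproduces $\bf A$.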

But the neat thing is that you don't need matrix inversion to find these inverses. Say, for example, that $\bf A$ is $2 \times 2$ and $\bf E_1$ is the operation "subtract 2 times row 1 from row 2"; then $\bf E_1^{-1}$ is just "add 2 times row 1 to row 2". In matrix language:

$ {\bf E_1} = \begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix} \Rightarrow {\bf E_1^{-1} = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix} } $

This translates into computational ease, because all you have to do is flip the sign of the off-diagonal entry to get the inverse.
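That sign-flip claim is easy to check numerically; a minimal sketch using the same $2 \times 2$ $\bf E_1$ as above:

```python
import numpy as np

E1 = np.array([[ 1., 0.],
               [-2., 1.]])   # subtract 2 * row 1 from row 2

E1_inv = np.array([[1., 0.],
                   [2., 1.]])  # same matrix with the off-diagonal sign flipped

print(np.allclose(E1 @ E1_inv, np.eye(2)))  # True
```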

Now on to permutation matrices. A permutation matrix that performs a single row exchange is just an identity matrix with two of its rows exchanged. And even more conveniently, such a matrix is its own inverse (because you get the original matrix back once you reapply the row exchange a second time), i.e. $\bf PPA = IA = A \Rightarrow P^{-1} = P$. (A general permutation matrix may reorder several rows; its inverse is then its transpose, $\bf P^{-1} = P^T$.) So if row exchanges are needed, we add the additional step and write $\bf PA = LU$. Once again, a computational cakewalk.
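A minimal sketch of that last point (the $2 \times 2$ matrix below is my own example, chosen so that the first pivot is zero and a row exchange is unavoidable):

```python
import numpy as np

A = np.array([[0., 1.],
              [1., 2.]])   # zero in the first pivot position

P = np.array([[0., 1.],
              [1., 0.]])   # exchange rows 1 and 2

PA = P @ A                 # rows now in workable order for elimination
print(np.allclose(P @ P, np.eye(2)))  # True: P is its own inverse
```

Here $\bf PA$ happens to come out upper triangular immediately, so $\bf L = I$. Library routines such as `scipy.linalg.lu` apply the same idea, choosing the row permutation automatically (partial pivoting) for numerical stability.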