Following up on the previous question, we want to perform a numerical approximation of the singular value decomposition $\mathbf A=\mathbf U\mathbf \Sigma\mathbf V^\top$. But we can operate only on individual matrix elements such as $a_{i,j}$, $u_{n,k}$, $v_{m,\ell}$, etc. (as we do in most programming languages). What should our step-by-step algorithm for performing the SVD be?
What is the numerical step-by-step logic of the SVD (singular value decomposition)?
3 Answers
Hmm, I do not quite understand what you mean by "step-by-step algorithm", so this answer might be useless, anyway.
Before the more complicated, elaborate (and also optimized) approaches to SVD, as detailed in @lhf's Wikipedia link, there is a conceptually simple method: Jacobi rotation to principal-components position along the rows and then along the columns (which, of course, must be iterated until acceptable convergence, like all other methods).
These rotations can be implemented very easily and are in general numerically stable: a pair of rows is rotated by an angle whose criterion can again be determined easily (requiring no more than the sums of squares and cross-products of the current $a_i$'s and $a_j$'s under iteration). Having accumulated the rotations in the unitary ("rotation") matrix U, the same process is performed over the columns, giving another rotation matrix V. (If the original matrix M is symmetric, U and V are transposes/inverses of each other and we have the diagonalization of M.) An element-wise sketch of this idea is given below.
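Since the question asks for an element-wise formulation, here is a minimal sketch in Python of the closely related one-sided (Hestenes) Jacobi variant, which orthogonalizes pairs of *columns* with the same kind of plane rotations; the function name `jacobi_svd`, the tolerance, and the sweep limit are my own choices for illustration, not anything from MatMate:

```python
import math

def jacobi_svd(A, tol=1e-12, max_sweeps=30):
    """One-sided (Hestenes) Jacobi SVD: rotate pairs of columns of A until
    they are mutually orthogonal.  A is an m-by-n list of lists with m >= n;
    returns U, s, V with A = U * diag(s) * V^T, using only element-wise ops."""
    m, n = len(A), len(A[0])
    U = [row[:] for row in A]              # working copy; becomes U * diag(s)
    V = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(max_sweeps):
        off = 0.0
        for p in range(n - 1):
            for q in range(p + 1, n):
                # sums of squares and cross-product of columns p and q
                alpha = sum(U[i][p] * U[i][p] for i in range(m))
                beta  = sum(U[i][q] * U[i][q] for i in range(m))
                gamma = sum(U[i][p] * U[i][q] for i in range(m))
                off = max(off, abs(gamma))
                if abs(gamma) <= tol * math.sqrt(alpha * beta):
                    continue                 # already orthogonal enough
                # stable rotation angle that zeroes the cross-product
                zeta = (beta - alpha) / (2.0 * gamma)
                t = math.copysign(1.0, zeta) / (abs(zeta) + math.hypot(1.0, zeta))
                c = 1.0 / math.hypot(1.0, t)
                s = c * t
                for i in range(m):           # rotate columns p, q of U
                    U[i][p], U[i][q] = c * U[i][p] - s * U[i][q], s * U[i][p] + c * U[i][q]
                for i in range(n):           # accumulate the rotation in V
                    V[i][p], V[i][q] = c * V[i][p] - s * V[i][q], s * V[i][p] + c * V[i][q]
        if off <= tol:
            break
    # singular values are the norms of the orthogonalized columns
    s_vals = [math.sqrt(sum(U[i][j] ** 2 for i in range(m))) for j in range(n)]
    for j in range(n):
        if s_vals[j] > 0.0:
            for i in range(m):
                U[i][j] /= s_vals[j]
    return U, s_vals, V
```

As a quick check, with `A = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]`, multiplying `U * diag(s) * V^T` back together reproduces `A` to machine precision; note that Jacobi methods leave the singular values unsorted, so sort them afterwards if needed.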
With the current speed of computers, one can handle matrices of a few dozen rows/columns in a second; an 80x100 matrix M needed about 2 seconds. Using the implementation in my MatMate program I just write:
```
n,m = 80,100
M = randomn(n,m)         // generate a random matrix of n rows, m columns
U = gettrans(M',"pca")'  // get the row-rotation matrix for rowwise PC-position
V = gettrans(M ,"pca")   // get the col-rotation matrix for colwise PC-position
D = U * M * V            // get the quasidiagonal matrix D
```
This took at most two seconds (a guess).
The "gettrans"-procedure does simply pairwise rotations of all rows (resp cols) and iterates until convergence, keeping track of the rotations in a matrix T which is then returned as result of the procedure.
As said: the procedure is in no way optimal in time and space consumption; the time consumption is at least cubic in the matrix size. With a 200x100 matrix I needed one minute for the same computation on a random matrix M, so it is advisable to use it only for such small matrices. It may also fail to converge quickly if the rowwise or columnwise PC positions are not unique because of equal eigenvalues in M*M' or M'*M (the apostrophe meaning transposition), and thus it should be modified and refined to cover the "unfriendly" cases as well. But it is, in my view, conceptually straightforward and very simple to implement as a starter (and also numerically stable), so it is, again in my view, a good example for self-study of the method.
[Update:] A time-saving preconditioning step, as mentioned in the first of the two papers linked by @J.M. (here again), can easily be inserted into the MatMate procedure:
```
r,c = 100,200
M = randomn(r,c)            // generate a random matrix of r rows, c columns

// preconditioning
Q  = gettrans(M,"drei")     // get the col-rotation matrix for lower-triangular
                            // position; this is also a QR-decomposition
M1 = (M*Q)[1..r,1..r]       // r is the upper limit for the rank;
                            // for the SVD use this submatrix only

// SVD on the preconditioned matrix
U = gettrans(M1 ,"pca")     // get the col-rotation matrix for colwise PC-position
V = gettrans(M1',"pca")'    // get the row-rotation matrix for rowwise PC-position
D = V * M1 * U              // get the quasidiagonal matrix D of lower rank

// if the full solution for the rotation matrices is needed, proceed as follows:
// to get the full SVD, Q and U must be combined, but they are of different size,
// so U must be extended and the new diagonal elements set to 1
U1 = einh(c)                // c x c (here 200x200) unit matrix
U1[1..r,1..r] = U           // insert the r x r (here 100x100) rotation matrix
U  = Q*U1                   // the full 200x200 SVD-rotation matrix
```
Time without preconditioning: ~ 60 secs
Time with preconditioning: ~ 8 secs
[/update]
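The same preconditioning trick can be sketched outside MatMate as well. Here is a hedged Python version, assuming the `jacobi_svd` sketch from earlier is in scope, using `numpy.linalg.qr` in place of `gettrans(M,"drei")` and the transposed (tall) orientation that QR routines conventionally expect; the name `preconditioned_svd` is mine:

```python
import numpy as np

def preconditioned_svd(A):
    """Sketch of QR preconditioning for an m x n matrix A with m >= n:
    reduce A to its small n x n triangular factor first, run the Jacobi
    sweeps only on that factor, then extend U back to full size."""
    m, n = A.shape
    Q, R = np.linalg.qr(A)                   # A = Q R, with R n x n
    U_small, s, V = jacobi_svd(R.tolist())   # Jacobi SVD on the small factor
    U = Q @ np.array(U_small)                # A = (Q U_small) diag(s) V^T
    return U, np.array(s), np.array(V)
```

The speedup reported above comes from running the iterative sweeps on the small square factor instead of the full rectangular matrix.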
-
Great answer... finding out how [MatMate](http://go.helms-net.de/sw/matmate/) works was fun... =) But it works! =) – 2011-10-26
If you absolutely, genuinely, really, truly need to implement the singular value decomposition yourself (which I have already advised you not to do, based on my reading of you), you will want to see the ALGOL code in Golub and Reinsch's classic paper. (Alternatively, see the EISPACK implementation.) There have been a lot of improvements to the basic two-part algorithm (bidiagonalize, then apply a modified QR algorithm to the resulting bidiagonal matrix) since then, but not knowing your background in numerical linear algebra, I'll refrain from mentioning them for the time being.
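To give a feel for the first of those two stages, here is a hedged numpy sketch of Householder (Golub-Kahan) bidiagonalization; the name `bidiagonalize` is mine, and the second stage, the implicit-shift QR iteration on the bidiagonal matrix, is precisely the delicate part for which the Golub and Reinsch paper is worth reading:

```python
import numpy as np

def bidiagonalize(A):
    """Golub-Kahan bidiagonalization by Householder reflections:
    returns U, B, V with A = U @ B @ V.T and B upper-bidiagonal."""
    m, n = A.shape
    B = A.astype(float)
    U, V = np.eye(m), np.eye(n)
    for k in range(n):
        # left reflection: zero out B[k+1:, k] below the diagonal
        v = B[k:, k].copy()
        v[0] += np.copysign(np.linalg.norm(v), v[0])
        nv = np.linalg.norm(v)
        if nv > 0.0:
            v /= nv
            B[k:, :] -= 2.0 * np.outer(v, v @ B[k:, :])
            U[:, k:] -= 2.0 * np.outer(U[:, k:] @ v, v)
        if k < n - 2:
            # right reflection: zero out B[k, k+2:] past the superdiagonal
            v = B[k, k+1:].copy()
            v[0] += np.copysign(np.linalg.norm(v), v[0])
            nv = np.linalg.norm(v)
            if nv > 0.0:
                v /= nv
                B[:, k+1:] -= 2.0 * np.outer(B[:, k+1:] @ v, v)
                V[:, k+1:] -= 2.0 * np.outer(V[:, k+1:] @ v, v)
    return U, B, V
```

For a random tall matrix, `U @ B @ V.T` reproduces `A`, and `B` ends up with nonzeros only on its diagonal and first superdiagonal, ready for the bidiagonal QR stage.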
There is no finite algorithm for the SVD, only iterative numerical approximations: the singular values of $\mathbf A$ are the square roots of the eigenvalues of $\mathbf A^\top\mathbf A$, i.e. roots of its characteristic polynomial, and for degree greater than 4 no closed-form solution exists. See http://en.wikipedia.org/wiki/Singular_value_decomposition#Calculating_the_SVD.
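In practice, therefore, one calls a library routine that wraps one of these iterative methods; a minimal Python illustration (the shapes just mirror the 80x100 example from the other answer):

```python
import numpy as np

A = np.random.randn(80, 100)
# numpy delegates to LAPACK's iterative SVD driver under the hood
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)   # A = U * diag(s) * V^T
```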
-
Maybe if I give an inkling of how complicated the actual business is, I can scare the OP... :) – 2011-10-25