I have a fairly large matrix, roughly a billion rows by 500,000 columns, and I need to compute its singular value decomposition (SVD). The problem is that my computer's RAM cannot hold the whole matrix at once, so I need an incremental approach to computing the SVD. That would mean taking one row, or a few, or a couple hundred/thousand rows (not too many) at a time, doing what I need to do with those numbers, and then discarding them so the memory can be used for the rest of the data.
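For reference, here is a rough sketch of the kind of streaming update I have in mind, assuming NumPy. The `row_blocks` generator and the target rank `k` are placeholders for however the data actually gets read in, and this simplified update only keeps an approximation of the top-k singular values and right singular vectors, not the left factor U:

    import numpy as np

    def incremental_svd(row_blocks, k):
        """Stream the matrix one block of rows at a time and maintain
        a rank-k summary (singular values S and right vectors Vt)."""
        S, Vt = None, None
        for block in row_blocks:              # each block: (m, n) ndarray
            if Vt is None:
                stacked = block
            else:
                # Fold the new rows into the current rank-k summary:
                # stack diag(S) @ Vt on top of the incoming block.
                stacked = np.vstack([S[:, None] * Vt, block])
            _, S, Vt = np.linalg.svd(stacked, full_matrices=False)
            S, Vt = S[:k], Vt[:k]             # truncate back to rank k
        return S, Vt

Each SVD here is only over a (k + block_size) x n matrix, so memory stays bounded by the block size, but I don't know whether this kind of repeated truncation loses too much accuracy for my case.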
People have posted a couple of papers on similar problems, such as http://www.bradblock.com/Incremental_singular_value_decomposition_of_uncertain_data_with_missing_values.pdf and http://www.jofcis.com/publishedpapers/2012_8_8_3207_3214.pdf.
I am wondering if anyone has done previous research on this or has suggestions on how I should approach it. I really do need the FASTEST approach that doesn't lose too much accuracy in the data.