On 5/12/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote:

Hi,

I have test data of about 75000 x 75000 dimensions. I need to do an SVD,
or at least an eigendecomposition, on this data. My searching suggests
that the linalg functions in scipy and numpy don't work on sparse
matrices.

I can't even get empty((10000,10000), dtype=float) to work (memory
errors, or too many dims). I'm starting to feel like I'm in a bit of
trouble here :)


Umm, big.
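For scale: a dense 75000 x 75000 array of float64 is 75000**2 * 8 bytes,
about 45 GB, before the decomposition routine allocates any workspace, so an
in-core dense SVD is out of reach.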

What do people use to do large SVDs? I'm not averse to using another
lib or wrapping something.


What sort of machine do you have? There are column-iterative methods for the
SVD that resemble Gram-Schmidt orthogonalization and could probably be adapted
to work over the array one column at a time. Are your arrays actually sparse?
Do you only need a few eigenvalues? Are you doing least squares? A more
precise description of the problem might lead to alternative, less demanding
approaches.
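To make the column-iterative idea concrete, here is a minimal sketch of block
power (subspace) iteration with QR re-orthogonalization, with the QR step
playing the Gram-Schmidt role. It touches the matrix only through
matrix-vector products, so it works for anything that can apply A and A.T (a
scipy sparse matrix, a file-backed operator, etc.). The function name,
iteration count, and loop structure are just illustrative, not a tested
implementation:

import numpy as np

def topk_singular(matvec, rmatvec, n, k=10, iters=200, seed=0):
    """Approximate the k largest singular triplets of an operator A,
    given only matvec(x) = A @ x and rmatvec(y) = A.T @ y.

    Block power iteration on A.T A; the QR factorization re-orthogonalizes
    the block each sweep (the Gram-Schmidt step) so the k columns don't
    all collapse onto the dominant singular direction.
    """
    rng = np.random.RandomState(seed)
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))  # random orthonormal start
    for _ in range(iters):
        W = np.column_stack([rmatvec(matvec(V[:, j])) for j in range(k)])  # A.T A V
        V, _ = np.linalg.qr(W)
    B = np.column_stack([matvec(V[:, j]) for j in range(k)])  # A V ~= U diag(s)
    s = np.linalg.norm(B, axis=0)
    U = B / s
    order = np.argsort(s)[::-1]  # sort triplets by descending singular value
    return U[:, order], s[order], V[:, order]

# e.g., with a scipy sparse matrix A, which is never densified:
#   U, s, V = topk_singular(A.dot, A.T.dot, A.shape[1], k=10)

For only a few extreme eigenvalues or singular values, an ARPACK-based sparse
eigensolver does the same job more robustly; scipy has carried an ARPACK
wrapper, though where it lives has varied by release.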

Chuck
