On 5/13/07, Dave P. Novakovic <[EMAIL PROTECTED]> wrote:

They are very large numbers indeed. Thanks for giving me a wake-up call.
Currently my data is represented as vectors in a vectorset, a typical
sparse representation.

I reduced the problem significantly by removing lots of noise. I'm
basically recording traces of a term's occurrence throughout a corpus
and doing an analysis of the eigenvectors.

I reduced my matrix to 4863 x 4863 by filtering the original corpus.
Now when I attempt an SVD, I get a memory error in the svd routine.
Is there a hard upper limit on the size of a matrix for these
calculations?


I get the same error here with linalg.svd(eye(5000)), and the memory is
indeed exhausted. Hmm, I think it should work, although it is certainly
pushing the limits of what I've got: linalg.svd(eye(1000)) works fine. I
think 4GB should be enough if your memory limits are set high enough.
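
For a rough back-of-the-envelope (my estimate, not a measurement): a single
4863 x 4863 float64 array is already close to 190 MB, and a full dense SVD
has to hold the input plus U, V^T, and LAPACK workspace at the same time, so
the real footprint is several such copies. A minimal sketch of that
arithmetic, where the factor of four is an assumption rather than an exact
count of what LAPACK actually allocates:

    import numpy as np

    n = 4863                                           # size of the filtered term matrix
    bytes_per_float64 = np.dtype(np.float64).itemsize  # 8 bytes

    one_copy = n * n * bytes_per_float64   # one dense n x n array
    # A full SVD keeps the input A, U (n x n), V^T (n x n), and LAPACK work
    # arrays alive at once; four copies is a crude lower bound.
    rough_footprint = 4 * one_copy

    print("one %d x %d float64 array: %.0f MB" % (n, n, one_copy / 1e6))
    print("rough full-SVD footprint:  %.0f MB" % (rough_footprint / 1e6))

So a gigabyte or more is plausible, and a 32-bit process or a low address-space
limit will run out well before 4GB.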

Are you trying some sort of principal components analysis?
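
If it is PCA-style work and only the leading singular vectors matter, a
truncated SVD on the sparse matrix sidesteps the dense allocation entirely.
A minimal sketch, assuming a SciPy build that provides
scipy.sparse.linalg.svds; the random matrix here is just a stand-in for your
real term-occurrence data:

    import numpy as np
    from scipy import sparse
    from scipy.sparse.linalg import svds   # ARPACK-backed truncated SVD

    n = 4863
    # Stand-in for the real term-occurrence matrix: random, ~0.1% nonzero.
    A = sparse.random(n, n, density=0.001, format='csr', dtype=np.float64)

    k = 50                             # number of singular triplets to keep
    U, s, Vt = svds(A, k=k)            # memory scales with n*k, not n*n
    # Note: the ordering of s can vary across SciPy versions; sort if needed.
    print(U.shape, s.shape, Vt.shape)  # (4863, 50) (50,) (50, 4863)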

<snip>

Chuck
