Re: [Numpy-discussion] GPU implementation?

2007-06-03 Thread Dave P. Novakovic
This may be of interest: LLVM support in Mesa, and I believe there is work going on with LLVM and Python in the PyPy camp. http://zrusin.blogspot.com/2007/05/mesa-and-llvm.html I just stumbled on this page while this conversation was happening :) Dave On 6/2/07, Bob Lewis [EMAIL PROTECTED]

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Dave P. Novakovic
-packages/numpy/linalg/linalg.py, line 575, in svd vt = zeros((n, nvt), t) MemoryError Cheers Dave On 5/13/07, Anne Archibald [EMAIL PROTECTED] wrote: On 12/05/07, Dave P. Novakovic [EMAIL PROTECTED] wrote: core 2 duo with 4gb RAM. I've heard about iterative svd functions. I actually
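The "iterative svd functions" raised in this thread refer to methods that recover only the top k singular triplets without ever forming the full decomposition that `numpy.linalg.svd` attempts (and that triggers the MemoryError above). A minimal randomized-projection sketch using only NumPy; the function name, oversampling margin, and iteration count below are illustrative, not from the original thread:

```python
import numpy as np

def truncated_svd(A, k, n_iter=7, oversample=5, seed=0):
    """Approximate top-k SVD via randomized range finding.

    Only (m x (k+oversample)) working arrays are allocated, so the
    full (m x n) factorization is never materialized.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Random sketch of the column space of A.
    Y = A @ rng.standard_normal((n, k + oversample))
    # Power iterations sharpen the subspace estimate; QR keeps it stable.
    for _ in range(n_iter):
        Y, _ = np.linalg.qr(A @ (A.T @ Y))
    Q, _ = np.linalg.qr(Y)
    # Project A into the small subspace and SVD the small matrix.
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]
```

For a genuinely low-rank matrix this recovers the leading factors almost exactly; for full-rank data it gives the standard randomized-SVD approximation.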

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Dave P. Novakovic
Are you trying some sort of principal components analysis? PCA is indeed one part of the research I'm doing. Dave ___ Numpy-discussion mailing list Numpy-discussion@scipy.org http://projects.scipy.org/mailman/listinfo/numpy-discussion
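For reference, the PCA being discussed reduces to an SVD of the centered data matrix, which is exactly why the SVD memory limits above matter. A minimal dense sketch of that relationship (function and variable names are illustrative):

```python
import numpy as np

def pca(X, k):
    """Top-k principal components via SVD of the centered data.

    Rows of the returned `components` are the principal axes;
    `scores` are the samples projected onto those axes.
    """
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                 # (k, n_features)
    scores = U[:, :k] * s[:k]           # (n_samples, k)
    explained_var = s[:k] ** 2 / (len(X) - 1)
    return components, scores, explained_var
```

On data too large for a dense SVD, the same decomposition would be computed with an iterative/truncated solver rather than `np.linalg.svd`.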

Re: [Numpy-discussion] very large matrices.

2007-05-13 Thread Dave P. Novakovic
to build a space around it.) Cheers Dave On 5/14/07, Charles R Harris [EMAIL PROTECTED] wrote: On 5/13/07, Dave P. Novakovic [EMAIL PROTECTED] wrote: Are you trying some sort of principal components analysis? PCA is indeed one part of the research I'm doing. I had the impression you

[Numpy-discussion] very large matrices.

2007-05-12 Thread Dave P. Novakovic
Hi, I have test data of about 75000 x 75000 dimensions. I need to do an SVD, or at least an eigendecomposition, on this data. My searching suggests that the linalg functions in scipy and numpy don't work on sparse matrices. I can't even get empty((1,1),dtype=float) to work (memory errors, or
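For context on why the allocation fails on a 4 GB machine: a single dense 75000 x 75000 float64 array needs roughly 42 GiB before any factorization even starts, and SVD routines allocate several working copies on top of that. A back-of-envelope check:

```python
import numpy as np

n = 75_000
# 8 bytes per float64 element for one dense copy of the matrix.
bytes_needed = n * n * np.dtype(np.float64).itemsize
gib = bytes_needed / 2**30   # ~41.9 GiB, an order of magnitude over 4 GB RAM
```

This is why the thread's suggestions converge on sparse storage and iterative eigensolvers rather than dense `linalg` calls.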