On Wed, Oct 19, 2011 at 12:57 PM, Antonio Cuni <anto.c...@gmail.com> wrote:
> On 19/10/11 13:42, Antonio Cuni wrote:
>
>> I'm not sure to interpret your sentence correctly.
>> Are you saying that you would still want a pypy+numpy+scipy,
>> even if it ran things slower than CPython? May I ask why?
>
> ah sorry, I think I misunderstood your email.
>
> You would like pypy+numpy+scipy so that you could write fast
> python-only algorithms and still use the existing libraries. I
> suppose this is a perfectly reasonable usecase, and indeed
> the current plan does not focus on this.
I want this too - well, actually pypy+numpy+xxx, where xxx uses bits of
the numpy C API. I don't care if the numpy bits are a *bit* slower under
PyPy than CPython - 100% compatibility is more important to me.

> However, I'd like to underline that to write "fast python-only
> algorithms", you most probably still need a fast numpy in the
> way it is written right now (unless you want to write your
> algorithms without using numpy at all). If we went to the
> slow-but-scipy-compatible approach, any pure python
> algorithm which interfaces with numpy arrays would be
> terribly slow.

I'd be happy with "close to numpy under CPython" speeds for my code
using numpy under PyPy, with fast python-only bits. That covers quite a
lot of use cases, I would think, but if we'd get "terribly slow" for the
numpy-using bits, that is less tempting. Depending on your value of
terrible ;)

Right now the PyPy micronumpy is far too limited to be of real use, even
where I'm using only the Python interface - e.g. there is no
numpy.linalg module: https://bugs.pypy.org/issue915

Peter

_______________________________________________
pypy-dev mailing list
pypy-dev@python.org
http://mail.python.org/mailman/listinfo/pypy-dev
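[Editorial illustration, not part of the thread: a minimal sketch of the kind of pure-Python NumPy usage the issue above blocks. The matrix values are arbitrary examples; the script runs under CPython with NumPy installed, but fails on 2011-era micronumpy at the `np.linalg.solve` call, since `numpy.linalg` does not exist there.]

```python
import numpy as np

# Solve the linear system A x = b - a routine task that needs only the
# Python-level NumPy interface, yet requires the numpy.linalg module.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)  # missing on micronumpy (issue 915)

# Verify the solution by substituting it back into A x.
assert np.allclose(A @ x, b)
print(x)  # [2. 3.]
```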