Hi all,

I have a program that uses the multiprocessing package to run a set of
jobs. When I run it under the standard CPython interpreter on my
computer, top shows 8 worker processes each using about 40 MB of
memory, and that stays fairly steady. Under PyPy, both 1.6 and 1.7,
the 8 processes almost immediately reach 90-100 MB each, and their
memory use keeps climbing as the program runs. Each job runs much
faster under PyPy, but usually before all the jobs are done, the
system's memory is exhausted and swapping starts, which brings
execution speed to a crawl. Has anyone else experienced this?
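
For reference, here is a minimal sketch of the kind of setup I mean.
The run_job body is just a hypothetical stand-in for the real work;
the actual jobs each build and process a moderate amount of data:

    import multiprocessing

    def run_job(job_id):
        # Stand-in for the real per-job work (hypothetical; the real
        # jobs allocate and process a moderate amount of data).
        data = [i * i for i in range(100000)]
        return sum(data)

    if __name__ == "__main__":
        pool = multiprocessing.Pool(processes=8)  # 8 workers, as top shows
        results = pool.map(run_job, range(100))   # the set of jobs
        pool.close()
        pool.join()
        print("%d jobs finished" % len(results))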

Thanks,
Colin