On Thu, Dec 22, 2011 at 10:24 PM, Colin Kern <[email protected]> wrote:
> Hi all,
>
> I have a program that uses the multiprocessing package to run a set of
> jobs. When I run it using the standard python interpreter on my
> computer, top shows 8 threads each using about 40M of memory, which
> remains fairly steady. When I use pypy, both 1.6 and 1.7, the 8
> threads almost immediately show 90-100M of memory usage each, and that
> continues to climb as the program runs. Each job runs a lot faster in
> pypy, but usually before all the jobs are done, the memory on the
> system is exhausted and swapping starts, which brings the execution
> speed to a crawl. Is this something anyone else has experienced?
>
> Thanks,
> Colin

Hi Colin.

Thanks for the bug report, but we can't really help you without seeing
the code. There have been some issues like this in the past, but as far
as we know most of them have been fixed. If you can isolate a
preferably small example, we would be happy to help you.
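
As a rough illustration of the kind of reduction we mean (the job
function, pool size, and workload below are placeholders, not your
code), even something this small would let us watch the workers'
memory in top:

    import multiprocessing

    def job(n):
        # stand-in workload; replace with whatever your jobs actually do
        return sum(i * i for i in xrange(n))

    if __name__ == '__main__':
        pool = multiprocessing.Pool(8)
        # run enough tasks that any per-task memory growth becomes visible
        results = pool.map(job, [100000] * 1000)
        pool.close()
        pool.join()
        print len(results)

If something like that grows without bound under PyPy but stays flat
under CPython, it is exactly the test case we need.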

Cheers,
fijal
_______________________________________________
pypy-dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/pypy-dev
