Hello all,

Great work on pypy! I've had good luck with pypy generally, but on a program that loads a very large data set I'm getting a GC-related exception:

----
loading reads, on record 25000000
RPython traceback:
  File "translator_goal_targetpypystandalone.c", line 888, in entry_point
  File "interpreter_function.c", line 876, in funccall__star_1
  File "interpreter_function.c", line 905, in funccall__star_1
File "rpython_memory_gc_minimark.c", line 2490, in MiniMarkGC_collect_and_reserve File "rpython_memory_gc_minimark.c", line 2193, in MiniMarkGC_minor_collection File "rpython_memory_gc_minimark.c", line 4535, in MiniMarkGC_collect_oldrefs_to_nursery
  File "rpython_memory_gc_base.c", line 1761, in trace___trace_drag_out
File "rpython_memory_gc_minimarkpage.c", line 214, in ArenaCollection_malloc File "rpython_memory_gc_minimarkpage.c", line 536, in ArenaCollection_allocate_new_page File "rpython_memory_gc_minimarkpage.c", line 735, in ArenaCollection_allocate_new_arena
Fatal RPython error: MemoryError
Aborted
----

This is pypy 1.7.0 from ppa.launchpad.net on x86_64 Ubuntu 11.10. The data being loaded exceeds the size of physical memory, but there is plenty of swap space.

The same program works with cpython.
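
In case it is useful, here is a stripped-down sketch of what the loading phase does (the names and file format below are placeholders, not the real code; the point is just that every record is kept in memory at once):

----
# hypothetical reduction of the loading loop, showing the allocation pattern
reads = {}
with open("reads.txt") as f:
    for i, line in enumerate(f):
        read_id, seq = line.rstrip("\n").split("\t", 1)
        reads[read_id] = seq  # all records are held in memory for later lookup
        if i % 1000000 == 0:
            print "loading reads, on record", i
----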

I wanted to try forcing pypy to use a different GC, but haven't figured out how to do that yet. Apparently the GC can't be selected from the pypy command line, and my attempts so far to build with translate.py and an alternate GC haven't worked; roughly what I tried is shown below.
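
This is approximately the translation command I've been trying (the option spelling and the GC name are from memory, so they may well be wrong, which could itself be the problem):

----
# hypothetical invocation, run from pypy/translator/goal in the 1.7 source tree
python translate.py --opt=jit --gc=semispace targetpypystandalone.py
----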

Any suggestions appreciated.

Rich

