Hi all,
Not sure if it's appropriate to post here, since there's no pypy-user group.
:)
As the subject says, we have a memory leak in a PyPy process, and the
process goes down when it runs out of memory, but only on our production site.
Our simplified environment is as follows:
* OS: Centos 6
* pypy-2.3.1

objgraph seems to be the only profiling library we can use in this
environment, and only part of it works: we can print all objects currently
in memory, but we cannot get any further information such as references
(.getrefcount is not implemented).
As a result, we can only see lots of "int", "str", and "list" objects that
seem to be leaking, without knowing who is holding them or what they are
referencing. :(
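For what it's worth, the type-count view can be reproduced with the stdlib
alone, so it works even where objgraph's reference-tracking features are
unavailable. This is only a sketch of that idea; it assumes gc.get_objects()
is supported on your PyPy build (it is what objgraph itself relies on):

```python
import gc
from collections import Counter

def most_common_types(limit=10):
    """Count live objects by type, similar to objgraph.show_most_common_types().

    Uses only the stdlib, so it can run even where objgraph's
    reference-tracking features are unavailable.
    """
    gc.collect()  # drop collectable garbage so counts reflect live objects
    counts = Counter(type(o).__name__ for o in gc.get_objects())
    return counts.most_common(limit)

for name, count in most_common_types():
    print(name, count)
```

Taking two such snapshots some minutes apart and diffing the counts shows
which types are actually growing, rather than which are merely numerous.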

Our constraints are:
* it's hard to change the production Python runtime, since it might affect
our users
* we cannot reproduce the leak in any other environment

"pmap" produced data only shows memory growing in a [anon] block.

Please advise if there are other tools or methodologies we could use to
attack this problem. Thanks a lot in advance. :)

Best Regards,
Jim(洪懷謙)
_______________________________________________
pypy-dev mailing list
pypy-dev@python.org
https://mail.python.org/mailman/listinfo/pypy-dev
