Re: [pypy-dev] RFC: draft idea for making for loops automatically close iterators
That's a good point, as it means there's probably no safe & portable way to ensure that kind of stuff. "Trying to collect" something doesn't really fall short of an actual collection, I believe (finding referrers is hard). But I believe iterclose() defined appropriately on derived iterators would solve that?

> On 21 Oct 2016, at 17:13, hubo wrote:
>
> Well, I'm really shocked to find out that what I thought was an "automatic
> close" is really the ref-counting GC of CPython, which means that a lot of my
> code breaks in PyPy...
> It really becomes a big problem now that iterators are heavily used in
> Python. Some builtin functions like zip, map and filter return iterators in
> Python 3 instead of the lists they returned in Python 2, which means
> invisible bugs for code ported from Python 2: zip(my_generator(),
> my_other_generator()) may leave the iterators open if the for loop is exited
> early. Even in Python 2, functions in itertools can create these bugs.
> In CPython this kind of code works because of the ref-counting GC, so the
> problem is not obvious there, but it breaks in PyPy.
>
> I'm wondering, since a ref-counting GC implementation is not possible for
> PyPy, is it possible to hack on the for loop to make it "try to" collect the
> generator? That may really save a lot of lives. If the generator is still
> referenced after the for loop, it may be the programmer's fault for not
> calling close(), but looping through a returned value is something different -
> sometimes you do not even know whether it is a generator.
>
> 2016-10-21
> hubo

___
pypy-dev mailing list
pypy-dev@python.org
https://mail.python.org/mailman/listinfo/pypy-dev
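The iterclose() idea discussed above can already be approximated today with a small wrapper that guarantees close() is called when the loop ends, whether it exits normally, via break, or via an exception. A minimal sketch (the `closing_iter` name is hypothetical, not an existing API):

```python
from contextlib import contextmanager

@contextmanager
def closing_iter(it):
    """Yield the iterator, then call close() on it (if it has one) on exit."""
    try:
        yield it
    finally:
        close = getattr(it, "close", None)
        if close is not None:
            close()

closed = []

def gen():
    try:
        yield 1
        yield 2
    finally:
        closed.append(True)   # cleanup we want to run deterministically

with closing_iter(gen()) as it:
    for x in it:
        break   # early exit; the wrapper still closes the generator
```

After the with-block, `closed` is `[True]` on CPython and PyPy alike, because close() is invoked explicitly rather than by whichever GC happens to run.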
Re: [pypy-dev] Dontbug: A reversible debugger for PHP (similar in concept to RevDB for Python/PyPy)
Hi Maciej,

Yes, Dontbug is built on top of RR. Mozilla/RR can be finicky at times but overall I had a very good experience with it. I developed Dontbug on Ubuntu 16.04 and I found RR to be pretty robust on that distro at least. I did encounter a serious regression once (I was on a bleeding-edge commit) but it was addressed quickly after I filed a ticket. RR runs Travis tests on Ubuntu 14.04 so we can be sure about that distro also. I also see references to Fedora in the RR documentation, so I'm guessing RR should be good on that distribution as well.

The reason I mention all these distros is that your experience with RR will often depend on the specific Linux kernel version and the gdb debugger version you're using (and to a slightly lesser extent the specific distro). RR tends to use a lot of hairy/advanced features like ptrace, seccomp-bpf, CPU performance counters etc., and the internal implementation of these in the kernel tends to change subtly over time (or suffer bugs). So you can often run into problems on very recent distros. For instance, there is currently an outstanding ticket for test failures on Ubuntu 16.10. But as mentioned, the developers tend to address these quickly. As long as you're on a mainstream, non-bleeding-edge distro, I would think that RR should work fine for you.

The Mozilla folks use RR to debug Firefox, which is a hugely complex application (as you can imagine), and this gives me confidence about the overall correctness of RR. Coming to PyPy, I would suggest you try RR out again. As a start, try it out on Ubuntu 16.04. If you run into problems, do post a ticket on the RR project.

I also noticed some references to UndoDB usage on the PyPy project. How has your team's experience been with UndoDB+PyPy in general? It would be interesting to learn about your experiences there...

Thanks,
Sidharth

On Fri, Oct 21, 2016 at 10:58 AM, Maciej Fijalkowski wrote:
> Hi Sidharth
>
> I see dontbug is based on rr - I would like to know how well rr works
> for you.
> We've tried using rr for pypy and it didn't work as
> advertised. On the other hand it seems the project is moving fast, so
> maybe it works these days
>
> On Thu, Oct 20, 2016 at 9:50 PM, Sidharth Kshatriya wrote:
> > Dear All,
> >
> > There have been some interesting blogs about RevDB, a reversible debugger
> > for Python, on the PyPy blog.
> >
> > I'd like to tell you about Dontbug, a reversible debugger for PHP that I
> > recently released. Like RevDB, it allows you to debug forwards and
> > backwards -- but in PHP.
> >
> > See:
> > https://github.com/sidkshatriya/dontbug
> >
> > For a short (1m35s) demo video:
> > https://www.youtube.com/watch?v=DA76z77KtY0
> >
> > Why am I talking about this on a PyPy mailing list :-) ? Firstly, because
> > I think reverse debuggers for dynamic languages are relatively rare -- so
> > it's a good idea that we know about each other! Secondly, the fact that
> > there are more and more reversible debuggers for various languages every
> > year means that reverse debugging is definitely entering the mainstream.
> > We could be at an inflexion point here!
> >
> > Hope you guys find Dontbug interesting!
> >
> > Thanks,
> >
> > Sidharth
Re: [pypy-dev] RFC: draft idea for making for loops automatically close iterators
Well, I'm really shocked to find out that what I thought was an "automatic close" is really the ref-counting GC of CPython, which means that a lot of my code breaks in PyPy...

It really becomes a big problem now that iterators are heavily used in Python. Some builtin functions like zip, map and filter return iterators in Python 3 instead of the lists they returned in Python 2, which means invisible bugs for code ported from Python 2: zip(my_generator(), my_other_generator()) may leave the iterators open if the for loop is exited early. Even in Python 2, functions in itertools can create these bugs. In CPython this kind of code works because of the ref-counting GC, so the problem is not obvious there, but it breaks in PyPy.

I'm wondering, since a ref-counting GC implementation is not possible for PyPy, is it possible to hack on the for loop to make it "try to" collect the generator? That may really save a lot of lives. If the generator is still referenced after the for loop, it may be the programmer's fault for not calling close(), but looping through a returned value is something different - sometimes you do not even know whether it is a generator.

2016-10-21
hubo

From: Armin Rigo
Sent: 2016-10-18 16:01
Subject: Re: [pypy-dev] RFC: draft idea for making for loops automatically close iterators
To: "Nathaniel Smith"
Cc: "PyPy Developer Mailing List"

Hi,

On 17 October 2016 at 10:08, Nathaniel Smith wrote:
> thought I'd send around a draft to see what you think. (E.g., would
> this be something that makes your life easier?)

As a general rule, PyPy's GC behavior is similar to CPython's if we tweak the program to start a chain of references at a self-referential object. So for example, consider that the outermost loop of a program takes the objects like the async generators, and stores them inside such an object:

    class A:
        def __init__(self, ref):
            self.ref = ref
            self.myself = self

and then immediately forgets that A instance. Then both this A instance and everything it refers to is kept alive until the next cyclic GC occurs.
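Armin's point can be checked directly on CPython: once an object sits in a reference cycle, reference counting alone no longer reclaims it, and finalization waits for the cyclic collector, which is essentially the situation PyPy is in all the time. A small self-contained illustration, mirroring the class A example above:

```python
import gc

freed = []

class A:
    def __init__(self, ref):
        self.ref = ref
        self.myself = self  # reference cycle: the instance references itself

    def __del__(self):
        freed.append(True)

gc.disable()          # make the demonstration deterministic
A(object())           # dropped immediately, but kept alive by the cycle
assert freed == []    # reference counting alone did not finalize it
gc.collect()          # only the cyclic collector finds and frees it
assert freed == [True]
gc.enable()
```

So any cleanup that relies on reference counting (closing a generator, releasing a file descriptor) silently turns into "cleanup at some unspecified later GC" as soon as a cycle is involved, on CPython just as on PyPy.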
PyPy just always exhibits that behavior, instead of only when you start with reference cycles. So the real issue should not be "how to do something that will make PyPy happy", or not only---it should be "how to do something that will make CPython happy even in the case of reference cycles". If you don't, then arguably CPython is slightly broken.

Yes, anything that can reduce file descriptor leaks in Python sounds good to me.

A bientôt,

Armin.
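One portable way to satisfy both interpreters today is to close iterators explicitly rather than relying on the GC at all. Generators already provide a close() method (which raises GeneratorExit at the suspended yield), so the standard contextlib.closing helper is enough; a short sketch:

```python
from contextlib import closing

log = []

def read_lines():
    log.append("open")
    try:
        yield "line 1"
        yield "line 2"
    finally:
        log.append("close")   # runs via close(), not via the GC

with closing(read_lines()) as lines:
    for line in lines:
        break                 # early exit from the loop

# log == ["open", "close"] on CPython and PyPy alike:
# closing() calls lines.close() deterministically when the with-block exits.
```

This is exactly the "make CPython happy even in the case of reference cycles" discipline: the cleanup no longer depends on when, or whether, any garbage collector runs.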