[issue18553] os.isatty() is not Unix only
Senthil Kumaran added the comment: Georg: Thanks for spotting. I feel bad for the mistake. I shall correct it. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18553 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19044] getaddrinfo raises near-useless exception
Changes by Antoine Pitrou pit...@free.fr: -- nosy: +neologix
[issue19045] Make on Solaris 11 x64 with OracleStudio12.3 failed
New submission from Borut Podlipnik:

# make
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Modules/python.o ./Modules/python.c
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/acceler.o Parser/acceler.c
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/grammar1.o Parser/grammar1.c
"Parser/grammar1.c", line 39: Warning: String literal converted to char* in initialization.
1 Warning(s) detected.
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/listnode.o Parser/listnode.c
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/node.o Parser/node.c
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/parser.o Parser/parser.c
CC -c -g -DNDEBUG -O -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/parsetok.o Parser/parsetok.c
"Parser/parsetok.c", line 250: Error: Cannot assign void* to char*.
1 Error(s) detected.
*** Error code 2
make: Fatal error: Command failed for target `Parser/parsetok.o'

-- components: Build messages: 198042 nosy: podlipnik priority: normal severity: normal status: open title: Make on Solaris 11 x64 with OracleStudio12.3 failed type: compile error versions: Python 2.7
[issue18553] os.isatty() is not Unix only
Roundup Robot added the comment: New changeset 2b7f11ba871c by Senthil Kumaran in branch '3.3': Correcting the mistake in 14ba90816930 http://hg.python.org/cpython/rev/2b7f11ba871c New changeset e839e524a7d5 by Senthil Kumaran in branch 'default': Correcting the mistake in 678e3c0d2d99 http://hg.python.org/cpython/rev/e839e524a7d5
[issue18553] os.isatty() is not Unix only
Senthil Kumaran added the comment: Fixed now. Ascertained myself by doing hg diff -r tip^ -U 10 on local commits before pushing. :-) -- status: open -> closed
[issue19046] SystemError: ..\Objects\weakrefobject.c:903: bad argument to internal function
New submission from Michael Herrmann: I'm on 32 bit Python 2.7.3 and 64 bit Windows 7. I am working on a complex, multithreaded application which uses COM to communicate with other processes. My application uses regular expressions in a few (but not very many) places. An example re I am trying to match is 'quit(\\(.*\\))?', an example input would be 'quit()'. Nothing exciting. Very frustratingly, even though the regular expressions and inputs are the same, I spuriously get errors of the following form:

Traceback (most recent call last):
  ...
  File "re.pyc", line 137, in match
SystemError: ..\Objects\weakrefobject.c:903: bad argument to internal function

I have seen the bug many times on 2.7.3. I briefly tried to reproduce it on 2.7.5 and it did not occur. This may be because the bug only occurs spuriously and I was unlucky, or because the bug no longer exists in 2.7.5. My (unverified) hunch is that the bug is not in 2.7.5 anymore. I have a C unhandled exception handler installed in my application. What's interesting is that the bug frequently seems to occur together with an unhandled memory access violation (exception code C005). When this is the case, the Python interpreter hangs or crashes. I have not seen similar crashes in my application at other points; only when working with regular expressions and seeing the above stack trace.

-- components: Regular Expressions, Windows messages: 198045 nosy: ezio.melotti, mherrmann.at, mrabarnett priority: normal severity: normal status: open title: SystemError: ..\Objects\weakrefobject.c:903: bad argument to internal function type: crash versions: Python 2.7
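For what it's worth, the pattern and input from the report behave as expected on a healthy interpreter; the SystemError is spurious and unrelated to the regex itself. A quick sanity check (variable names are mine):

```python
import re

# Pattern and input taken verbatim from the report. This is the normal,
# non-crashing behavior the application relies on.
pattern = re.compile(r'quit(\(.*\))?')
match = pattern.match('quit()')
print(match.group(0))   # whole match
print(match.group(1))   # optional parenthesized argument list
```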
[issue19044] getaddrinfo raises near-useless exception
Changes by Balazs czv...@gmail.com: -- nosy: +balazs
[issue19047] Clarify weakref.finalize objects are kept alive automatically
New submission from Nick Coghlan: I was just looking at weakref.finalize objects trying to figure out if they solve a problem I'm thinking about (they do), and I couldn't figure out from the class documentation ([1]) whether or not I needed to take care of keeping the object returned from weakref.finalize alive. I don't (the module keeps finalizers alive automatically), but this critical piece of information is only mentioned in the example much further down in the documentation ([2]). [1] http://docs.python.org/dev/library/weakref#weakref.finalize [2] http://docs.python.org/dev/library/weakref#finalizer-objects The 3.4 What's New should also explicitly mention weakref.finalize as an alternative to __del__ methods (perhaps as a comment in the section on PEP 422, perhaps just in the weakref section) -- assignee: docs@python components: Documentation messages: 198046 nosy: docs@python, ncoghlan priority: normal severity: normal stage: needs patch status: open title: Clarify weakref.finalize objects are kept alive automatically type: enhancement versions: Python 3.4
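A minimal sketch of the behavior the docs should state up front: the caller may discard the object returned by weakref.finalize, because the module keeps the finalizer alive until the referent goes away (immediate collection on del is a CPython refcounting detail):

```python
import weakref

class Resource:
    pass

events = []
obj = Resource()
# Deliberately discard the return value: the weakref module keeps the
# finalizer alive until obj is collected (or until interpreter exit).
weakref.finalize(obj, events.append, 'finalized')

del obj          # on CPython, refcounting collects obj immediately
print(events)
```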
[issue19047] Assorted weakref docs improvements
Nick Coghlan added the comment: Changing the title, since I spotted a few other problems as well. The weakref docs still refer to module globals being set to None during shutdown. That is no longer the case. There are also some assumptions about cycles with __del__ methods not being garbage collected, which is no longer true given PEP 442's improvements to module finalization. I also realised that the introduction of weakref.finalize and the elimination of the "set globals to None" hack gives Python relatively straightforward module destructors [1]:

import weakref, sys
mod = sys.modules[__name__]

def del_this():
    # implicit ref to the module globals from the function body
    pass

weakref.finalize(mod, del_this)

[1] https://mail.python.org/pipermail/import-sig/2013-September/000748.html

-- title: Clarify weakref.finalize objects are kept alive automatically -> Assorted weakref docs improvements
[issue19048] itertools.tee doesn't have a __sizeof__ method
New submission from Antoine Pitrou: An itertools.tee object can cache an arbitrary number of objects (pointers), but its sys.getsizeof() value will always remain the same. -- components: Library (Lib) messages: 198048 nosy: pitrou, rhettinger, serhiy.storchaka priority: normal severity: normal stage: needs patch status: open title: itertools.tee doesn't have a __sizeof__ method type: behavior versions: Python 3.3, Python 3.4
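A rough illustration of the report (variable names are mine; the exact byte counts are implementation-dependent, and the point is only that the reported size does not reflect the internal cache):

```python
import sys
import itertools

a, b = itertools.tee(range(10000))
before = sys.getsizeof(a)
for _ in range(1000):
    next(a)            # b lags behind, so tee caches the consumed items
after = sys.getsizeof(a)
print(before, after)   # reported size ignores the cached items
```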
[issue19049] itertools.tee isn't 64-bit compliant
New submission from Antoine Pitrou: itertools.tee uses ints rather than Py_ssize_t for indices. Using Py_ssize_t would not make the object bigger, since other fields will already be pointer-aligned. -- messages: 198049 nosy: pitrou, rhettinger, serhiy.storchaka priority: normal severity: normal stage: needs patch status: open title: itertools.tee isn't 64-bit compliant type: behavior versions: Python 3.3, Python 3.4
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment: I find the implementation of itertools.tee a bit weird: why does teedataobject have to be a PyObject? It seems to complicate things and make them less optimal.
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment: Anyway, here is a patch. -- keywords: +patch Added file: http://bugs.python.org/file31813/tee_sizeof.patch
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: This is a duplicate of issue15475. -- superseder: -> Correct __sizeof__ support for itertools
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: Hmm, no, I'm wrong, it's not a duplicate. -- superseder: Correct __sizeof__ support for itertools ->
[issue19048] itertools.tee doesn't have a __sizeof__ method
Changes by Serhiy Storchaka storch...@gmail.com: -- stage: needs patch -> patch review
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: I'm not sure that sys.getsizeof() should recursively count all Python subobjects. That is why I had omitted tee() in my patch.

>>> sys.getsizeof([[]])
36
>>> sys.getsizeof([list(range(1))])
36
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> I'm not sure that sys.getsizeof() should recursively count all Python subobjects.

Those are private subobjects. They are not visible to the programmer (except perhaps by calling __reduce__ or __setstate__).
[issue19046] SystemError: ..\Objects\weakrefobject.c:903: bad argument to internal function
Antoine Pitrou added the comment: Is there any reason why you don't switch to 2.7.5 and let your application run longer? (FWIW, this may be issue #16602) -- nosy: +pitrou
[issue19047] Assorted weakref docs improvements
Antoine Pitrou added the comment: Doc patches certainly welcome :-) -- nosy: +pitrou, sbt
[issue19043] Remove detailed listing of all versions from LICENSE, Doc/license.rst
Antoine Pitrou added the comment: Looks good to me. Python 1.5.2 (which I've never used) is the best Python anyway! -- nosy: +pitrou
[issue19049] itertools.tee isn't 64-bit compliant
Antoine Pitrou added the comment: Simple patch attached. -- keywords: +patch Added file: http://bugs.python.org/file31814/tee_64b.patch
[issue11798] Test cases not garbage collected after run
Antoine Pitrou added the comment: Ideally, test specification should be separate from test execution. That is, it should be possible to keep the TestCase around (or whatever instantiates it, e.g. a factory) but get rid of its per-test-execution attributes. Perhaps restoring the original __dict__ contents would do the trick?
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: They are visible by calling gc.get_referents(). A high-level function can use this to count the recursive size of objects.

>>> import sys, gc, itertools
>>> def gettotalsizeof(*args, seen=None):
...     if seen is None:
...         seen = {}
...     sum = 0
...     for obj in args:
...         if id(obj) not in seen:
...             seen[id(obj)] = obj
...             sum += sys.getsizeof(obj)
...             sum += gettotalsizeof(*gc.get_referents(obj), seen=seen)
...     return sum
...
>>> a, b = itertools.tee(range(1000))
>>> sum(next(a) for i in range(1000))
499500
>>> gettotalsizeof(a)
750
>>> gettotalsizeof(b)
18734
[issue11798] Test cases not garbage collected after run
Michael Foord added the comment: That would only be a shallow copy, so I'm not sure it's worth the effort. The test has the opportunity in the setUp to ensure that initial state is correct - so I would leave that per test. Obviously sharing state between tests is prima facie bad, but any framework reusing test suites is doing that already.
[issue12944] Accept arbitrary files for packaging's upload command
Changes by Florent Rougon frou...@users.sourceforge.net: -- nosy: +frougon
[issue19050] crash while writing to a closed file descriptor
New submission from Daniel Rohlfing: This snippet crashes my interpreter immediately:

import sys, io
io.open(sys.stdout.fileno(), 'wb')
fd.close()
sys.stdout.write("now writing on stdout will cause a crash")

That happened on Python 2.7.5 (default, May 15 2013, 22:43:36) [MSC v.1500 32 bit (Intel)] on win32, Windows 7 SP1 x64. The same code makes Python 3.3.2 throw an exception like this:

Exception OSError: OSError(9, 'Bad file descriptor') in <_io.TextIOWrapper name='stdout' mode='w' encoding='cp850'> ignored

-- components: Library (Lib) messages: 198063 nosy: damiro priority: normal severity: normal status: open title: crash while writing to a closed file descriptor type: crash versions: Python 2.7
[issue19050] crash while writing to a closed file descriptor
Daniel Rohlfing added the comment: The correct snippet is:

fd = io.open(sys.stdout.fileno(), 'wb')
fd.close()
sys.stdout.write("now writing on stdout will cause a crash")
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> They are visible by calling gc.get_referents().

That's totally beside the point. The point is that those objects are invisible in normal conditions, not that they can't be read using advanced implementation-dependent tricks.
[issue11798] Test cases not garbage collected after run
Antoine Pitrou added the comment:

> That would only be a shallow copy, so I'm not sure it's worth the effort. The test has the opportunity in the setUp to ensure that initial state is correct - so I would leave that per test.

I don't understand your objection. The concern is to get rid of old state after test execution.

> Obviously sharing state between tests is prima facie bad, but any framework reusing test suites is doing that already.

What do you mean?
[issue19047] Assorted weakref docs improvements
Nick Coghlan added the comment: I was initially thinking to myself "I have a source checkout right here, I should just fix it, even though I'm at work and want to go home"... and then I realised the potential scope of the fixes needed, given the longstanding misbehaviours these docs assume still exist :) (I also just realised the clone on my work system is ridiculously stale, so even updating it will likely take a while at this point)
[issue11798] Test cases not garbage collected after run
Michael Foord added the comment:

On 19 Sep 2013, at 14:06, Antoine Pitrou wrote:
>> That would only be a shallow copy, so I'm not sure it's worth the effort. The test has the opportunity in the setUp to ensure that initial state is correct - so I would leave that per test.
> I don't understand your objection. The concern is to get rid of old state after test execution.

If the object state includes mutable objects then restoring the previous dictionary will just restore the same mutable (and likely mutated) object. To *properly* restore state you'd either need to deepcopy the dictionary or reinstantiate the testcase (not reuse it, in other words). I'd rather leave it up to each test to ensure it reinitialises attributes in setUp than add further complexity that only does part of the job.

>> Obviously sharing state between tests is prima facie bad, but any framework reusing test suites is doing that already.
> What do you mean?

Any framework that is currently reusing test suites is re-using testcase instances. They are already sharing state between the runs. In fact messing with testcase dictionaries is a further possible cause of backwards incompatibility for those suites.
[issue11798] Test cases not garbage collected after run
Antoine Pitrou added the comment:

> If the object state includes mutable objects then restoring the previous dictionary will just restore the same mutable (and likely mutated) object.

I don't understand what you're talking about. Which mutable objects exactly? I'm talking about copying the dict before setUp.

>>> Obviously sharing state between tests is prima facie bad, but any framework reusing test suites is doing that already.
>> What do you mean?
> Any framework that is currently reusing test suites is re-using testcase instances. They are already sharing state between the runs.

They are not sharing it, since setUp will usually create the state anew. What we're talking about is cleaning up the state after tearDown is run, instead of waiting for the next setUp call.
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: The point is that your patch breaks functions like gettotalsizeof(). It makes it impossible to get the total size of a general object. It would be better to add gettotalsizeof() to the stdlib (or add an optional parameter to sys.getsizeof() for recursive counting). -- nosy: +loewis
[issue11798] Test cases not garbage collected after run
Michael Foord added the comment: Ah right, my mistake. Before setUp there shouldn't be test state. (Although tests are free to do whatever they want in __init__ too and I've seen plenty of TestCase subclasses using __init__ when they should be using setUp.) Essentially though _cleanup is a backwards compatibility feature - and suites that need _cleanup as a public api are already living without testcase dict cleanup.
[issue11798] Test cases not garbage collected after run
Antoine Pitrou added the comment: That said, I agree that the __dict__ proposal is a hack, but so is the current _removeTestAtIndex mechanism. The only clean solution I can think of would be to have two separate classes:
- a TestSpec which contains execution-independent data about a test case, and knows how to instantiate it
- a TestCase that is used for the actual test execution, but isn't saved in the test suite

Maybe it's possible to do this without any backwards compat problem by making TestSuite.__iter__ always return TestCases (but freshly-created ones, from the inner test specs). The main point of adaptation would be TestLoader.loadTestsFromTestCase().
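A hypothetical sketch of the proposed split (TestSpec and its method names are illustrative, not an existing unittest API):

```python
import unittest

class TestSpec:
    # Holds only execution-independent data and can mint fresh TestCase
    # instances, so per-run attributes die with each instance instead of
    # lingering in the suite.
    def __init__(self, case_class, method_name):
        self.case_class = case_class
        self.method_name = method_name

    def instantiate(self):
        return self.case_class(self.method_name)

class Example(unittest.TestCase):
    def test_ok(self):
        self.payload = [0] * 1000   # per-execution state
        self.assertEqual(len(self.payload), 1000)

spec = TestSpec(Example, 'test_ok')
result = unittest.TestResult()
spec.instantiate().run(result)      # the instance is garbage right after
print(result.testsRun, result.wasSuccessful())
```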
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> The point is that your patch breaks functions like gettotalsizeof(). It makes impossible to get a total size of general object.

The thing is, "total size" is generally meaningless. It can include things such as the object's type, or anything transitively referenced by the object, such as modules.

> It will be better to add gettotalsizeof() to the stdlib (or add an optional parameter to sys.getsizeof() for recursive counting).

This patch has *nothing* to do with recursive counting. It counts the internal arrays of itertools.tee() as part of its memory size, which is reasonable and expected. It does *not* count memory recursively: it doesn't count the size of the itertools.tee()'s cached objects, for example. Recursive counting doesn't make sense with Python. Where do you stop counting?
[issue10042] functools.total_ordering fails to handle NotImplemented correctly
Drekin added the comment: Hello, I have run into this when I wanted to use OrderedEnum and the example in enum docs seemed too repetitive to me. It's nice to know that it's being worked on. -- nosy: +Drekin
[issue19051] Unify buffered readers
New submission from Serhiy Storchaka: There are some classes in the gzip, bz2, lzma, and zipfile modules which implement a buffered reader interface. They read chunks of data from an underlying file object, decompress them, save them in an internal buffer, and provide common methods to read from this buffer. Maintaining duplicated code is cumbersome and error prone. The proposed preliminary patch moves the common code into a new private class _io2._BufferedReaderMixin. If the proposition is accepted in general, I'm going to write a C version and move it into the io module. Perhaps even then merge it with io.BufferedIOBase. The idea is that all buffered reading functions (read(), read1(), readline(), peek(), etc.) can be expressed in terms of one function which returns raw unbuffered data. Subclasses need define only one such function and will get the whole buffered reader interface. In the case of the classes mentioned above this function reads and decompresses a chunk of data from the underlying file. The HTTPResponse class perhaps will benefit too (issue19009). -- components: IO, Library (Lib) files: buffered_reader.diff keywords: patch messages: 198075 nosy: alanmcintyre, benjamin.peterson, nadeem.vawda, pitrou, serhiy.storchaka, stutzbach priority: normal severity: normal stage: patch review status: open title: Unify buffered readers type: enhancement versions: Python 3.4 Added file: http://bugs.python.org/file31815/buffered_reader.diff
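A much-simplified sketch of the idea (all names and details here are mine, not the actual patch): subclasses override a single raw-read primitive and inherit generic buffered reads built on top of it.

```python
import io

class _BufferedReaderMixin:
    """Sketch: generic buffered reads on top of one raw-read primitive."""

    def _init_buffer(self):
        self._buffer = b''

    def _read_raw(self):
        """Return the next chunk of data, or b'' at EOF.
        Subclasses override this (e.g. read + decompress a chunk)."""
        raise NotImplementedError

    def read(self, size=-1):
        # Accumulate raw chunks until the request is satisfied or EOF.
        chunks = [self._buffer]
        total = len(self._buffer)
        while size < 0 or total < size:
            chunk = self._read_raw()
            if not chunk:
                break
            chunks.append(chunk)
            total += len(chunk)
        data = b''.join(chunks)
        if size < 0:
            self._buffer = b''
            return data
        self._buffer = data[size:]
        return data[:size]

class _BytesReader(_BufferedReaderMixin):
    """Toy subclass: the 'raw' source is just a BytesIO read in chunks."""
    def __init__(self, data, chunk=4):
        self._init_buffer()
        self._raw = io.BytesIO(data)
        self._chunk = chunk

    def _read_raw(self):
        return self._raw.read(self._chunk)

r = _BytesReader(b'hello world')
first = r.read(5)
rest = r.read()
print(first, rest)
```

A real implementation would add readline(), peek() and read1() over the same buffer, which is exactly the duplication the patch removes from the four modules.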
[issue11798] Test cases not garbage collected after run
Michael Foord added the comment: Having TestLoader.loadTestsFromTestCase() return a lazy suite that defers testcase instantiation until iteration is a nice idea. Unfortunately the TestSuite.addTests api iterates over a suite to add new tests, i.e. the code that builds a TestSuite for a module probably already iterates over the suites returned by TestLoader.loadTestsFromTestCase - so the change would need to be more pervasive.
[issue19051] Unify buffered readers
Serhiy Storchaka added the comment: Here are benchmark script and its results. -- Added file: http://bugs.python.org/file31816/read_bench.py
[issue19051] Unify buffered readers
Changes by Serhiy Storchaka storch...@gmail.com: Added file: http://bugs.python.org/file31817/read_bench_cmp
[issue19051] Unify buffered readers
Antoine Pitrou added the comment: See issue12053 for a more flexible primitive.
[issue18553] os.isatty() is not Unix only
Dmi Baranov added the comment: I found a little difference in the isatty implementations:

Windows: "Determines whether a file descriptor is associated with a character device" [1]
Unix: "Test whether a file descriptor refers to a terminal" [2]

So, we have the same behavior for pipes, but a different behavior for I/O redirection with pseudo-character devices:

$ ./python -c "import os; print(os.isatty(0))" < /dev/null
False

e:\PCbuild>python_d.exe -c "import os; print(os.isatty(0))" < NUL
True

Other pseudo devices work similarly:

e:\PCbuild>python_d.exe -c "import os; print(os.isatty(0))" < CON
True

I have a snippet to fix that - should I open a new issue for the patch?

[1] http://msdn.microsoft.com/en-us/library/f4s0ddew.aspx
[2] http://man7.org/linux/man-pages/man3/isatty.3.html
[issue19053] read1() from zipfile returns empty data
New submission from Serhiy Storchaka: Bzip2 is a block-based compression with a large (up to 800 Kb) block size. It means that until the decompressor has read enough data it can't produce any output (actually this issue exists for other compression algorithms too, but is less frequent). And the read1() method, which makes only one low-level read, can return empty bytes when there is still data available. This issue was fixed for the gzip, bz2, and lzma modules in issue15546, but remains in the zipfile module. In zipfile this bug is less catastrophic because the other read functions do not use read1() directly, but use a lower level internal function. -- assignee: serhiy.storchaka components: Library (Lib) messages: 198085 nosy: serhiy.storchaka priority: normal severity: normal stage: needs patch status: open title: read1() from zipfile returns empty data type: behavior versions: Python 3.3, Python 3.4
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
Changes by Antoine Pitrou pit...@free.fr: -- nosy: +orsenthil type: performance -> enhancement versions: -Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3, Python 3.5
[issue11798] Test cases not garbage collected after run
Antoine Pitrou added the comment:

> Unfortunately the TestSuite.addTests api iterates over a suite to add new tests. i.e. the code that builds a TestSuite for module probably already iterates over the suites returned by TestLoader.loadTestsFromTestCase - so the change would need to be more pervasive.

addTests() could easily be tweaked to recognize that it gets passed a TestSuite, and special-case that. Also, TestCase objects could probably get an optional "spec" attribute pointing to their TestSpec. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue11798 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19051] Unify buffered readers
Serhiy Storchaka added the comment: This primitive doesn't fit the case of compressed streams well. A chunk of compressed data read from the underlying file object can be decompressed into unpredictably large data. We can't limit the size of the buffer. Another point is that the buffer interface is not very appropriate for a Python implementation. And we want to leave as much Python code in gzip, bz2, lzma and zipfile as possible. Copying from bytes into a buffer and back would just waste resources. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19051 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue12053] Add prefetch() for Buffered IO (experiment)
Changes by Serhiy Storchaka storch...@gmail.com: -- nosy: +serhiy.storchaka ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue12053 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
New submission from Joseph Warren: I will add as a disclaimer to this bug report that I am relatively new to both the HTTP spec and Python programming, so I may have completely misunderstood something. When I make a POST request with some data (using libcurl), it sends the headers first, including the header "Expect: 100-continue", and then waits for around a second, or until receiving a "100 Continue" response, before sending the rest of the data. (I have attached Wireshark capture data as HTML to show this; I've also attached a dump showing how Apache behaves given the same request.) This means that when I make such a request to a CGI script hosted using CGIHTTPServer, and the script tries to read in the data (for instance, using cgi.FieldStorage()), the CGI script takes around a second to parse the data, since it hasn't been sent yet. I currently have a work-around for this, which is to override CGIHTTPRequestHandler and have it send a "100 Continue" response before doing anything else. A dump showing this is attached; it makes the connection much faster. This is fine for my application, as all CGI requests will be made in the same way; however, it is not a general solution. The W3C, when defining Expect headers, states (taken from http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html): "A server that does not understand or is unable to comply with any of the expectation values in the Expect field of a request MUST respond with appropriate error status. The server MUST respond with a 417 (Expectation Failed) status if any of the expectations cannot be met or, if there are other problems with the request, some other 4xx status." I would like to offer to implement handling of the Expect header in BaseHTTPServer, but I would like some feedback before doing this, and to check that the current behaviour is not intended for some reason. This change would make CGIHTTPServer much faster in my use case, and I suspect it would help other people.
-- components: Library (Lib) files: Wireshark_Captures.html messages: 198084 nosy: Joseph.Warren priority: normal severity: normal status: open title: Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly. type: performance versions: Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3, Python 3.4, Python 3.5 Added file: http://bugs.python.org/file31818/Wireshark_Captures.html ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
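The work-around described in the report maps naturally onto the handle_expect_100() hook in Python 3's http.server (added for issue 1491). An illustrative sketch against BaseHTTPRequestHandler; the fake-state scaffolding at the bottom is invented for demonstration and is not tracker code:

```python
import io
from http.server import BaseHTTPRequestHandler

class EagerContinueHandler(BaseHTTPRequestHandler):
    """Always acknowledge 'Expect: 100-continue' immediately (sketch)."""
    def handle_expect_100(self):
        # send_response_only() writes just the interim status line,
        # without the usual Server/Date headers; returning True tells
        # the base class to go on and process the request body.
        self.send_response_only(100)
        self.end_headers()
        return True

# Exercise the hook without a real socket: fake the minimal state the
# base class needs (a request version and an output stream).
h = EagerContinueHandler.__new__(EagerContinueHandler)
h.request_version = 'HTTP/1.1'
h.wfile = io.BytesIO()
proceed = h.handle_expect_100()
print(proceed, h.wfile.getvalue())
```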
[issue18553] os.isatty() is not Unix only
Senthil Kumaran added the comment: Hello Dmi,

> I have a snippet to fix that; should I open a new issue for the patch?

Please open a new issue. Thanks! -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18553 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
R. David Murray added the comment: See also issue 1346874. It seems that we do not currently handle 'continue' correctly in general. -- nosy: +r.david.murray ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19051] Unify buffered readers
Changes by Serhiy Storchaka storch...@gmail.com: -- Removed message: http://bugs.python.org/msg198083 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19051 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Amaury Forgeot d'Arc added the comment: I like the definition of __sizeof__ that was discussed some time ago: http://bugs.python.org/issue14520#msg157798 With that definition (do we have it somewhere in the docs, by the way?) the current code already gives the correct answer. -- nosy: +amaury.forgeotdarc ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
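As an aside (not from the thread), the split between the two objects under discussion can be observed directly. A sketch that sums a tee object with its gc referents; the exact byte counts vary by build:

```python
import gc
import itertools
import sys

a, b = itertools.tee(range(10))
next(a)  # advance one copy so the shared buffer holds an item

# sys.getsizeof(a) covers only the tee object itself; the buffered
# items live in a separate internal _tee_dataobject that is only
# reachable through gc.get_referents().
direct = sys.getsizeof(a)
total = direct + sum(sys.getsizeof(r) for r in gc.get_referents(a))
print(direct, total)
```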
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: Isn't sys.getsizeof() a low-level debugging tool? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
Joseph Warren added the comment: Cheers, would you suggest I submit a patch then? Also, what version of Python should I do this against? I've been working with 2.7, but the issue still exists in 3.*, and I can conveniently work against 3.2.3-7, and less conveniently against other versions. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
R. David Murray added the comment: I think we can fix this in maintenance releases without breaking anyone's code, so a patch against both 2.7 and 3.3 would be ideal. A patch against 3.2 will probably apply cleanly to 3.3, so that should be fine as well. Thanks for working on this, and could you please submit a contributor agreement at your earliest convenience? It takes a few days for that to get processed. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19049] itertools.tee isn't 64-bit compliant
Serhiy Storchaka added the comment: I don't think we need Py_ssize_t for integers from 0 to 57. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19049 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19054] Descriptors howto
New submission from Marco Buttu: I think that in the descriptor howto, at this point:

class MyClass(object):
    x = RevealAccess(10, 'var x')
    y = 5

either the interpreter prompt should not be there, or the indentation is wrong. Furthermore, in the Python 3 version (http://docs.python.org/3/howto/descriptor.html) we can remove the explicit derivation from `object`. -- assignee: docs@python components: Documentation messages: 198096 nosy: docs@python, marco.buttu priority: normal severity: normal status: open title: Descriptors howto type: enhancement versions: Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3, Python 3.4 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19054 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
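For context, the howto snippet under discussion defines a RevealAccess data descriptor; reconstructed here (slightly abridged) from the howto, with a short usage line added:

```python
class RevealAccess(object):
    """A data descriptor that announces each access (as in the howto)."""
    def __init__(self, initval=None, name='var'):
        self.val = initval
        self.name = name

    def __get__(self, obj, objtype):
        print('Retrieving', self.name)
        return self.val

    def __set__(self, obj, val):
        print('Updating', self.name)
        self.val = val

class MyClass(object):
    x = RevealAccess(10, 'var "x"')
    y = 5

m = MyClass()
print(m.x)  # attribute access goes through RevealAccess.__get__
```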
[issue19053] read1() from zipfile returns empty data
Changes by Serhiy Storchaka storch...@gmail.com: -- stage: needs patch - patch review ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19053 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19053] read1() from zipfile returns empty data
Serhiy Storchaka added the comment: Here is a patch. -- keywords: +patch Added file: http://bugs.python.org/file31819/zipfile_read1.patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19053 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> Isn't sys.getsizeof() a low-level debugging tool?

What would it help debug exactly? :-) I would hope it gives remotely useful information about the passed object. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19054] Descriptors howto
Changes by Marco Buttu marco.bu...@gmail.com: Added file: http://bugs.python.org/file31821/py2howto.patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19054 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Amaury Forgeot d'Arc added the comment: getsizeof() is interesting only if it gives sensible results when used correctly, especially if you want to sum these values and get a global memory usage. One usage is to traverse objects through gc.get_referents(); in this case the definition above is correct. Now, are you suggesting to traverse objects differently? With dir(), or __dict__? (btw, this discussion explains why pypy still does not implement getsizeof()) -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19054] Descriptors howto
Changes by Marco Buttu marco.bu...@gmail.com: -- keywords: +patch Added file: http://bugs.python.org/file31820/py3howto.patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19054 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19049] itertools.tee uses int for indices
Antoine Pitrou added the comment: Ah, my bad, it seems I misunderstood the tee() implementation. So apparently even the index cannot get past 57. It's more of a consistency patch then, and I agree it isn't very important. -- priority: normal - low stage: needs patch - patch review title: itertools.tee isn't 64-bit compliant - itertools.tee uses int for indices type: behavior - enhancement versions: -Python 3.3 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19049 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: Well. itertools._tee is one Python object and itertools._tee_dataobject is another Python object. sys.getsizeof() gives you the memory usage of these objects separately. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment:

> How do you stop walking your graph if it spans the whole graph of Python objects?

We can stop at specific types of objects (for example types and modules).

> If sys.getsizeof() is only useful for people who know *already* how an object is implemented internally, then it's actually useless, because those people can just as well do the calculation themselves.

That's why sys.getsizeof() is a low-level tool. We need a high-level tool in the stdlib. Even imperfect recursive counting will be better than sys.getsizeof(), which is confusing for novices.

> (By the way, OrderedDict.__sizeof__ already breaks the rule you are trying to impose)

Yes, I know, and I think it is wrong. Here is an improved version of gettotalsizeof():

def gettotalsizeof(*args, exclude_types=(type, type(sys))):
    seen = {}
    stack = []
    for obj in args:
        if id(obj) not in seen:
            seen[id(obj)] = obj
            stack.append(obj)
    total = 0
    while stack:
        obj = stack.pop()
        total += sys.getsizeof(obj)
        for obj in gc.get_referents(obj):
            if id(obj) not in seen and not isinstance(obj, exclude_types):
                seen[id(obj)] = obj
                stack.append(obj)
    return total

>>> gettotalsizeof(sys)
206575
>>> gettotalsizeof(gc)
2341
>>> gettotalsizeof(sys.getsizeof)
60
>>> gettotalsizeof(gettotalsizeof)
60854
>>> class C: pass
...
>>> gettotalsizeof(C)
805
>>> gettotalsizeof(C())
28

-- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment: Here is why using get_referents() is stupid in the general case:

>>> class C: pass
...
>>> c = C()
>>> gc.get_referents(c)
[<class '__main__.C'>]

With your method, measuring c's memory consumption also includes the memory consumption of its type object. (And of course this is only a trivial example... one can only imagine what kind of mess it is with a non-trivial object.) -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment: (By the way, OrderedDict.__sizeof__ already breaks the rule you are trying to impose) -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> Well. itertools._tee is one Python object and itertools._tee_dataobject is another Python object. sys.getsizeof() gives you the memory usage of these objects separately.

This is great... And how do I know that I need to use gc.get_referents() to get those objects in case I'm measuring the memory consumption of a teeobject (rather than, say, trusting __dict__, or simply trusting the getsizeof() output at face value)? If sys.getsizeof() is only useful for people who know *already* how an object is implemented internally, then it's actually useless, because those people can just as well do the calculation themselves. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19049] itertools.tee isn't 64-bit compliant
Antoine Pitrou added the comment:

> I don't think we need Py_ssize_t for integers from 0 to 57.

IMO it's better to use Py_ssize_t there, since it can get combined with other Py_ssize_t values. Avoiding spurious conversions eliminates a common source of errors. We don't gain anything by keeping it as an int. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19049 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> It's why sys.getsizeof() is a low-level tool. We need a high-level tool in the stdlib. Even imperfect recursive counting will be better than confusing sys.getsizeof().

Ok, but I need to see a satisfying version of gettotalsizeof before I'm convinced (see below).

> Here is an improved version of gettotalsizeof(): [...]
> >>> gettotalsizeof(gettotalsizeof)
> 60854

Why that big? Does it make sense? What if, say, a large object is shared between many small objects? Should it count towards the memory size of any of those small objects? What if that object is actually immortal (it is also a module global, for example)? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue3982] support .format for bytes
Augie Fackler added the comment: I'd like to put a nudge towards supporting the __mod__ interface on bytes - for Mercurial this is the single biggest impediment to even getting our testrunner working, much less starting the porting process. -- nosy: +durin42 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue3982 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
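For the record, this request was eventually granted: PEP 461 added printf-style (__mod__) formatting for bytes in Python 3.5. A minimal illustration, reusing a changeset id from elsewhere in this thread as sample data:

```python
# PEP 461 (Python 3.5) restored printf-style formatting for bytes:
# %s and %b interpolate bytes-like values, %d/%x format numbers.
line = b'changeset %d:%s' % (65028, b'7add45bcc9c6')
print(line)
```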
[issue19048] itertools.tee doesn't have a __sizeof__ method
Serhiy Storchaka added the comment: Optionally we can also not count objects which are referenced from outside of the graph of objects (this isn't so easy to implement in Python). I.e. gettotalsizeof([1, 'abc', math.sqrt(22)], inner=True) will count only the bare list and the square root of 22, because 1 and 'abc' are interned. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Martin v. Löwis added the comment:

> The problem is that that definition isn't helpful. If we ever change itertools.tee to use non-PyObjects internally, suddenly its sys.getsizeof() would have to return much larger numbers despite visible behaviour not having changed at all (and despite the memory overhead being actually lower).

I see no problem with that. If the internal representation changes, nobody should be surprised if sizeof changes.

> I would hope it gives remotely useful information about the passed object.

It certainly does: it reports the memory consumption of the object itself, not counting the memory of other objects. I proposed a precise definition of what an "other object" is. If you don't like it, please propose a different definition that still allows to automatically sum up the memory of a graph of objects. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> I see no problem with that. If the internal representation changes, nobody should be surprised if sizeof changes.

Who is "nobody"? Users aren't aware of internal representation changes. It sounds like you want sys.getsizeof() to be a tool for language implementors anyway.

> I proposed a precise definition of what an "other object" is. If you don't like it, please propose a different definition that still allows to automatically sum up the memory of a graph of objects.

What is the use case for summing up the memory of a graph of objects? How do you stop walking your graph if it spans the whole graph of Python objects? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> Optionally we can also not count objects which are referenced from outside of the graph of objects (this isn't so easy to implement in Python). I.e. gettotalsizeof([1, 'abc', math.sqrt(22)], inner=True) will count only the bare list and the square root of 22, because 1 and 'abc' are interned.

That's only part of the equation. What if I have an object which references, for example, a logging.Logger? Loggers are actually eternal (they live in a global dictionary somewhere in the logging module), but gettotalsizeof() will still count it. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> getsizeof() is interesting only if it gives sensible results when used correctly, especially if you want to sum these values and get a global memory usage.

Getting a global memory usage isn't a correct use of getsizeof(), though, because it totally ignores the memory allocation overhead (not to mention fragmentation, or any memory areas that may have been allocated without being accounted for by __sizeof__). If you want global Python memory usage, use sys._debugmallocstats(), not sys.getsizeof().

> One usage is to traverse objects through gc.get_referents(); in this case the definition above is correct.

What are the intended semantics? get_referents() can give you references you didn't expect, such as type objects, module objects...

> Now, are you suggesting to traverse objects differently? With dir(), or __dict__?

sys.getsizeof() gives you the memory usage of a given Python object; it doesn't guarantee that traversing objects will give you the right answer for any question. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19049] itertools.tee isn't 64-bit compliant
Antoine Pitrou added the comment:

> I doubt we should change this.

Currently itertools.tee() will break if more than 2**31 elements are saved in the inner structures. I certainly think it should be fixed, rather than bite someone by surprise one day.

> However in any case you forgot about tee_reduce().

Will take a look. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19049 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19049] itertools.tee isn't 64-bit compliant
Serhiy Storchaka added the comment: We gain stability. I doubt we should change this. However in any case you forgot about tee_reduce(). -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19049 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
Joseph Warren added the comment: Issue1346874 seems similar, but is talking about a failure to handle "100 Continue" responses sent by a server to the client; this issue is in server code, and is a failure of BaseHTTPServer to send "100 Continue" responses to a client which expects them. Please tell me if I'm mistaken/not making sense. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
R. David Murray added the comment: You are making perfect sense. My point in referencing that issue was that you are not crazy, we do indeed not handle continue correctly ;) -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19048] itertools.tee doesn't have a __sizeof__ method
Antoine Pitrou added the comment:

> I like the definition of __sizeof__ that was discussed some time ago: http://bugs.python.org/issue14520#msg157798

The problem is that that definition isn't helpful. If we ever change itertools.tee to use non-PyObjects internally, suddenly its sys.getsizeof() would have to return much larger numbers despite visible behaviour not having changed at all (and despite the memory overhead being actually lower). And gc.get_referents() is really a low-level debugging tool, certainly not a reflection API (inspect would serve that role). -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19048 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue15751] Support subinterpreters in the GIL state API
Antoine Pitrou added the comment: I wanted to set issue10915 as duplicate but there is actually a tentative patch there. Unfortunately the discussions are now split apart... -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue15751 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue15751] Support subinterpreters in the GIL state API
Changes by STINNER Victor victor.stin...@gmail.com: -- nosy: +haypo ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue15751 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue10915] Make the PyGILState API compatible with multiple interpreters
Changes by STINNER Victor victor.stin...@gmail.com: -- nosy: +haypo ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue10915 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19055] Regular expressions: * does not match as many repetitions as possible.
New submission from Jason Stumpf:

>>> re.match('(a|ab)*', 'aba').group(0)
'a'

According to the documentation, the * should match as many repetitions as possible. Two are possible; it matches one. Reversing the order of the operands of | changes the behaviour:

>>> re.match('(ab|a)*', 'aba').group(0)
'aba'

-- messages: 198116 nosy: Jason.Stumpf priority: normal severity: normal status: open title: Regular expressions: * does not match as many repetitions as possible. type: behavior versions: Python 2.7 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19055 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
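For the record, the reported behaviour is reproducible with a short script; adding a trailing literal shows the engine can still backtrack to the longer expansion when something after the star fails to match:

```python
import re

# '|' tries alternatives left to right and commits to the first one
# that lets the rest of the pattern succeed, so '*' stops as soon as
# neither alternative matches at the current position.
assert re.match('(a|ab)*', 'aba').group(0) == 'a'
assert re.match('(ab|a)*', 'aba').group(0) == 'aba'

# A following literal forces backtracking, and the longer overall
# match is then found:
assert re.match('(a|ab)*c', 'abac').group(0) == 'abac'
print('ok')
```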
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Changes by Jason Stumpf jstu...@google.com: -- components: +Regular Expressions nosy: +ezio.melotti, mrabarnett ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19055 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Changes by David Benbennick dbenb...@gmail.com: -- nosy: +dbenbenn ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19055 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
Joseph Warren added the comment: I have downloaded an up-to-date version of Python to develop with, and found that this issue has already been fixed there. This seems to have been done in changeset 65028:7add45bcc9c6 (user: Senthil Kumaran, date: Thu Sep 30 06:09:18 2010, summary: Issue1491 - BaseHTTPServer incorrectly implements response code 100). This change does not seem to appear in 2.7, and could be converted to work with it; failing that, this issue can probably be closed. I apologise for not spotting this earlier. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
Joseph Warren added the comment: Sorry, this change does appear in 2.7.*; I was looking at the wrong Mercurial tag. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19052] Python's CGIHTTPServer does not handle Expect: 100-continue gracefully which results in some Post requests being handled slowly.
Changes by Joseph Warren hungryjoe.war...@gmail.com: -- resolution: - duplicate status: open - closed ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue19052 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Changes by Serhiy Storchaka storch...@gmail.com: -- nosy: +serhiy.storchaka
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Changes by Serhiy Storchaka storch...@gmail.com: -- versions: +Python 3.3, Python 3.4
[issue19055] Regular expressions: * does not match as many repetitions as possible.
janzert added the comment: The documentation for the | operator in the re module covers this pretty explicitly (http://docs.python.org/2/library/re.html):

    A|B, where A and B can be arbitrary REs, creates a regular expression that will match either A or B. An arbitrary number of REs can be separated by the '|' in this way. This can be used inside groups (see below) as well. As the target string is scanned, REs separated by '|' are tried from left to right. When one pattern completely matches, that branch is accepted. This means that once A matches, B will not be tested further, even if it would produce a longer overall match. In other words, the '|' operator is never greedy.

-- nosy: +janzert
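The non-greedy behaviour of '|' quoted above is easy to demonstrate; a short sketch:

```python
import re

# '|' tries branches left to right and commits to the first branch that
# matches at the current position, even when a later branch would
# produce a longer overall match.
chosen = re.match('a|ab', 'ab').group(0)
# The first branch 'a' matches, so the branch 'ab' is never tried.
```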
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Jason Stumpf added the comment: Even with the documentation for |, the documentation for * is wrong.

>>> re.match('(a|ab)*c', 'abac').group(0)
'abac'

From the docs: In general, if a string p matches A and another string q matches B, the string pq will match AB. Since '(a|ab)*c' matches 'abac', and 'c' matches 'c', that means '(a|ab)*' matches 'aba'. It does so with two repetitions. Thus, in the example from my initial post, it was not matching with as many repetitions as possible. I think what is meant is that * attempts to match again after each match of the preceding regular expression.
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Jason Stumpf added the comment: Sorry, that implication was backwards. I don't think I can prove from the documentation alone that '(a|ab)*' can match 'aba' in certain contexts. If the docs said "* attempts to match again after each match of the preceding regular expression", I think that would describe the observed behaviour.
[issue19055] Regular expressions: * does not match as many repetitions as possible.
Matthew Barnett added the comment: The behaviour is correct. Here's a summary of what's happening:

First iteration of the repeated group:
    Try the first branch. Can match 'a'.
    Second iteration of the repeated group:
        Try the first branch. Can't match 'a'.
        Try the second branch. Can't match 'ab'.
    Continue with the remainder of the pattern. Can't match 'c'.
Backtrack to the first iteration of the repeated group:
    Try the second branch. Can match 'ab'.
    Second iteration of the repeated group:
        Try the first branch. Can match 'a'.
        Third iteration of the repeated group:
            Try the first branch. Can't match 'a'.
            Try the second branch. Can't match 'ab'.
        Continue with the remainder of the pattern. Can match 'c'.
Reached the end of the pattern. It has matched 'abac'.
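That trace can be checked directly: without the trailing 'c' the repetition commits to the branch-'a' path and stops after one iteration, while the 'c' forces a backtrack onto the 'ab' branch. A sketch:

```python
import re

# The repetition alone commits to branch 'a' at position 0; neither
# branch matches at the following 'b', so matching stops after 'a'.
unanchored = re.match('(a|ab)*', 'abac').group(0)

# The trailing 'c' makes that first attempt fail, forcing a backtrack
# onto the 'ab' branch; the engine then matches 'ab', 'a', 'c'.
anchored = re.match('(a|ab)*c', 'abac').group(0)
```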