[issue21130] equivalent functools.partial instances should compare equal
Raymond Hettinger added the comment: I guess a partial is a binding, not a function. Viewed in that light, it makes sense to implement the same logic as used for bound methods in method_richcompare. -- keywords: +easy stage: test needed -> needs patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue21130 ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
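The bound-method-style comparison Raymond describes could be sketched as a subclass (hypothetical -- functools.partial itself does not define value-based equality; ComparablePartial is an illustrative name): two partials compare equal when they bind the same callable with the same positional and keyword arguments, just as two bound methods compare equal when their __self__ and __func__ match.

```python
import functools

class ComparablePartial(functools.partial):
    """Hypothetical sketch: compare partials the way
    method_richcompare compares bound methods -- same
    callable plus same bound arguments means equal."""

    def __eq__(self, other):
        if not isinstance(other, functools.partial):
            return NotImplemented
        return (self.func == other.func
                and self.args == other.args
                and self.keywords == other.keywords)

p1 = ComparablePartial(int, "ff", base=16)
p2 = ComparablePartial(int, "ff", base=16)
assert p1 == p2                                  # same binding
assert p1 != ComparablePartial(int, "ff", base=8)  # different keywords
```

Note that, as with any class defining __eq__, Python 3 implicitly sets __hash__ to None here, which matches the "binding" view: two equal partials are interchangeable but not automatically usable as dict keys.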
[issue21111] PyLong_AsUnsignedLongAndOverflow does not exist
Hristo Venev added the comment: I will not add PyLong_AsUnsigned*AndOverflow in my code because I don't want my code to depend on the exact implementation of PyLong. Are you seriously calling a 50-line function a feature? Anyway... I propose splitting the patch in two parts:
- cleanup: the changes to Objects/longobject.c
- feature: the changes to Include/longobject.h
The cleanup can be applied to 3.4.1 because it adds no new features and helps maintainability; backwards compatibility will not be broken. The feature can then be added in 3.5; backwards compatibility should not be broken.
[issue21111] PyLong_AsUnsignedLongAndOverflow does not exist
Mark Dickinson added the comment: Thanks for the patch! I left a couple of comments on Rietveld. The patch will also need docs and tests before it's ready to go in. There's a behaviour change in the patch which needs looking into: before the patch, PyLong_AsUnsignedLong will not call the target object's __int__ method (so it should raise a TypeError for a float or Decimal instance, for example). It looks to me as though the patch changes that; was that change intentional?
[issue21144] ensurepip AssertionError: Multiple .dist-info directories
Changes by Samuel John pyt...@samueljohn.de: -- nosy: +samueljohn
[issue20383] Add a keyword-only spec argument to types.ModuleType
Nick Coghlan added the comment: No, the attribute level arguments won't go away - __name__ deliberately differs from __spec__.name in some cases (notably in __main__), __path__ may be manipulated after the module is loaded, and __name__ and __file__ are both used too heavily within module code for it to be worth the hassle of deprecating them in favour of something else. I think Brett's push to simplify things as much as possible is good though - that's the main brake on creeping API complexity in the overall import system as we try to make the internals easier to comprehend and manipulate.
[issue21145] Add the @cached_property decorator
Omer Katz added the comment: The default implementation should be simple: once the property is accessed for the first time, calculate the result and never calculate it again. It's what both Django and pip use. You can add an optional TTL. There aren't any other features I can come up with that are common enough. If needed, one can inherit from cached_property and do whatever is needed. Eric, why don't you think a C implementation is needed? It's a simple operation for sure, but it is meant to increase runtime efficiency.

2014-04-05 1:11 GMT+04:00 Éric Araujo rep...@bugs.python.org:

> Éric Araujo added the comment: It could make sense to add clean, working recipes to e.g. the functools documentation. The cached_property in the wiki uses a TTL; others, like Pyramid's reify decorator, make properties that ensure the fget function is called only once per instance, and there may be subtly different variants out there. I don't know if there's a universally useful variant that should be added to the stdlib right now. (I don't think a C implementation is needed.) On a related note, the Python docs about descriptors may be missing entry-level explanations, as described here: http://me.veekun.com/blog/2012/05/23/python-faq-descriptors/ -- nosy: +eric.araujo, rhettinger
[issue21145] Add the @cached_property decorator
Omer Katz added the comment: I just checked and werkzeug uses the same implementation as Django and pip.
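The pattern the thread attributes to Django, pip, and werkzeug can be sketched as a non-data descriptor that replaces itself in the instance __dict__ on first access (a minimal illustration, not any of those libraries' actual code; it leans on __set_name__, which only arrived in Python 3.6, to keep the sketch short):

```python
class cached_property:
    """Compute the value on first access, then store it in the
    instance __dict__.  Because this descriptor defines no
    __set__, the cached instance attribute shadows it on every
    later lookup, so the function runs at most once per instance."""

    def __init__(self, func):
        self.func = func
        self.__doc__ = func.__doc__

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, instance, owner=None):
        if instance is None:
            return self                    # accessed on the class
        value = self.func(instance)
        instance.__dict__[self.name] = value   # cache; bypasses descriptor
        return value

class Circle:
    def __init__(self, r):
        self.r = r
        self.calls = 0

    @cached_property
    def area(self):
        self.calls += 1
        return 3.14159 * self.r ** 2

c = Circle(2)
assert c.area == c.area   # second access hits the cache
assert c.calls == 1       # the function body ran only once
```

Adding the optional TTL Omer mentions would mean storing a timestamp next to the value and re-running func when it expires, which is why the thread treats the TTL-free version as the common core.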
[issue20397] distutils --record option does not validate existance of byte-compiled files
Changes by koobs koobs.free...@gmail.com: -- nosy: +koobs
[issue21109] tarfile: Traversal attack vulnerability
Lars Gustäbel added the comment: In the past, our answer to these kinds of bug reports has always been that you must not extract an archive from an untrusted source without making sure that it has no malicious contents, and that tarfile conforms to the POSIX specifications with respect to extraction of files and pathname resolution. That's why we put this prominent warning in the documentation, and I think its advice still holds. I don't think that this issue should be marked as a release blocker, because the way tarfile currently works was a conscious decision, not an accident. tarfile does what it is designed to do: it processes a sequence of instructions to store a number of files in the filesystem. So the attack described by Daniel Garcia exploits neither a bug in tarfile nor a loophole in the tar archive format. A necessary condition for this attack to work is that the attacker has to trick the user into extracting the malicious archive first. After that, tarfile interprets the contained instructions word for word, but still only within the boundaries defined by the user's privileges. I think it is obvious that it is potentially dangerous to extract tar archives we didn't create ourselves, because we actually give another person direct access to our filesystem. tarfile could mitigate some of the adverse effects, but this will not change the fact that it remains unsafe to use tarfile to a certain degree unless you use it with your own data or take reasonable precautions. Anyway, if we come to the conclusion that we want to eliminate this kind of attack, we must be aware that there is a lot more to do than that. tarfile as it is today is vulnerable to all known attacks against tar programs, and maybe even a few more that rely on its specific implementation.

1. Path traversal: The archive contains file names such as /etc/passwd or ../etc/passwd.
2. Symlink file attack: foo links to /etc/passwd. Another member named foo follows; its data overwrites the target file's data.
3. Symlink directory attack: foo links to /etc. The following member foo/passwd overwrites /etc/passwd.
4. Hardlink attack: Hardlink member foo links to /etc/passwd. tarfile creates the hardlink to /etc/passwd because it cannot find the target inside the archive and falls back to the one in the filesystem. Another file named foo follows; its data overwrites /etc/passwd's data.
5. Permission manipulation: The archive contains an executable that is placed somewhere in PATH with its setuid flag set, so that an unprivileged user is able to gain root privileges.
6. Device file attacks: The archive contains a device node foo with the same major and minor numbers as an attached device. Another member named foo follows; its data is written to the device.
7. Huge zero-file attacks: bzip2 and lzma can store huge blobs of repetitive data in tiny archives. When unpacked, this data may fill up an entire filesystem.
8. Excessive memory usage: tarfile keeps one TarInfo object per member it finds in an archive. If the archive contains several million members, this may exhaust memory.
9. Saving a huge sparse file: tarfile is unable to detect holes in sparse files and thus cannot store them efficiently. Archiving a huge sparse file can take very long and may lead to a very big archive that fills up the filesystem.

Additionally, there are more issues mentioned in the GNU tar manual: https://www.gnu.org/software/tar/manual/html_node/Security.html In conclusion, I'd like to emphasize that tarfile is a library, not a replacement for GNU tar. As a library it has a different focus: it is merely a building block for an application and has to be used with a bit of responsibility. And even if we start to implement all possible checks, I'm afraid we can never do without a warning in the documentation that reminds everyone to keep an eye on what they're doing.
[issue15955] gzip, bz2, lzma: add option to limit output size
Nadeem Vawda added the comment: I've posted a review at http://bugs.python.org/review/15955/. (For some reason, it looks like Rietveld didn't send out email notifications. But maybe it never sends a notification to the sender? Hmm.)
[issue21109] tarfile: Traversal attack vulnerability
Larry Hastings added the comment: Thank you, Lars, for your thorough reply. While I agree that this isn't a release blocker, as it was clearly designed to behave this way... it seems to me that it wouldn't take much to make the tarfile module a lot safer. Specifically:

* Don't allow creating files whose absolute path is not under the destination.
* Don't allow creating links (hard or soft) which link to a path outside of the destination.
* Don't create device nodes.

This would fix your listed attacks 1-6. The remaining attacks you cite are denial-of-service attacks; while they're undesirable, they shouldn't compromise the security of the machine. (I suppose we could even address those, adding reasonable quotas for disk space and number of files.) I doubt that would make tarfile secure. But maybe practicality beats purity?
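The checks Larry lists could be sketched as a member filter fed to extractall(). This is purely illustrative, not the tarfile API of the day: safe_members is a hypothetical helper, and the link handling is simplified (real tar link-name semantics, especially for hardlinks, are subtler than a single realpath check).

```python
import os
import tarfile

def safe_members(tar, dest):
    """Yield only members that pass the three checks above:
    paths resolve under dest, link targets resolve under dest,
    and no device nodes are created."""
    dest = os.path.realpath(dest)
    for m in tar.getmembers():
        target = os.path.realpath(os.path.join(dest, m.name))
        if os.path.commonpath([dest, target]) != dest:
            continue  # check 1: path escapes the destination
        if m.issym() or m.islnk():
            link = os.path.realpath(
                os.path.join(os.path.dirname(target), m.linkname))
            if os.path.commonpath([dest, link]) != dest:
                continue  # check 2: link points outside the destination
        if m.isdev():
            continue  # check 3: no device nodes
        yield m

# Hypothetical usage:
#     with tarfile.open("untrusted.tar") as tar:
#         tar.extractall(path=dest, members=safe_members(tar, dest))
```

Note this sketch still reads the whole member list up front (Lars's attack 8), so it only addresses the filesystem-level attacks, matching Larry's scoping of 1-6.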
[issue21111] PyLong_AsUnsignedLongAndOverflow does not exist
Hristo Venev added the comment: I did not intend to make PyLong_AsUnsignedLong* call __int__, but it looks like a good idea because PyLong_AsLong* does that, and the patch is exactly about that - removing differences between converting to signed and unsigned. Should I upload the patch with docs and with the warning fixed? Will it be applied in 3.4.1? It will be a lot easier to check the value of an int than to create a new PyLong >= 2^64 and compare the number with that. P.S.: Is there a very fast way to check if a PyLong fits in an unsigned long long with the limited API? P.P.S.: Just a random idea: would it be a good idea to rewrite PyLong to use GMP instead, as a PyVarObject of mp_limb_t's?
[issue21111] PyLong_AsUnsignedLongAndOverflow does not exist
Mark Dickinson added the comment:

> it looks like a good idea

It's not a good idea, though: you risk breaking existing third-party extension modules in surprising ways, which isn't a friendly thing to do. In any case, if anything the PyLong methods should be moving *away* from use of the nb_int slot. This has been discussed multiple times previously; one reason that we haven't made that change yet is the problem of breaking backwards compatibility.

> Should I upload the patch with docs and with the warning fixed?

Sure, that would be great if you have the time. Tests would be useful too, but one of us can fill those in (eventually). It all depends how much of a hurry you're in. :-)

> Will it be applied in 3.4.1?

No: as already explained by others, it's an enhancement, not a bugfix, so it can't be applied until 3.5.

> P.S.: Is there a very fast way to check if a PyLong fits in an unsigned long long with the limited API?

Depends on your definition of 'very fast'. There should be no need to compare with a constant: just try the conversion with PyLong_AsUnsignedLong and see if you get an OverflowError back. My guess is that that's not going to be much slower than a custom PyLong_AsUnsignedLongAndOverflow, but the only way you'll find out is by timing your particular use-case.

> P.P.S.: Just a random idea: would it be a good idea to rewrite PyLong to use GMP instead, as a PyVarObject of mp_limb_t's?

I'll let Victor answer that one. :-) In the mean time, see issue 1814.
[issue21139] Idle: change default reformat width from 70 to 72
Changes by Saimadhav Heblikar saimadhavhebli...@gmail.com: -- nosy: +sahutd
[issue15955] gzip, bz2, lzma: add option to limit output size
Serhiy Storchaka added the comment: Yes, Rietveld never sends a notification to the sender. I've received a notification.
[issue21145] Add the @cached_property decorator
Changes by R. David Murray rdmur...@bitdance.com: -- nosy: +r.david.murray
[issue20383] Add a keyword-only spec argument to types.ModuleType
Brett Cannon added the comment: I can dream about getting rid of the attributes, but I doubt it would happen any time soon, if at all. But we do need to make it easier to set __spec__ on a new module than it currently is to help promote its use.
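The awkwardness Brett refers to is that, without a spec argument on types.ModuleType, attaching a spec takes two separate steps that are easy to forget; a minimal sketch of that status quo (the module name "demo_mod" is made up for illustration):

```python
import types
from importlib.machinery import ModuleSpec

# Today's two-step dance that the proposal wants to collapse into
# one keyword-only argument: build a spec, build the module, and
# remember to wire them together by hand.
spec = ModuleSpec("demo_mod", loader=None)
mod = types.ModuleType(spec.name)
mod.__spec__ = spec   # the easily forgotten step

assert mod.__name__ == "demo_mod"
assert mod.__spec__ is spec
```

Nick's point above is why mod.__name__ stays a separate attribute rather than a pure alias of mod.__spec__.name: the two can legitimately diverge (notably in __main__).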
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Richard Oudkerk added the comment: I would guess that the problem is simply that LogisticRegression objects are not picklable. Does the problem still occur if you do not use freeze?
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Ivan K added the comment: Sorry, I'm not sure I described it correctly. By "freeze" I mean it goes frozen and stops making progress. It doesn't do anything, but the computation still loads a single core of my CPU at 100% until I kill the Python process. The same code works fine when executed outside the pool, or on different platforms (like Mac, for example); my friend ran it successfully, but for me it doesn't work. So it looks like a pretty unpredictable issue involving the OS, CPU, and Python, I'm afraid.
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Richard Oudkerk added the comment: Ah, I misunderstood: you meant that it freezes/hangs, not that you used a freeze tool.
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Ivan K added the comment: Yes, I'm not using any tool. The code is just not working.
[issue20998] fullmatch isn't matching correctly under re.IGNORECASE
Gareth Gouldstone added the comment: fullmatch() is not yet implemented on the regex scanner object SRE_Scanner (issue 21002). Is it possible to adapt this patch to fix this omission? -- nosy: +Gareth.Gouldstone
[issue1191964] asynchronous Subprocess
Josiah Carlson added the comment: Should have uploaded this yesterday, but I got caught up with typical weekend activities. The docs probably need more, and I hear that select.select() is disliked, so that will probably need to be changed too. -- Added file: http://bugs.python.org/file34743/subprocess.patch
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Richard Oudkerk added the comment: Could you try pickling and unpickling the result of func():

    import cPickle
    data = cPickle.dumps(func([1,2,3]), -1)
    print cPickle.loads(data)
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Ivan K added the comment: Sorry, could you specify what 'func' is?
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Ivan K added the comment: Sorry, stupid question. I forgot that this is from my gist.
[issue21162] code in multiprocessing.pool freeze if inside some code from scikit-learn (and probably liblinear) executed on ubuntu 12.04 64 Bit
Ivan K added the comment: Yes, it works fine and the output was: LogisticRegression(C=1.0, class_weight=None, dual=False, fit_intercept=True, intercept_scaling=1, penalty='l2', random_state=None, tol=0.0001)
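Richard's cPickle snippet, translated to Python 3 and wrapped as a small helper (roundtrips is a hypothetical name), is a general way to probe whether a result object survives the pickle round-trip that multiprocessing.pool uses to ship values between processes -- the failure mode he was ruling out here:

```python
import pickle

def roundtrips(obj):
    """Return True if obj survives the dumps/loads round-trip
    that multiprocessing uses to transfer results between
    processes, False if pickling or unpickling fails."""
    try:
        pickle.loads(pickle.dumps(obj, -1))
        return True
    except Exception:
        return False

assert roundtrips({"scores": [0.9, 0.8]})   # plain data pickles fine
assert not roundtrips(lambda x: x)          # lambdas do not pickle
```

When roundtrips() succeeds, as it did for Ivan's LogisticRegression result, the hang is unlikely to be a pickling problem and the search moves to lower layers (here, the native liblinear code inside the forked worker).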
[issue12014] str.format parses replacement field incorrectly
Changes by Benjamin Peterson benja...@python.org: -- resolution: -> fixed status: open -> closed
[issue21118] str.translate is absurdly slow in majority of use cases (takes up to 60x longer than similar functions)
Josh Rosenberg added the comment: Thanks for addressing this so fast. Annoyingly, I suspect it will not help the original case that led me to finding the slowdown (I had some code that was translating from 56 Latin-1 Romance characters with diacritics to the equivalent ASCII characters, so it was 1-1, but it was always mapping from non-ASCII to ASCII, and therefore won't benefit from a change that only caches code points 0-127). That said, every *other* time I've used str.translate has been in an ASCII-to-ASCII scenario, where this *will* help. So again, thank you.
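The kind of table Josh describes looks like this with str.maketrans (an illustrative subset, not his actual 56-character mapping; DEACCENT is a made-up name). Because every source character is above code point 127, each lookup misses the ASCII-only cache he mentions:

```python
# A slice of a Latin-1 diacritics-to-ASCII table. str.maketrans
# converts the single-character keys to their code points, so
# translate() maps e.g. ord('é') -> 'e'.
DEACCENT = str.maketrans({
    "á": "a", "à": "a", "â": "a", "ã": "a",
    "é": "e", "è": "e", "ê": "e",
    "í": "i", "ó": "o", "ô": "o", "ú": "u",
    "ç": "c", "ñ": "n",
})

assert "São Tomé".translate(DEACCENT) == "Sao Tome"
```

Characters without an entry (here the ASCII letters and the space) pass through unchanged, which is what makes translate() attractive for this 1-1 de-accenting job despite the performance caveat above.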
[issue21119] asyncio create_connection resource warning
Lars Andersson added the comment: Thanks Victor, that fixes my problem. I've started using tulip/master as part of my project, as that also solves other issues I have with the default asyncio of Python 3.4.0, but hopefully this fix will make it into tulip/master as well as Python 3.4.1 / 3.5. -- resolution: -> works for me
[issue21164] print unicode in Windows console
New submission from Leslie Klein: The console encodes a str (using sys.getdefaultencoding()) and then decodes the bytes to glyphs for rendering. The decoder used is 'cp437'. Apparently, there is no way to override that! See the IPython notebook for a summary and example of the issue: nbviewer.ipython.org/gist/LeslieK/10013426 -- components: Windows messages: 215675 nosy: LeslieK priority: normal severity: normal status: open title: print unicode in Windows console type: behavior versions: Python 3.3
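One common workaround for this class of problem is to re-wrap the output stream with errors="replace", so code points the console code page cannot represent degrade to '?' instead of raising UnicodeEncodeError (a hedged sketch: lenient is a hypothetical helper, and setting PYTHONIOENCODING before starting Python is the environment-variable route to a similar effect):

```python
import io
import sys

def lenient(stream, encoding=None):
    """Wrap a binary stream so unencodable characters are replaced
    with '?' instead of raising UnicodeEncodeError."""
    return io.TextIOWrapper(stream,
                            encoding=encoding or "utf-8",
                            errors="replace",
                            line_buffering=True)

# On a real console this would be:
#     sys.stdout = lenient(sys.stdout.buffer, sys.stdout.encoding)
# Demonstrated against an in-memory buffer using the cp437 code
# page the report mentions:
buf = io.BytesIO()
out = lenient(buf, "cp437")
out.write("snowman: \u2603\n")   # U+2603 is not in cp437
out.flush()
assert buf.getvalue() == b"snowman: ?\n"
```

This does not change which glyphs the console can render -- that is fixed by the code page, as the report says -- it only keeps print() from failing on the characters the code page lacks.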