Re: Is there a function that applies list of functions to a value?
On Friday, August 30, 2013 4:09:45 AM UTC+2, Steven D'Aprano wrote: On Thu, 29 Aug 2013 13:50:39 -0700, fp2161 wrote: My way is so obvious that it may not be that interesting...

def func4(f1, f2, f3, f4):
    def anon(x):
        return f1(f2(f3(f4(x))))
    return anon

I don't think obvious is quite the right description. Well, perhaps obviously wrong :-) You also need to define func1 (trivial), func2, func3, func5, func6, func7, func8, ..., func2147483647, plus another master function to choose between them, depending on the number of functions provided as argument. I assume that the maximum number of arguments given is 2**31-1. Python may not actually have that limitation, in which case you would need to define additional functions. Or... you would have to come up with an implementation which doesn't hard-code the number of functions used. Steven

I got the generalisation criticism before yours, and generalised it accordingly. Unfortunately it was wrong, essentially because it was so obvious that Josh English posted essentially the same one before me... -- http://mail.python.org/mailman/listinfo/python-list
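An implementation that doesn't hard-code the number of functions, as Steven suggests, can be sketched with functools.reduce (this is illustrative, not any poster's actual code):

```python
from functools import reduce

def compose(*funcs):
    """Return a function applying *funcs* right-to-left: compose(f, g)(x) == f(g(x))."""
    def composed(x):
        # Walk the functions from rightmost to leftmost, feeding each
        # result into the next function.
        return reduce(lambda value, func: func(value), reversed(funcs), x)
    return composed

# compose applies the rightmost function first: double, then add 3
double_then_add = compose(lambda x: x + 3, lambda x: x * 2)
print(double_then_add(1))  # 1 doubled is 2, plus 3 -> 5
```

With no arguments at all, compose() simply returns its input unchanged, since reduce falls back to the initial value.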
Re: Is there a function that applies list of functions to a value?
On Thursday, August 29, 2013 11:35:39 PM UTC+2, Chris Angelico wrote: On Fri, Aug 30, 2013 at 7:27 AM, fp2...@gmail.com wrote: Chris, call me a snob, but I resent using lambdas (aren't they usually considered odd/bad practice in python?) They're not bad practice; all they are is a function without a name, that's restricted to returning a single expression. So they're perfectly suited to this task, and ill-suited to some others. Like everything, lambda's a tool that can be used or misused. ChrisA For this purpose however, I suspect that a named function with a proper docstring that can be imported and reused over and over again is probably more appropriate than a lambda (the first post is telling us about something happening from time to time...) -- http://mail.python.org/mailman/listinfo/python-list
Re: Is there a function that applies list of functions to a value?
On Wednesday, August 28, 2013 8:50:53 PM UTC+2, Josh English wrote: Reduce tricks are nice, but I prefer clarity sometimes:

def double(x): return x*2
def add3(x): return x+3

def compose(*funcs):
    for func in funcs:
        if not callable(func):
            raise ValueError('Must pass callable functions')
    def inner(value):
        for func in funcs:
            value = func(value)
        return value
    return inner

add_then_double = compose(add3, double)
double_then_add = compose(double, add3)
print add_then_double(1)  # prints 8
print double_then_add(1)  # prints 5

This is my favourite design: simple, clear, straightforward, very pythonic imho. So great that I actually did not notice it, and wrote it again after you! Imho still, the ValueError you are raising is not that important in this context, it would raise an Error anyway. -- http://mail.python.org/mailman/listinfo/python-list
Re: Is there a function that applies list of functions to a value?
On 30/08/2013 4:14 PM, fp2...@gmail.com wrote: For this purpose however, I suspect that a named function with a proper docstring that can be imported and reused over and over again is probably more appropriate than a lambda Given that in Chris' example the lambda was returned from a factory, importing the inner function by name is never going to be a concern. It's also possible to assign to both __name__ and __doc__ for a lambda, which can be useful at times. -- http://mail.python.org/mailman/listinfo/python-list
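alex23's point about lambdas can be demonstrated directly; `__name__` and `__doc__` are plain writable attributes on the function object:

```python
# A lambda starts out named '<lambda>' with no docstring,
# but both attributes can simply be assigned to.
double = lambda x: x * 2
double.__name__ = 'double'
double.__doc__ = 'Return x multiplied by two.'

print(double.__name__)  # double
print(double.__doc__)   # Return x multiplied by two.
print(double(21))       # 42
```

After the assignments, help(double) and tracebacks show the chosen name, just as they would for a def'd function.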
Re: Is there a function that applies list of functions to a value?
On 30/08/2013 4:17 PM, fp2...@gmail.com wrote: On Wednesday, August 28, 2013 8:50:53 PM UTC+2, Josh English wrote:

def compose(*funcs):
    for func in funcs:
        if not callable(func):
            raise ValueError('Must pass callable functions')

Imho still, the ValueError you are raising is not that important in this context, it would raise an Error anyway.

The main advantage of Josh's approach is that it fails at the point of composition, not when the composed function is first used. It'd be even more useful if it aggregated a list of the failing functions and returned their names as part of the error. Personally, I'd go with an assertion:

assert all(map(callable, funcs)), "Must pass callable functions"

I find that it makes it more obvious that this is part of the function contract rather than the actual body. -- http://mail.python.org/mailman/listinfo/python-list
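The aggregation alex23 suggests might look like this (a sketch, not Josh's actual code):

```python
def compose(*funcs):
    # Collect every non-callable argument so the error names
    # all offenders at once, not just the first one found.
    bad = [repr(f) for f in funcs if not callable(f)]
    if bad:
        raise ValueError('Not callable: ' + ', '.join(bad))
    def inner(value):
        for func in funcs:
            value = func(value)
        return value
    return inner

# Fails immediately, at composition time, naming both bad arguments
try:
    compose(abs, 42, 'spam')
except ValueError as e:
    print(e)  # Not callable: 42, 'spam'
```

The caller learns everything that is wrong with the call in a single round trip, rather than fixing one argument only to trip over the next.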
Re: semicolon at end of python's statements
Op 30-08-13 06:55, Ben Finney schreef: Ben Finney ben+pyt...@benfinney.id.au writes: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. for foo in (spam for spam in sequence if predicate(spam)): … Better: for foo in filter(predicate, sequence): process(foo) Well, better in what way? You now have to translate a predicate expression into a predicate function. Which AFAIU was one of the reasons to move away from map/filter to list comprehension. As I understand it, Python made a move away from map and filter towards list comprehension. Chris seems to want some of the possibilities that came with that incorporated into the for statement. And your suggestion is to go back to the old kind of filter way. -- Antoon Pardon -- http://mail.python.org/mailman/listinfo/python-list
Re: semicolon at end of python's statements
On Fri, Aug 30, 2013 at 5:15 PM, Antoon Pardon antoon.par...@rece.vub.ac.be wrote: Op 30-08-13 06:55, Ben Finney schreef: Ben Finney ben+pyt...@benfinney.id.au writes: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. for foo in (spam for spam in sequence if predicate(spam)): … Better: for foo in filter(predicate, sequence): process(foo) Well better in what way? You now have to translate a predicate expression into a predicate function. Which AFAIU was one of the reasons to move away from map/filter to list comprehension. As I understand it, python made a move away from map and filter towards list comprehension. Chris seems to want some of the possibilities that came with that incorporated into the for statement. And your suggestion is to go back to the old kind of filter way. No, actually Ben's quite right - assuming the predicate is a simple function, of course (Python's lambda notation is a bit clunky for comparisons); as of Python 3, filter() is lazy and is pretty much what I'm doing here. However, that's still a specific answer to a specific (albeit common) instance of wanting to merge control structures. ChrisA -- http://mail.python.org/mailman/listinfo/python-list
Re: semicolon at end of python's statements
Op 30-08-13 09:25, Chris Angelico schreef: On Fri, Aug 30, 2013 at 5:15 PM, Antoon Pardon antoon.par...@rece.vub.ac.be wrote: Op 30-08-13 06:55, Ben Finney schreef: Ben Finney ben+pyt...@benfinney.id.au writes: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. for foo in (spam for spam in sequence if predicate(spam)): … Better: for foo in filter(predicate, sequence): process(foo) Well, better in what way? You now have to translate a predicate expression into a predicate function. Which AFAIU was one of the reasons to move away from map/filter to list comprehension. As I understand it, Python made a move away from map and filter towards list comprehension. Chris seems to want some of the possibilities that came with that incorporated into the for statement. And your suggestion is to go back to the old kind of filter way. No, actually Ben's quite right - assuming the predicate is a simple function,

But why should we assume that? Suppose I would like to process all odd items in a list. A comprehension kind of notation would be

| for item in lst if item % 2:
|     process items

Which we would have to turn into

| for item in filter(lambda nr: nr % 2, lst):
|     process items

But AFAIR, one of the driving forces behind the introduction of list comprehension, and thus a move away from map and filter, was to get rid of the lambdas in this kind of situation.

of course (Python's lambda notation is a bit clunky for comparisons); as of Python 3, filter() is lazy and is pretty much what I'm doing here.

Lazy or not is, AFAICS, not a point here. -- Antoon Pardon -- http://mail.python.org/mailman/listinfo/python-list
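For a concrete comparison, the two spellings Antoon contrasts behave identically on a small example (the list and the predicate here are made up):

```python
lst = [1, 2, 3, 4, 5]

# Generator-expression form: the predicate stays an inline expression
odds_genexp = [item for item in (n for n in lst if n % 2)]

# filter() form: the predicate must be wrapped in a lambda (or a def)
odds_filter = list(filter(lambda n: n % 2, lst))

print(odds_genexp)  # [1, 3, 5]
print(odds_filter)  # [1, 3, 5]
```

The results are the same; the disagreement in the thread is purely about which spelling reads better and whether a lambda is an acceptable cost.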
Re: Is there a function that applies list of functions to a value?
On Friday, August 30, 2013 8:23:44 AM UTC+2, alex23 wrote: On 30/08/2013 4:14 PM, fp2...@gmail.com wrote: For this purpose however, I suspect that a named function with a proper docstring that can be imported and reused over and over again is probably more appropriate than a lambda Given that in Chris' example the lambda was returned from a factory, importing the inner function by name is never going to be a concern. It's also possible to assign to both __name__ and __doc__ for a lambda, which can be useful at times. I am sorry, I can't see any of Chris' code in this thread, where is it??? -- http://mail.python.org/mailman/listinfo/python-list
Re: Is there a function that applies list of functions to a value?
On Friday, August 30, 2013 8:36:40 AM UTC+2, alex23 wrote: On 30/08/2013 4:17 PM, fp2...@gmail.com wrote: On Wednesday, August 28, 2013 8:50:53 PM UTC+2, Josh English wrote: def compose(*funcs): for func in funcs: if not callable(func): raise ValueError('Must pass callable functions') Imho still, the ValueError you are raising is not that important in this context, it would raise an Error anyway. The main advantage with Josh's approach is that it fails at the point of composition, not when the composed function is first used. It'd be even more useful if it aggregated a list of the failing functions and returned their names as part of the error. Personally, I'd go with an assertion: assert all(map(callable, funcs)), "Must pass callable functions" I find that it makes it more obvious that this is part of the function contract rather than the actual body. It is a valid point, but I would contend that it makes this quick and easy code a little bit heavy just for the sake of ensuring that you are composing composable functions... The assertion is definitely better. -- http://mail.python.org/mailman/listinfo/python-list
Re: semicolon at end of python's statements
On 29 Aug 2013 23:20, Ben Finney ben+pyt...@benfinney.id.au wrote: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. So you use it in a generator expression, and iterate over the generator: for foo in (spam for spam in sequence if predicate(spam)): process(spam) That way, there's no need for new syntax. The problem I have with that strategy is that it is repetitive and hinders readability. You wrote for and in twice, and spam (a pretty useless intermediate variable) thrice! While it does its job, it hides the true intent for filtering beneath a lot of (pun intended) spam. The if particle is nigh undetectable there. To get around this, I often declare a generator. But I still find it a bit awkward to have to look up the definition elsewhere, and to waste lines over something so simple. I can't say I understand why we don't merge the for loops' syntax with the comprehension syntax. Even after following the for..while discussion. -- http://mail.python.org/mailman/listinfo/python-list
Re: Interface and duck typing woes
In article 52200699$0$6599$c3e8da3$54964...@news.astraweb.com, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: These days, it would be relatively simple to implement pre- and post- condition checking using decorators, and indeed one of the motivating use- cases for function annotations in Python 3 is to allow such things. http://www.python.org/dev/peps/pep-3107/ (Function annotations are perhaps the best Python feature that nobody uses.) This is awesome. -- http://mail.python.org/mailman/listinfo/python-list
python script to gather file links into textfile
Hi, I'm looking for someone who can make a script that gathers all the file links from a URL into a text file, like this: http://pastebin.com/jfD31r1x -- http://mail.python.org/mailman/listinfo/python-list
Re: semicolon at end of python's statements
In article mailman.385.1377858745.19984.python-l...@python.org, Fábio Santos fabiosantos...@gmail.com wrote: On 29 Aug 2013 23:20, Ben Finney ben+pyt...@benfinney.id.au wrote: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. So you use it in a generator expression, and iterate over the generator: for foo in (spam for spam in sequence if predicate(spam)): process(spam) That way, there's no need for new syntax. The problem I have with that strategy is that it is repetitive and hinders readability. You wrote for and in twice, and spam (a pretty useless intermediate variable) thrice! While it does its job, it hides the true intent for filtering beneath a lot of (pun intended) spam. The if particle is nigh undetectable there. To get around this, I often declare a generator. But I still find it a bit awkward to have to look up the definition elsewhere, and to waste lines over something so simple. I can't say I understand why we don't merge the for loops' syntax with the comprehension syntax. Even after following the for..while discussion. +1 on loop comprehensions, for all the reasons Fábio states. -- http://mail.python.org/mailman/listinfo/python-list
Re: How to keep cookies when making http requests (Python 2.7)
Thanks Dieter, With respect to cookie handling, you do everything right. There may be other problems with the (wider) process. Analysing the responses of your requests (reading the status codes, the response headers and the response bodies) may provide hints towards the problem. I will try to do that and try to see if I can figure out why. Do I misunderstand something in the process? Not with respect to cookie handling. -- http://mail.python.org/mailman/listinfo/python-list
Re: python script to gather file links into textfile
On 2013-08-30, david.d...@gmail.com david.d...@gmail.com wrote: Hi, im looking for someone who can make a script that gathers all file links from an url into a textfile, like this : http://pastebin.com/jfD31r1x Michael Jackson advises you to start with the man in the mirror. -- Neil Cerutti -- http://mail.python.org/mailman/listinfo/python-list
Re: python script to gather file links into textfile
Le 30/08/2013 15:01, Neil Cerutti a écrit : On 2013-08-30, david.d...@gmail.com david.d...@gmail.com wrote: Hi, im looking for someone who can make a script that gathers all file links from an url into a textfile, like this : http://pastebin.com/jfD31r1x 1. Read the file with urls http://stackoverflow.com/questions/3925614/how-do-you-read-a-file-into-a-list-in-python 2. Download each url http://stackoverflow.com/questions/22676/how-do-i-download-a-file-over-http-using-python Michael Jackson advises you to start with the man in the mirror. :D -- http://mail.python.org/mailman/listinfo/python-list
Re: Question about XMLRPC
On 29/8/2013 6:30 pm, Ferrous Cranus wrote: On 29/8/2013 3:35 pm, Ferrous Cranus wrote: On 29/8/2013 2:54 pm, Gregory Ewing wrote: i.she...@gmail.com wrote: I should write a python script(s) that listens to an existing XMLRPC service on my company's dev server. then i should parse that and return to the existing XML-RPC, or write the parsed data to the Posgresql database. but i'm not permitted to edit the existing XMLRPC service. It's not clear exactly what you mean by this. Are you trying to intercept XMLRPC requests sent to an existing service and do something different with them? To do that without modifying the existing service, you would need to change your web server's configuration to redirect the url of the service to a handler of your own. Are you able to do that, or get your administrators to do it? test test test -- Webhost http://superhost.gr -- http://mail.python.org/mailman/listinfo/python-list
Our page on Facebook: subscribe now for news, entertainment, and pictures
Our page on Facebook: subscribe now for news, entertainment, and pictures https://www.facebook.com/pages/%D9%86%D8%AA%D8%A7%D8%A6%D8%AC-%D8%A7%D9%84%D8%A7%D9%85%D8%AA%D8%AD%D8%A7%D9%86%D8%A7%D8%AA-%D9%88%D8%A7%D9%84%D8%AC%D8%A7%D9%85%D8%B9%D8%A7%D8%AA-%D9%88%D8%A7%D8%AC%D8%AA%D9%85%D8%A7%D8%B9%D9%8A%D8%A7%D8%AA/299719160065550?ref=hl -- http://mail.python.org/mailman/listinfo/python-list
Re: semicolon at end of python's statements
Op 30-08-13 12:53, Roy Smith schreef: In article mailman.385.1377858745.19984.python-l...@python.org, Fábio Santos fabiosantos...@gmail.com wrote: On 29 Aug 2013 23:20, Ben Finney ben+pyt...@benfinney.id.au wrote: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. So you use it in a generator expression, and iterate over the generator: for foo in (spam for spam in sequence if predicate(spam)): process(spam) That way, there's no need for new syntax. The problem I have with that strategy is that it is repetitive and hinders readability. You wrote for and in twice, and spam (a pretty useless intermediate variable) thrice! While it does its job, it hides the true intent for filtering beneath a lot of (pun intended) spam. The if particle is nigh undetectable there. To get around this, I often declare a generator. But I still find it a bit awkward to have to look up the definition elsewhere, and to waste lines over something so simple. I can't say I understand why we don't merge the for loops' syntax with the comprehension syntax. Even after following the for..while discussion. +1 on loop comprehensions, for all the reasons Fábios states.

Maybe python should just allow more than one control structure on one line and then consider the end of the suite the end of both controls. In that case we could just write the following:

for a in lst: if a % 2:
    treat a
    process a some more
after loop things

It technically is not loop comprehension but just nested controls, but by allowing them on the same line it looks close enough to loop comprehension. Plus, if people have a problem that would best be solved by a while loop comprehension, that would be just as easy. -- Antoon Pardon -- http://mail.python.org/mailman/listinfo/python-list
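Antoon's proposed syntax isn't valid Python today, but for comparison, here is the same logic written both as nested statements and as the single-line comprehension Python already allows (the values are illustrative):

```python
lst = list(range(6))

# Today's spelling: two statements, two indentation levels
results = []
for a in lst:
    if a % 2:
        results.append(a * 10)

# The closest single-line equivalent available now
results2 = [a * 10 for a in lst if a % 2]

print(results)   # [10, 30, 50]
print(results2)  # [10, 30, 50]
```

The comprehension only covers the collect-into-a-list case; the proposal is about getting the same compactness for arbitrary loop bodies.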
Re: Lettuce vs Behave
On Friday, August 16, 2013 1:15:01 PM UTC-4, cutems93 wrote: I found that BDD is a very good philosophy for coding and checking my program, and I decided to use either of these two software. However, it seems these two are very similar in the way they function. As professionals, what do you prefer and why? +1 for Behave -J -- http://mail.python.org/mailman/listinfo/python-list
Re: subprocess.Popen instance hangs
On Thu, 29 Aug 2013 17:00:21 -0800, Tim Johnson wrote:

## This appears to be what works.
def __exec(self, args):
    """Run the process with arguments"""
    p = subprocess.Popen(args, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
    while 1:
        output = p.stdout.read()

If the process tries to write more than a pipe's worth of data to stderr, before closing stdout, it will block indefinitely. If you want to process both stdout and stderr, you have to be able to consume the data in whatever order the process generates it, which means either using multiple threads or (on Unix) select/poll or non-blocking I/O. This is what the .communicate() method does (threads on Windows, select/poll on Unix). The alternative is to merge both streams with stderr=subprocess.STDOUT, or redirect one of them to a file (or /dev/null, etc). -- http://mail.python.org/mailman/listinfo/python-list
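Both safe alternatives described above can be sketched like this (the child command is illustrative only, chosen so the example is self-contained):

```python
import subprocess
import sys

# A child process that writes to both streams
child = [sys.executable, '-c',
         "import sys; sys.stdout.write('out'); sys.stderr.write('err')"]

# Option 1: communicate() drains both pipes concurrently, so neither
# can fill up and block the child.
p = subprocess.Popen(child, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print(out, err)  # b'out' b'err'

# Option 2: merge stderr into stdout, leaving only one pipe to read.
# The two streams are interleaved, so their relative order isn't guaranteed.
p = subprocess.Popen(child, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
merged, _ = p.communicate()
print(b'out' in merged and b'err' in merged)  # True
```

Option 1 keeps the streams separate; option 2 loses that separation but can never deadlock on the second pipe.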
Re: python script to gather file links into textfile
On 2013-08-30, david.d...@gmail.com david.d...@gmail.com wrote: Hi, im looking for someone who can make a script that gathers all file links from an url into a textfile, like this : http://pastebin.com/jfD31r1x Sometimes its good to look in the closets or under the beds. People forget what they put in those places sometimes. -- Joel Goldstick http://joelgoldstick.com -- http://mail.python.org/mailman/listinfo/python-list
Python Weekend Challenge - $$
https://gist.github.com/mjhea0/6390724 Check it out! :) Have a great Labor Day weekend. -- http://mail.python.org/mailman/listinfo/python-list
Re: python script to gather file links into textfile
David, actually, what is the url? I read about you a little and it looks like you are into music but not a software guy. You might be better off finding someone local to write the code for you. People here generally help people with actual coding problems, or discuss aspects of Python, but if you want to hire someone you will need to give more specific details of what you want done. On Fri, Aug 30, 2013 at 10:56 AM, Joel Goldstick joel.goldst...@gmail.com wrote: On 2013-08-30, david.d...@gmail.com david.d...@gmail.com wrote: Hi, im looking for someone who can make a script that gathers all file links from an url into a textfile, like this : http://pastebin.com/jfD31r1x Sometimes its good to look in the closets or under the beds. People forget what they put in those places sometimes. -- Joel Goldstick http://joelgoldstick.com -- Joel Goldstick http://joelgoldstick.com -- http://mail.python.org/mailman/listinfo/python-list
Re: subprocess.Popen instance hangs
On Fri, Aug 30, 2013 at 11:32 AM, Tim Johnson t...@akwebsoft.com wrote: The objective is to display all output, but to also separate error messages from normal output.

I still think you want to use communicate(). Like this:

p = subprocess.Popen(args, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
output, err = p.communicate()

That's it. No need for a loop, or manually handling the fact that stderr and/or stdout could end up with a full buffer and start to block. -- Jerry -- http://mail.python.org/mailman/listinfo/python-list
web2py - running on fedora
Hi. I know this is a python list, but hoping that I can find someone to help get a base install of web2py running. I've got an older version of fedora, running py 2.6.4 running apache v2.2 I'm simply trying to get web2py up/running, and then to interface it with apache, so that the existing webapps (php apps) don't get screwed up. Thanks -bruce -- http://mail.python.org/mailman/listinfo/python-list
sax.handler.Contenthandler.__init__
This code is from The Python Cookbook, 2nd edition, 12.2 Counting Tags in a Document:

from xml.sax.handler import ContentHandler
import xml.sax

class countHandler(ContentHandler):
    def __init__(self):
        self.tags = {}

    def startElement(self, name, attr):
        self.tags[name] = 1 + self.tags.get(name, 0)

Isn't overriding __init__ a risky thing to do? The docs don't mention it as a method I should override, and also don't define what's in there or whether I'd need to call the base class __init__. Moreover, startDocument is provided for parser setup. As it happens, ContentHandler.__init__ isn't empty, so the above code could fail if the parser isn't prepared for _locator to be undefined. Is the above code an acceptable idiom? -- Neil Cerutti -- http://mail.python.org/mailman/listinfo/python-list
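For comparison, a version of the recipe that does call the base class __init__ might look like this (shown in Python 3 syntax with a bytes input; the Cookbook code itself is Python 2):

```python
import xml.sax
from xml.sax.handler import ContentHandler

class CountHandler(ContentHandler):
    def __init__(self):
        # Run the base class setup first; it initialises self._locator,
        # which the handler machinery may rely on.
        ContentHandler.__init__(self)
        self.tags = {}

    def startElement(self, name, attr):
        self.tags[name] = 1 + self.tags.get(name, 0)

handler = CountHandler()
xml.sax.parseString(b"<root><item/><item/></root>", handler)
print(handler.tags)  # {'root': 1, 'item': 2}
```

Whether the extra line is worth it is exactly the question being asked; this variant simply shows what the defensive form costs.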
Re: web2py - running on fedora
On Fri, Aug 30, 2013 at 1:13 PM, bruce badoug...@gmail.com wrote: Hi. I know this is a python list, but hoping that I can find someone to help get a base install of web2py running. I've got an older version of fedora, running py 2.6.4 running apache v2.2 I'm simply trying to get web2py up/running, and then to interface it with apache, so that the existing webapps (php apps) don't get screwed up. Thanks -bruce -- http://mail.python.org/mailman/listinfo/python-list Do you know about google? I just seached this: 'installing web2py apache' and I got whole lot of information. Why don't you try to get it running and come back with a specific question if you get stuck? -- Joel Goldstick http://joelgoldstick.com -- http://mail.python.org/mailman/listinfo/python-list
RE: sax.handler.Contenthandler.__init__
Neil Cerutti wrote: This code is from The Python Cookbook, 2nd edition, 12.2 Counting Tags in a Document:

from xml.sax.handler import ContentHandler
import xml.sax

class countHandler(ContentHandler):
    def __init__(self):
        self.tags = {}

    def startElement(self, name, attr):
        self.tags[name] = 1 + self.tags.get(name, 0)

Isn't overriding __init__ a risky thing to do? The docs don't mention it as a method I should override, and also don't define what's in there or if I'd need to call the base class __init__. Moreover, startDocument is provided for parser setup. As it happens, ContentHandler.__init__ isn't empty, so the above code could fail if the parser isn't prepared for _locator to be undefined. Is the above code is an acceptable idiom? -- Neil Cerutti --

I think this is a bad idea unless you want to avoid the parent class __init__ specifically (in which case a comment stating why is mandatory). I do not like that this recipe shows behavior that might be fine in this instance, but is not a good general practice.

def __init__(self):
    super(countHandler, self).__init__()  # OR: ContentHandler.__init__(self)
    self.tags = {}

I personally think the super() line is the better of the two options. ~Ramit -- http://mail.python.org/mailman/listinfo/python-list
Best practice for generalizing and documenting each method's behaviour
I'm starting a small project, coding in Python as I learn the ropes. As the project grows bigger, there are more and more overlapping and even redundant methods. For example, several classes have a checkAndClean_obj_state() method. With just one or two such classes, it is easy to analyze their behaviour and design the optimal interaction for all objects. However, when there are many such classes, exactly when to invoke the check-and-clean behaviour becomes a little blurred. There is a desperate need for generalizing and documenting the behaviour of each such class, preferably in a flowchart. I'm currently drawing the flowcharts manually, but the job is becoming a bit overwhelming. I wonder what Python pros use for analyzing and documenting class/function behaviours and interactions? Is UML the only way? Personally I find UML a bit overkill for a one-person project, but I'm not sure if it is the right direction. I'd appreciate any insight. Many thanks. -- http://mail.python.org/mailman/listinfo/python-list
ANN: A new version (0.3.5) of python-gnupg has been released.
A new version of the Python module which wraps GnuPG has been released.

What Changed?
=============

This is a minor enhancement and bug-fix release. See the project website ( http://code.google.com/p/python-gnupg/ ) for more information. Summary: Added improved shell quoting to guard against shell injection attacks. Added search_keys() and send_keys() methods to interact with keyservers. A symmetric cipher algorithm can now be specified when encrypting. UTF-8 encoding is used as a fall back when no other encoding can be determined. The key length now defaults to 2048 bits. A default Name-Comment field is no longer provided during key generation.

What Does It Do?
================

The gnupg module allows Python programs to make use of the functionality provided by the Gnu Privacy Guard (abbreviated GPG or GnuPG). Using this module, Python programs can encrypt and decrypt data, digitally sign documents and verify digital signatures, manage (generate, list and delete) encryption keys, using proven Public Key Infrastructure (PKI) encryption technology based on OpenPGP. This module is expected to be used with Python versions >= 2.4, as it makes use of the subprocess module which appeared in that version of Python. This module is a newer version derived from earlier work by Andrew Kuchling, Richard Jones and Steve Traugott. A test suite using unittest is included with the source distribution.

Simple usage:

>>> import gnupg
>>> gpg = gnupg.GPG(gnupghome='/path/to/keyring/directory')
>>> gpg.list_keys()
[{ ...
   'fingerprint': 'F819EE7705497D73E3CCEE65197D5DAC68F1AAB2',
   'keyid': '197D5DAC68F1AAB2',
   'length': '1024',
   'type': 'pub',
   'uids': ['', 'Gary Gross (A test user) gary.gr...@gamma.com']},
 { ...
   'fingerprint': '37F24DD4B918CC264D4F31D60C5FEFA7A921FC4A',
   'keyid': '0C5FEFA7A921FC4A',
   'length': '1024',
   ...
   'uids': ['', 'Danny Davis (A test user) danny.da...@delta.com']}]
>>> encrypted = gpg.encrypt("Hello, world!", ['0C5FEFA7A921FC4A'])
>>> str(encrypted)
'-----BEGIN PGP MESSAGE-----\nVersion: GnuPG v1.4.9 (GNU/Linux)\n\nhQIOA/6NHMDTXUwcEAf ... -----END PGP MESSAGE-----\n'
>>> decrypted = gpg.decrypt(str(encrypted), passphrase='secret')
>>> str(decrypted)
'Hello, world!'
>>> signed = gpg.sign("Goodbye, world!", passphrase='secret')
>>> verified = gpg.verify(str(signed))
>>> "Verified" if verified else "Not verified"
'Verified'

For more information, visit http://code.google.com/p/python-gnupg/ - as always, your feedback is most welcome (especially bug reports, patches and suggestions for improvement). Enjoy! Cheers Vinay Sajip Red Dove Consultants Ltd. -- http://mail.python.org/mailman/listinfo/python-list
Re: Encapsulation unpythonic?
On Saturday, August 17, 2013 2:26:32 PM UTC+2, Fernando Saldanha wrote: I am new to Python, with experience in Java, C++ and R. As I understand encapsulation is not a big thing in the Python world. I read that you can put two underscores before the name of a variable within a class declaration but in the many examples of code I looked at this is not widely used. I also read that encapsulation is unpythonic. Questions: 2) If it is in fact true that encapsulation is rarely used, how do I deal with the fact that other programmers can easily alter the values of members of my classes? Fernando, it is widely accepted that Python pays very little attention to encapsulation as a principle set in stone. Chaz's definition of encapsulation is also mine. Now you need to consider that taking this principle off the pedestal of OOP does not mean that you can do whatever you fancy, or that you can't make anything unsettable. There are plenty of techniques within Python that allow you to protect your attributes (in particular, decorators) inside a class. Now, let's get to the pretentious philosophical discussion: I guess encapsulation is quite the opposite of, say, dynamic typing, which is arguably core in Python. In practice this allows Python to be less verbose: at the end of the day, if you look back at your previous languages, don't you find that some of their compulsory features are usually more of a pain than something useful in practice? And after all, whither encapsulation? Can't we just have objects whose attributes are determined externally if we want to? And that is the ballgame. As my old tutor says: the claptrap of setters and getters does not need to be there if it is unnecessary. I would add: so long as you can have them when you deem it necessary, and Python allows that. -- http://mail.python.org/mailman/listinfo/python-list
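As a concrete illustration of protecting attributes without Java-style getters and setters, here is a sketch using the property decorator (all the names are made up):

```python
class Account(object):
    """Toy class with a validated attribute."""

    def __init__(self, balance):
        self._balance = balance  # single underscore: internal by convention

    @property
    def balance(self):
        return self._balance

    @balance.setter
    def balance(self, value):
        # The setter runs on every assignment, so bad values can be rejected
        # even though callers use plain attribute syntax.
        if value < 0:
            raise ValueError("balance cannot be negative")
        self._balance = value

acct = Account(100)
acct.balance = 50     # looks like a bare attribute write, but the setter runs
print(acct.balance)   # 50
```

This is the usual Python answer to the question above: start with plain attributes, and bolt on validation later via property without changing any calling code.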
Re: subprocess.Popen instance hangs
* Nobody nob...@nowhere.com [130830 06:55]: On Thu, 29 Aug 2013 17:00:21 -0800, Tim Johnson wrote: ## This appears to be what works. def __exec(self,args) : Run the process with arguments p = subprocess.Popen(args,stderr=subprocess.PIPE,stdout=subprocess.PIPE) while 1 : output = p.stdout.read() If the process tries to write more than a pipe's worth of data to stderr, before closing stdout, it will block indefinitely. If you want to process both stdout and stderr, you have to be able to consume the data in whatever order the process generates it, which means either using multiple threads or (on Unix) select/poll or non-blocking I/O. This is what the .communicate() method does (threads on Windows, select/poll on Unix). The alternative is to merge both streams with stderr=subprocess.STDOUT, or redirect one of them to a file (or /dev/null, etc).

In earlier code, I was merging them... :) Like I said: gnarly! What if I were to do something like:

## code
while 1:
    output = p.stdout.read()
    err = p.stderr.read()
    ## trapping for AttributeError, etc..
## /code

break'ing if either no output or a value in `err'? The objective is to display all output, but to also separate error messages from normal output. thank you -- Tim tim at tee jay forty nine dot com or akwebsoft dot com http://www.akwebsoft.com -- http://mail.python.org/mailman/listinfo/python-list
Re: subprocess.Popen instance hangs
* Jerry Hill malaclyp...@gmail.com [130830 07:48]: On Fri, Aug 30, 2013 at 11:32 AM, Tim Johnson t...@akwebsoft.com wrote: The objective is to display all output, but to also separate error messages from normal output. I still think you want to use communicate(). Like this:

p = subprocess.Popen(args, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
output, err = p.communicate()

That's it. No need for a loop, or manually handling the fact that stderr and/or stdout could end up with a full buffer and start to block. The following code:

p = subprocess.Popen(args, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
errmsg, output = p.communicate()
...

hangs. This code:

p = subprocess.Popen(args, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
while 1:
    output = p.stdout.read()
    if output:
        print(output)
    else:
        break
...

works -- Tim tim at tee jay forty nine dot com or akwebsoft dot com http://www.akwebsoft.com -- http://mail.python.org/mailman/listinfo/python-list
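For reference, a self-contained sketch of the communicate() approach Jerry describes. The child command here is a hypothetical stand-in for Tim's actual program; note that communicate() returns the pair in the order (stdout, stderr), so unpacking it in the wrong order silently swaps the streams.

```python
import subprocess
import sys

# Hypothetical child process that writes to both streams.
child = [sys.executable, "-c",
         "import sys; print('out'); print('err', file=sys.stderr)"]

p = subprocess.Popen(child, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# communicate() drains both pipes concurrently, so neither can fill up
# and block the child; it returns (stdout, stderr) in that order.
output, err = p.communicate()
```

Checking the unpacking order is worth doing before debugging a hang elsewhere.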
Re: semicolon at end of python's statements
On Sat, Aug 31, 2013 at 12:14 AM, Antoon Pardon antoon.par...@rece.vub.ac.be wrote: Maybe python should just allow more than one control structure on one line and then considers the end of the suite the end of both controls. In that case we could just write the following: for a in lst: if a % 2: treat a process a some more after loop things. Exactly what I suggested a while ago :) This is how we got onto this subject - a discussion of how physical structure and logical structure can differ. ChrisA -- http://mail.python.org/mailman/listinfo/python-list
Re: Lettuce vs Behave
jumpmanl...@myopera.com writes: On Friday, August 16, 2013 1:15:01 PM UTC-4, cutems93 wrote: As professionals, what do you prefer and why? +1 for Behave And why? -- \ “In the long run, the utility of all non-Free software | `\ approaches zero. All non-Free software is a dead end.” —Mark | _o__)Pilgrim, 2006 | Ben Finney -- http://mail.python.org/mailman/listinfo/python-list
Gunstar - Another python web framework.
It's a recent project, check this out: http://github.com/allisson/gunstar -- http://mail.python.org/mailman/listinfo/python-list
Re: semicolon at end of python's statements
On Fri, 30 Aug 2013 11:32:17 +0100, Fábio Santos wrote: On 29 Aug 2013 23:20, Ben Finney ben+pyt...@benfinney.id.au wrote: Fábio Santos fabiosantos...@gmail.com writes: It is a shame that this is not possible in python. for..if exists in comprehensions and not in regular loops but that would be nice sometimes. So you use it in a generator expression, and iterate over the generator: for foo in (spam for spam in sequence if predicate(spam)): process(spam) That way, there's no need for new syntax. The problem I have with that strategy is that it is repetitive and hinders readability. You wrote for and in twice, and spam (a pretty useless intermediate variable) thrice! There is no need for spam to be intermediate, and the fact that it shouldn't be is demonstrated by Ben's error in referring to process(spam) instead of process(foo). We really are spoiled for choice here. We can write any of these:

# Option 1
for spam in sequence:
    if predicate(spam):
        process(spam)

# Option 2
for spam in filter(predicate, sequence):
    process(spam)

# Option 3
for spam in (spam for spam in sequence if predicate(spam)):
    process(spam)

Adding a fourth option:

for spam in sequence if predicate(spam):
    process(spam)

saves absolutely nothing except a line and an indent level, neither of which are in short supply, and gains nothing in readability over Option 1. While it does its job, it hides the true intent for filtering beneath a lot of (pun intended) spam. The if particle is nigh undetectable there. To get around this, I often declare a generator. But I still find it a bit awkward to have to look up the definition elsewhere, and to waste lines over something so simple. No need to look up a definition elsewhere, you can put the generator right next to the loop:

gen = (spam for spam in sequence if predicate(spam))
for spam in gen:
    process(spam)

But of all the options shown, including the hypothetical for...if, I still prefer Option 1, or Option 2 as my second option. 
-- Steven -- http://mail.python.org/mailman/listinfo/python-list
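The three existing options really are interchangeable, which a quick sketch can confirm (the data, predicate, and process step below are made up purely for illustration):

```python
# Made-up data and predicate, just to show the three spellings agree.
sequence = [1, 2, 3, 4, 5, 6]

def predicate(n):
    return n % 2 == 0

# Option 1: nested if inside the loop
out1 = []
for spam in sequence:
    if predicate(spam):
        out1.append(spam)

# Option 2: filter()
out2 = []
for spam in filter(predicate, sequence):
    out2.append(spam)

# Option 3: generator expression placed right next to the loop
gen = (spam for spam in sequence if predicate(spam))
out3 = []
for spam in gen:
    out3.append(spam)

assert out1 == out2 == out3 == [2, 4, 6]
```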
Re: Interface and duck typing woes
On Fri, 30 Aug 2013 06:35:47 -0400, Roy Smith wrote: In article 52200699$0$6599$c3e8da3$54964...@news.astraweb.com, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: These days, it would be relatively simple to implement pre- and post- condition checking using decorators, and indeed one of the motivating use- cases for function annotations in Python 3 is to allow such things. http://www.python.org/dev/peps/pep-3107/ (Function annotations are perhaps the best Python feature that nobody uses.) This is awesome. Heh, everybody has one of two reactions: This is awesome! You'll add type checking to my Python code over my dead body!!! But I'm still to see a practical use for annotations in real world code. Or indeed to think of a use for them other than type checking. -- Steven -- http://mail.python.org/mailman/listinfo/python-list
Re: python script to gather file links into textfile
On Fri, 30 Aug 2013 03:53:05 -0700, david.dsch wrote: Hi, im looking for someone who can make a script that gathers all file links from an url into a textfile, like this : http://pastebin.com/jfD31r1x You've come to the right place! My rate is AUD$100 an hour. Contact me if you are interested. -- Steven -- http://mail.python.org/mailman/listinfo/python-list
Re: Using PyQT with QT Designer
Lee Harr, thank you. I took your suggestion after I finished coding the audio section. You can see the improved project here: http://i.imgur.com/permuRQ.jpg On Friday, August 23, 2013 7:35:53 PM UTC-5, Lee Harr wrote: That's the problem though. It is exactly how I want it in designer. It's perfect as it is in designer when I preview it. Here is a screenshot of the preview: http://i.imgur.com/ULRolq8.png That's not a preview. That's just the regular design view. (you can tell by the little dots in the background) You need to go to Form - Preview... to see the actual preview. That said... 1.) You may want to ask your question on the PyQt mailing list. Though you are talking with the undisputed PyQt expert in Phil, there are more people on the other list who are familiar with PyQt and who may be willing to look more closely at your specific code. 2.) It may be that the examples you are looking at are not sufficient to help you with the situation you are in. For instance, I've written several programs using Designer and PyQt and I would recommend against using the pyuic method. When I first started with PyQt I also used pyuic and eventually I found the PyQt4.uic method works better for me. 3.) Layouts. You have to use them with Qt or you're going to have a bad time. Looking at your design, I would do something like ...

- select the two buttons on the left and click Lay Out Vertically
- select the two large white boxes and click Lay Out Vertically
- put a vertical spacer underneath the red X button
- select the red button and the spacer and click Lay Out Vertically
- at this point you may need to resize and rearrange your three vertical layouts so that they don't overlap and are in approximately the positions that you want, then
- select the main window and click Lay Out Horizontally

Something along those lines would get you about to where you want to be. 
The form may not look _exactly_ the way you have it there, but it will be a more flexible design and nothing will be overlapping. -- http://mail.python.org/mailman/listinfo/python-list
Re: Interface and duck typing woes
On 8/30/13 8:13 PM, Steven D'Aprano wrote: On Fri, 30 Aug 2013 06:35:47 -0400, Roy Smith wrote: In article 52200699$0$6599$c3e8da3$54964...@news.astraweb.com, Steven D'Aprano steve+comp.lang.pyt...@pearwood.info wrote: These days, it would be relatively simple to implement pre- and post- condition checking using decorators, and indeed one of the motivating use- cases for function annotations in Python 3 is to allow such things. http://www.python.org/dev/peps/pep-3107/ (Function annotations are perhaps the best Python feature that nobody uses.) This is awesome. Heh, everybody has one of two reactions: This is awesome! You'll add type checking to my Python code over my dead body!!! But I'm still to see a practical use for annotations in real world code. Or indeed to think of a use for them other than type checking. At PyCon 2007 (I think), Guido was giving a keynote about the features coming in Py3k, and he couldn't remember the name function annotations. He said, what are they called, the things that aren't type declarations. --Ned. -- http://mail.python.org/mailman/listinfo/python-list
Re: Encapsulation unpythonic?
On Fri, 30 Aug 2013 10:43:28 -0700, Fabrice Pombet wrote: On Saturday, August 17, 2013 2:26:32 PM UTC+2, Fernando Saldanha wrote: 2) If it is in fact true that encapsulation is rarely used, how do I deal with the fact that other programmers can easily alter the values of members of my classes? Fernando, it is widely accepted that Python pays very little attention to encapsulation as a principle set in stone. Widely accepted by whom? Python code is *full* of encapsulation. Functions, methods, classes, modules, packages, even local variables, are all mechanisms for encapsulating code and data. Those who say that Python has little or no encapsulation are talking rubbish. Chaz's definition of encapsulation is also mine. Who is Chaz, and what definition does he have? Now you need to consider that taking this principle off the hostel of OOP does not mean that you can do whatever you fancy and you can't make anything unsettable. There are plenty of techniques within Python that allow you to protect your arguments (in particular, decorators) inside a Class. And now you are talking about information hiding and protection, which is not the same as encapsulation, no matter what the academics think. Sometimes the air gets a bit too thin to breathe way up at the top of those ivory towers... Encapsulation is about grouping code that needs to be together together. In contrast, you have programming languages that give you little, or nothing, in the way of grouping -- everything is one big chunk of code, with GOTO or GOSUB to jump from place to place. Functions and procedures are the first, most simple, form of encapsulation. Classes allow you to encapsulate multiple functions (methods) together with the data they need to operate on in one chunk. Even in C++ or Java, you can have classes that provide no information hiding at all -- just declare everything public. On the other hand, non-OOP languages like C can implement information hiding. 
In C, you can hide information from other files by declaring them as static. Variables declared inside a brace-delimited block only exist within that block: local variables are hidden. For example:

int foo;
static int bar;

bar is hidden from other files. Likewise, in this function:

int func(void) {
    int baz;
    ...
}

baz is local to func, and invisible to any other function. So you can have information hiding without classes, and classes without information hiding. The two concepts are obviously independent, but as usual, the academics who are in love with OOP like to pretend that anything that is of any interest whatsoever in computing was invented by Java and C++. There are even languages with functions, but no local variables. For instance, older versions of Forth let you define functions, what Forth calls words, but all functions operate on the same global stack. Python has excellent encapsulation: we can combine code that ought to be together into a function, related functions into a class, related classes into a module, and related modules into a package. Now, let's get to the pretentious philosophical discussion: I guess encapsulation is quite the opposite of, say, dynamic typing, which is arguably core in Python. They are utterly unrelated. Dynamic typing has nothing to do with whether or not you can encapsulate code into chunks (subroutines, functions, modules, classes...) or whether you have to write one big amorphous unstructured program where every chunk of code can reach inside other chunks of code. Nor does dynamic typing have to do with information hiding. You can have a private member of a class regardless of whether that member has a single fixed type enforced at compile-time, or a dynamically typed value enforced at run-time. -- Steven -- http://mail.python.org/mailman/listinfo/python-list
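The point that Python separates encapsulation from enforcement can be illustrated with a property: the class groups data with the code that guards it, while the underlying attribute remains reachable by convention. (The Temperature class below is an invented example, not from the thread.)

```python
class Temperature:
    """Invented example: encapsulation with optional enforcement."""

    def __init__(self, celsius):
        self.celsius = celsius  # routed through the setter below

    @property
    def celsius(self):
        return self._celsius

    @celsius.setter
    def celsius(self, value):
        if value < -273.15:
            raise ValueError("below absolute zero")
        # Single leading underscore marks the attribute "internal" by
        # convention only; determined callers can still reach it.
        self._celsius = value
```

So you get a getter/setter only where one is actually needed, which is exactly the "have them when you deem it necessary" position.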
Re: semicolon at end of python's statements
On 8/30/2013 8:09 PM, Steven D'Aprano wrote: We really are spoiled for choice here. We can write any of these:

# Option 1
for spam in sequence:
    if predicate(spam):
        process(spam)

# Option 2
for spam in filter(predicate, sequence):
    process(spam)

# Option 3
for spam in (spam for spam in sequence if predicate(spam)):
    process(spam)

Adding a fourth option:

for spam in sequence if predicate(spam):
    process(spam)

saves absolutely nothing except a line and an indent level, neither of which are in short supply, and gains nothing in readability over Option 1. Which is why it has been rejected. But of all the options shown, including the hypothetical for...if, I still prefer Option 1, or Option 2 as my second option. Ditto. I think people would be better off spending more time learning more about how to use this incredibly powerful tool we have and less arguing for rejected redundant alternatives to what we do have. Just a few days ago, after 16 years of Python, I learned something really neat about function attributes of instances that made a certain testing problem disappear. -- Terry Jan Reedy -- http://mail.python.org/mailman/listinfo/python-list
[issue18828] urljoin behaves differently with custom and standard schemas
Martin Panter added the comment: Similarly, I expected this to return rtmp://host/app?auth=token:

urljoin("rtmp://host/app", "?auth=token")

I'm not sure adding everybody's custom scheme to a hard-coded whitelist is the best way to solve this. Below I have identified some other schemes not in the uses_relative list. Is there any reason why one would use urljoin() with them, but want the base URL to be ignored (as is the current behaviour)? I looked at test_urlparse.py and there doesn't seem to be any test cases for these schemes.

>>> all = set().union(uses_relative, uses_netloc, uses_params, non_hierarchical, uses_query, uses_fragment)
>>> sorted(all.difference(uses_relative))
['git', 'git+ssh', 'hdl', 'mailto', 'news', 'nfs', 'rsync', 'sip', 'sips', 'snews', 'tel', 'telnet']

Even if the behaviour can't be changed, could the documentation for urljoin() say something like this: Only the following [uses_relative] schemes are allowed in the base URL; any other schemes result in the relative URL being returned without being joined to the base. -- nosy: +vadmium ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18828 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
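The behaviour described here is easy to reproduce, and as a workaround the scheme can be appended to the module-level lists (a process-wide mutation, so best done once at startup) to opt it in to relative joining:

```python
from urllib.parse import urljoin, uses_netloc, uses_relative

# Current behaviour: unknown schemes make urljoin() ignore the base.
assert urljoin("rtmp://host/app", "?auth=token") == "?auth=token"

# Registering the scheme in the whitelist changes that.
uses_relative.append("rtmp")
uses_netloc.append("rtmp")
assert urljoin("rtmp://host/app", "?auth=token") == "rtmp://host/app?auth=token"
```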
[issue1500504] Alternate RFC 3986 compliant URI parsing module
Changes by Martin Panter vadmium...@gmail.com: -- nosy: +vadmium ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue1500504 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue13951] Document that Seg Fault in .so called by ctypes causes the interpreter to Seg Fault
Westley Martínez added the comment: Can we have this committed? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue13951 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue14937] IDLE's deficiency in the completion of file names (Python 32, Windows XP)
Westley Martínez added the comment: On Thu, Oct 11, 2012 at 3:06 PM, Terry J. Reedy rep...@bugs.python.org wrote: Terry J. Reedy added the comment: This patch (I suspect it is this one) disabled the use of '/' in filenames on windows when using filename completion. 'c:\ wait, tab, ^space bring up box in 3.2.3 and 3.3.0 (If there is no 'r' prefix, it really should require '\\' to be safe.) +1 for requiring \\. I'll test this tomorrow and report back the behaviour. On Linux it seems that the window only pops up when you press tab. I think behaviour should be as identical on all platforms as possible. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue14937 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18843] Py_FatalError (msg=0x7f0e3b373232 bad leading pad byte) at Python-2.7.5/Python/pythonrun.c:1689
Charles-François Natali added the comment: Just two things:
- running under valgrind with the suppression file would help pinpoint this
- usually, stack/heap overflows or invalid pointer dereferences affect a contiguous chunk of memory: here, there's a *single bit* flipped, not even a byte:

0xfb == 0b11111011
0xfa == 0b11111010

This looks like a hardware issue to me. -- nosy: +neologix ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18843 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
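The single-bit claim is easy to verify: the Hamming distance between the expected pad byte and the corrupted one is exactly 1, which is indeed more typical of flaky RAM than of a buffer overrun.

```python
# The two pad bytes differ only in bit 0.
a, b = 0xfb, 0xfa
assert format(a, "08b") == "11111011"
assert format(b, "08b") == "11111010"
assert bin(a ^ b).count("1") == 1  # Hamming distance of exactly 1
```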
[issue14974] rename packaging.pypi to packaging.index
Westley Martínez added the comment: I think I like the term catalog myself, but I'm not wholly opposed to index. I think it is certainly better than pypi. Although the namespace does reduce the genericness of index, a lot of programmers (including me) like to use the from namespace import x method. I think that's considerable. That said, programmers could use from packaging import index as pindex or some sort to alleviate this. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue14974 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18843] Py_FatalError (msg=0x7f0e3b373232 bad leading pad byte) at Python-2.7.5/Python/pythonrun.c:1689
Martin Mokrejs added the comment: Hi Stephen, I discussed the USE=debug here (https://bugs.gentoo.org/show_bug.cgi?id=482348) and it is denied by portage maintainers because it is not only a debug addition but a whole API change. We have to live with:

mkdir -p /etc/portage/env
echo 'EXTRA_ECONF=--with-pydebug' > /etc/portage/env/pydebug.conf
echo 'dev-lang/python pydebug.conf' >> /etc/portage/package.env

The above is what I had originally. Yesterday I tried even:

# cat /etc/portage/env/pydebug.conf
EXTRA_ECONF="--with-pydebug --without-pymalloc --with-valgrind"
#

but I don't know what the valgrind option really does and whether that means: a) python will run itself under valgrind, don't bother doing it yourself b) you don't have to bother with uncommenting the lines in valgrind.supp c) make install will install the valgrind.supp but you have to edit it on your own d) you may use valgrind but don't use another tool replacing malloc(), like electric fence, DUMA, etc. Or some combination of them? :( The Readme.valgrind does not answer this at all. I let DUMA inspect an emerge boost run overnight but my computer stopped responding (16GB RAM). I tried only gcc-4.7.3 and python-2.7.5-r2. CFLAGS=-ggdb -pipe -msse -msse2 -msse3 -mssse3 -msse4.1 -msse4.2 -msse4 -mavx -maes -mpclmul -mpopcnt -march=corei7-avx CXXFLAGS=${CFLAGS} Per the comment from Charles-François, so you mean that this single-bit change won't be caught by valgrind, right? Why does memtest86+ not detect that? Could python, when compiled with --with-pydebug, also print the physical hardware address of the wrong value? That would be really helpful here! Thanks. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18843 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
New submission from Andrew Svetlov: We need public API for getting main thread object. See also http://comments.gmane.org/gmane.comp.python.devel/141370 -- messages: 196521 nosy: asvetlov, haypo, pitrou priority: normal severity: normal status: open title: Add threading.main_thread() function versions: Python 3.4 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
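As a sketch of how the requested API behaves (threading.main_thread() was ultimately added in Python 3.4):

```python
import threading

# main_thread() returns the Thread object for the thread the
# interpreter was started in.
def is_main():
    return threading.current_thread() is threading.main_thread()

assert is_main()  # top-level code runs in the main thread

results = []
worker = threading.Thread(target=lambda: results.append(is_main()))
worker.start()
worker.join()
assert results == [False]  # worker threads are not the main thread
```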
[issue18843] Py_FatalError (msg=0x7f0e3b373232 bad leading pad byte) at Python-2.7.5/Python/pythonrun.c:1689
Stefan Krah added the comment: I understand that you are building Python using emerge. I would try to get a release from python.org, do a normal build ...

make distclean
./configure --prefix=/tmp --with-pydebug --with-valgrind
make
make install

... then install matplotlib using /tmp/bin/python setup.py ... then run (assuming you are in the build directory):

valgrind --suppressions=Misc/valgrind-python.supp /tmp/bin/python your_program

-- nosy: +skrah ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18843 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue14130] memoryview: add multi-dimensional indexing and slicing
Stefan Krah added the comment: I would probably work on it (it's basically implemented in _testbuffer.c), but I'm not sure if the NumPy community will actually use the feature. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue14130 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue17145] memoryview(array.array)
Stefan Krah added the comment: The request is certainly valid, but the patch is tricky to review. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue17145 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18874] Add a new tracemalloc module to trace memory allocations
STINNER Victor added the comment: Ok, let's start with a first patch. It works in the common cases, because they are some corner cases like subinterpreter which might still crash. The hook on PyMem_RawMalloc() takes the GIL. It is disabled because it has still bugs (it introduces a deadlock in some corner cases!) -- keywords: +patch Added file: http://bugs.python.org/file31517/tracemalloc.patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18874 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18849] Failure to try another name for tempfile when directory with chosen name exists on windows
Changes by Vlad Shcherbina vlad.shcherb...@gmail.com: -- keywords: +patch Added file: http://bugs.python.org/file31518/fix_for_27.patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18849 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Changes by Christian Heimes li...@cheimes.de: -- nosy: +christian.heimes ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Christian Heimes added the comment: The function must take care of fork() in worker threads, too. The isinstance(current_thread(), _MainThread) trick may not work. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue16853] add a Selector to the select module
Giampaolo Rodola' added the comment: A couple of minor concerns follow.
- What about Solaris' /dev/poll?
- I'm not sure why in case of EINTR you retry with a different timeout; can't you just return []?
- this is probably because I'm paranoid about performance but given that the select() method will be called repeatedly I would not use a decorator. Also, similarly to what has been done elsewhere in the stdlib, for critical parts I would recommend localizing variable access in order to minimize overhead, as in:

def select(self, timeout=None):
    ...
    key_from_fd = self._key_from_fd
    ready_append = ready.append
    for fd in r | w:
        ...
        key = key_from_fd(fd)
        if key:
            ready_append((key, events & key.events))

-- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue16853 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18780] SystemError when formatting int subclass
Eli Bendersky added the comment: lgtm -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18780 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Changes by Giampaolo Rodola' g.rod...@gmail.com: -- nosy: +giampaolo.rodola ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Antoine Pitrou added the comment: The function must take care of fork() in worker threads, too. The isinstance(current_thread(), _MainThread) trick may not work. Well, there are two possibilities: - main_thread() returns the original _MainThread instance, even if it's dead in the child process - main_thread() returns the main thread of the current process Both are reasonable, but we must settle for one :-) (also, the use case of forking from a thread is really obscure, I don't think we should worry too much about it) -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Andrew Svetlov added the comment: Patch with code and tests is attached. Test fails when program forks from thread other than the main one. -- keywords: +patch Added file: http://bugs.python.org/file31519/issue18882.diff ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Andrew Svetlov added the comment: signal module reinitializes main_thread variable in PyOS_AfterFork, threading does nothing with forking. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue17741] event-driven XML parser
Roundup Robot added the comment: New changeset 8fd72b1bb262 by Eli Bendersky in branch 'default': Issue #17741: Rename IncrementalParser and its methods. http://hg.python.org/cpython/rev/8fd72b1bb262 -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue17741 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18852] site.py does not handle readline.__doc__ being None
Changes by Berker Peksag berker.pek...@gmail.com: -- nosy: +berker.peksag ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18852 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue17741] event-driven XML parser
Eli Bendersky added the comment: This issue has become too large and discusses a few different things; hence I'm inclined to close it as fixed and open a new one as a placeholder for discussing the new design of the internals. I'll do this in a couple of days if there are no objections. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue17741 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
New submission from Martin Mokrejs: Looks I cannot compile when python-3.3 was configured with --with-pydebug. I use Gentoo Linux, the -r2 shows they added some patches but should not matter I think. building 'xxlimited' extension x86_64-pc-linux-gnu-gcc -pthread -fPIC -Wno-unused-result -DDYNAMIC_ANNOTATIONS_ENABLED=1 -O2 -pipe -msse -msse2 -msse3 -mssse3 -msse4.1 -msse4.2 -msse4 -mavx -maes -mpclmul -mpopcnt -march=corei7-avx -fstack-protector-all -fwrapv -DPy_LIMITED_API=1 -IInclude -I. -I/mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/Python-3.3.2/Include -I/mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/x86_64-pc-linux-gnu -c /mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/Python-3.3.2/Modules/xxlimited.c -o build/temp.linux-x86_64-3.3-pydebug/mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/Python-3.3.2/Modules/xxlimited.o In file included from /mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/Python-3.3.2/Include/Python.h:68:0, from /mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/Python-3.3.2/Modules/xxlimited.c:17: /mnt/1TB/var/tmp/portage/dev-lang/python-3.3.2-r2/work/Python-3.3.2/Include/object.h:65:2: error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG I think make should ignore this error unless you fix xxlimited.c sources. -- files: build.log messages: 196534 nosy: mmokrejs priority: normal severity: normal status: open title: python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG type: compile error versions: Python 3.3 Added file: http://bugs.python.org/file31520/build.log ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18882] Add threading.main_thread() function
Andrew Svetlov added the comment: http://bugs.python.org/issue16500 is required to make work after fork from thread other than the main one. -- dependencies: +Add an 'atfork' module ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue16853] add a Selector to the select module
Charles-François Natali added the comment: Hello, - What about Solaris' /dev/poll? That should be pretty easy to add by someone who has access to a Solaris box: I could use the buildbots, but it'd take a couple iterations to get it right. - I'm not sure why in case of EINTR you retry with a different timeout; can't you just return []? Because when I do: selector.select(10) I expect the selector to wait 10 seconds if no event occurs: the fact that a signal comes in shouldn't break the contract made by the API, i.e. that it will return when a FD is ready or the timeout expires. Early return can lead to spurious errors: just imagine you send a request to a busy server: it would be bad to raise a timeout error just because the user put the client in the background with CTRL-Z (which results in SIGSTOP). - this is probably because I'm paranoid about performances but given that select() method will be called repeatedly I would not use a decorator. Also, similarly to what has been done elsewhere in the stdlib, for critical parts I would recommend localizing variable access in order to minimize overhead as in: def select(self, timeout=None): ... key_from_fd = self._key_from_fd ready_append = ready.append for fd in r | w: ... key = key_from_fd(fd) if key: ready_append((key, events key.events)) I find that localizing variables leads to unreadable code, and is tied to the current CPython interpreter implementation: such optimizations belong to the interpreter, not user code. 
As for the decorator performance overhead, I don't think it weighs much compared to the cost of a syscall (+ GIL acquire/release):

with decorator:
$ ./python -m timeit -s "from selectors import DefaultSelector, EVENT_WRITE; import os; s = DefaultSelector(); s.register(os.pipe()[1], EVENT_WRITE)" "s.select()"
10 loops, best of 3: 3.69 usec per loop

without decorator:
$ ./python -m timeit -s "from selectors import DefaultSelector, EVENT_WRITE; import os; s = DefaultSelector(); s.register(os.pipe()[1], EVENT_WRITE)" "s.select()"
10 loops, best of 3: 3.52 usec per loop

That's a 4% overhead, with a single FD that's always ready (and I suspect that most of the overhead is due to the call to time(), not the decorator per se). Also, I'll shortly propose a patch to handle EINTR within C code, so those EINTR wrappers won't be needed anymore. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue16853 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
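The "localize attribute lookups" pattern debated above can be shown in a self-contained toy; SimpleSelector and its fields below are hypothetical stand-ins for illustration, not the real selectors module.

```python
# Toy selector illustrating the micro-optimization under discussion:
# hoisting attribute lookups out of a hot loop into local names.
class Key:
    def __init__(self, fd, events):
        self.fd = fd
        self.events = events

class SimpleSelector:
    def __init__(self):
        self._fd_to_key = {}

    def register(self, fd, events):
        self._fd_to_key[fd] = Key(fd, events)

    def select(self, r, w, events=0b11):
        ready = []
        # The optimization: bind method/attribute lookups to locals once,
        # so the loop body avoids repeated LOAD_ATTR dispatch.
        key_get = self._fd_to_key.get
        ready_append = ready.append
        for fd in r | w:
            key = key_get(fd)
            if key:
                ready_append((key, events & key.events))
        return ready
```

Whether the saved lookups justify the less readable loop body is exactly the trade-off the two posters disagree on.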
[issue16853] add a Selector to the select module
Giampaolo Rodola' added the comment: That should be pretty easy to add by someone who has access to a Solaris box: I could use the buildbots, but it'd take a couple iterations to get it right. I can probably help with that. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue16853 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
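The decorator-overhead comparison discussed in this thread can be reproduced in miniature with timeit: time the same function with and without a trivial pass-through decorator. The function and decorator names below are illustrative, and absolute numbers will differ from the ones quoted.

```python
# Compare call overhead of a plain function vs. the same function
# wrapped by a do-nothing decorator.
import functools
import timeit

def passthrough(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def plain():
    return 42

decorated = passthrough(plain)

t_plain = timeit.timeit(plain, number=100_000)
t_decorated = timeit.timeit(decorated, number=100_000)
print(f"plain:     {t_plain:.4f}s")
print(f"decorated: {t_decorated:.4f}s")
```

As the thread notes, this pure-Python overhead tends to be small next to a syscall plus GIL release/reacquire.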
[issue18884] python-2.7.5-r3: 40 bytes in 1 blocks are definitely lost
New submission from Martin Mokrejs: It is not important why I had, at this moment, matplotlib not in sync with python itself, which was configured using --with-pydebug ... but here I just want to show that maybe you do not test for memleaks using valgrind on import errors (maybe include such a testcase in your tests). Anyway, here is what I got:

==14007== Memcheck, a memory error detector
==14007== Copyright (C) 2002-2012, and GNU GPL'd, by Julian Seward et al.
==14007== Using Valgrind-3.8.1 and LibVEX; rerun with -h for copyright info
==14007== Command: /usr/bin/python2.7 blah.py
==14007==
Traceback (most recent call last):
  File "blah.py", line 288, in <module>
    import pylab
  File "/usr/lib64/python2.7/site-packages/pylab.py", line 1, in <module>
    from matplotlib.pylab import *
  File "/usr/lib64/python2.7/site-packages/matplotlib/pylab.py", line 222, in <module>
    from matplotlib import mpl # pulls in most modules
  File "/usr/lib64/python2.7/site-packages/matplotlib/mpl.py", line 1, in <module>
    from matplotlib import artist
  File "/usr/lib64/python2.7/site-packages/matplotlib/artist.py", line 7, in <module>
    from transforms import Bbox, IdentityTransform, TransformedBbox, \
  File "/usr/lib64/python2.7/site-packages/matplotlib/transforms.py", line 35, in <module>
    from matplotlib._path import (affine_transform, count_bboxes_overlapping_bbox,
ImportError: /usr/lib64/python2.7/site-packages/matplotlib/_path.so: undefined symbol: _PyMem_DebugFree
[100070 refs]
==14007==
==14007== HEAP SUMMARY:
==14007==   in use at exit: 6,303,492 bytes in 31,921 blocks
==14007==   total heap usage: 1,266,299 allocs, 1,234,378 frees, 179,304,947 bytes allocated
==14007==
==14007== 40 bytes in 1 blocks are definitely lost in loss record 167 of 3,515
==14007==    at 0x4C2C63B: malloc (vg_replace_malloc.c:270)
==14007==    by 0x4EF1E8C: PyMem_Malloc (object.c:2343)
==14007==    by 0x10064848: initialize_builtin_datetime_metadata (arraytypes.c.src:3953)
==14007==    by 0x100719E2: set_typeinfo (arraytypes.c.src:4047)
==14007==    by 0x1016354E: initmultiarray (multiarraymodule.c:4057)
==14007==    by 0x4FB2661: _PyImport_LoadDynamicModule (importdl.c:53)
==14007==    by 0x4FAE3A7: load_module (import.c:1915)
==14007==    by 0x4FB07C1: import_submodule (import.c:2700)
==14007==    by 0x4FAFCC6: load_next (import.c:2515)
==14007==    by 0x4FAED8B: import_module_level (import.c:2224)
==14007==    by 0x4FAF320: PyImport_ImportModuleLevel (import.c:2288)
==14007==    by 0x4F78862: builtin___import__ (bltinmodule.c:49)

-- components: Interpreter Core messages: 196538 nosy: mmokrejs priority: normal severity: normal status: open title: python-2.7.5-r3: 40 bytes in 1 blocks are definitely lost versions: Python 2.7 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18884 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18720] Switch suitable constants in the socket module to IntEnum
Eli Bendersky added the comment: [Thanks, Guido] Attaching patch with SOCK_* constants converted as well. The module globals are currently walked twice, once to extract AF_* and once SOCK_*. This is not a real performance problem, but should we fold it into a single loop that collects two dicts? The code will be less pretty but more efficient. Another issue is _intenum_converter. Ethan - do you think it belongs in the enum module as a helper function or something of the sort? -- Added file: http://bugs.python.org/file31521/socket-intenum-af-type.6.patch ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18720 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
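A hedged sketch of the kind of conversion being discussed: walk a namespace once per prefix and build an IntEnum from the matching constants. build_intenum and fake_globals are illustrative names, not the actual socket-module patch.

```python
# Build IntEnums from prefixed integer constants in a namespace,
# one walk per prefix (the "two loops" variant discussed above).
from enum import IntEnum

def build_intenum(enum_name, prefix, namespace):
    members = {name: value for name, value in namespace.items()
               if name.startswith(prefix) and isinstance(value, int)}
    return IntEnum(enum_name, members)

# Stand-in for the socket module's globals.
fake_globals = {"AF_INET": 2, "AF_INET6": 10, "SOCK_STREAM": 1, "SOMAXCONN": 128}
AddressFamily = build_intenum("AddressFamily", "AF_", fake_globals)
SocketKind = build_intenum("SocketKind", "SOCK_", fake_globals)
```

Because IntEnum members compare equal to plain ints, existing code that passes the old constants keeps working.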
[issue18882] Add threading.main_thread() function
Antoine Pitrou added the comment:

> http://bugs.python.org/issue16500 is required to make it work after a fork from a thread other than the main one.

No it isn't. Please take a look at _after_fork() in threading.py. -- dependencies: -Add an 'atfork' module ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18882 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18884] python-2.7.5-r3: 40 bytes in 1 blocks are definitely lost
Changes by Ezio Melotti ezio.melo...@gmail.com: -- nosy: +haypo, pitrou, skrah type: -> resource usage ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18884 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Stefan Krah added the comment: A similar issue was closed, see msg157249. The error looks deliberate to me, so let's close this, too. -- nosy: +skrah resolution: -> works for me status: open -> closed ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18884] python-2.7.5-r3: 40 bytes in 1 blocks are definitely lost
Changes by Antoine Pitrou pit...@free.fr: -- priority: normal -> low ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18884 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
R. David Murray added the comment: I'm curious how this error gets triggered. I build python --with-pydebug on Gentoo all the time, albeit from a checkout, and I've never seen it. I'm imagining that means it is a Gentoo bug. Well, not even really a bug, since Gentoo doesn't itself support emerging python in debug mode. -- nosy: +r.david.murray ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Martin Mokrejs added the comment: See for what I did to Gentoo: http://bugs.python.org/issue18843#msg196520 -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Martin Mokrejs added the comment: Uh. I don't understand. So did you want to say that I should not run configure --with-pydebug in python 3.3, or what? I am fine if you fix the Makefile so that it does not exit on this particular file. I am missing something. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue14974] rename packaging.pypi to packaging.index
Éric Araujo added the comment: The issue is moot now that packaging/distutils2 is stopped. -- resolution: -> out of date stage: needs patch -> committed/rejected status: open -> closed ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue14974 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue14914] pysetup installed distribute despite dry run option being specified
Éric Araujo added the comment: Contrary to other issues that are relevant to distlib/pip/others, this one can be closed. -- resolution: -> out of date stage: test needed -> committed/rejected status: open -> closed ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue14914 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Stefan Krah added the comment: Martin, msg196534 shows that you are building with -DPy_LIMITED_API=1. You can either use the limited API or --with-pydebug, but not both. [As I said in the other issue, IMHO it is better to use a minimal set of build options when reporting bugs.] -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18859] README.valgrind should mention --with-valgrind
Martin Mokrejs added the comment: Moreover, it should explain what that really does. One could think of several answers, or even combinations of them, as to what this configure flag will really do:

a) python will run itself under valgrind, don't bother ever doing it yourself
b) you don't have to bother with uncommenting the lines in valgrind.supp
c) make install will install the valgrind.supp but you have to edit it on your own
d) you may use valgrind but don't use other tools replacing malloc(), like electric fence, DUMA, etc. You should be quite verbose about how (in)compatible this is with other tools.

Or some combination of them? valgrind docs say it won't work if a binary lacks debug symbols (wasn't compiled with -g). Initially I got misled: I thought I had to convert my blah.py to blah.c using cython, compile that with gcc -ggdb blah.c, and then run valgrind on the binary. In the end, I don't understand why everybody has to remove the comment symbols from the valgrind-python.supp file at all. Why isn't that enabled by default?

I also suggest you mention right in the file other handy information, because people quite likely get on this path while chasing memory corruption issues, maybe broken hardware:
1. Mention that python uses 256kB chunks by default.
2. Mention there exists --without-pymalloc, as Tim explained to me in http://bugs.python.org/issue18843#msg196492
3. Explain how to interpret those stacktraces one could get: http://bugs.python.org/issue18843#msg196481

Thank you!

-- nosy: +mmokrejs ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18859 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Martin Mokrejs added the comment: Hmm, but I did not add -DPy_LIMITED_API=1. Python 2.7.5 can be compiled using the same configuration. Going back to the build.log file I see:

configure --prefix=/usr --build=x86_64-pc-linux-gnu --host=x86_64-pc-linux-gnu --mandir=/usr/share/man --infodir=/usr/share/info --datadir=/usr/share --sysconfdir=/etc --localstatedir=/var/lib --libdir=/usr/lib64 --with-fpectl --enable-shared --disable-ipv6 --with-threads --infodir=${prefix}/share/info --mandir=${prefix}/share/man --with-computed-gotos --with-dbmliborder=gdbm --with-libc= --enable-loadable-sqlite-extensions --with-system-expat --with-system-ffi --with-pydebug --without-pymalloc --with-valgrind

I specified only --with-pydebug --without-pymalloc --with-valgrind. So where does the limited API come from? A bug in configure.ac? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18720] Switch suitable constants in the socket module to IntEnum
Guido van Rossum added the comment: I prefer two prettier loops over one less pretty one in this case. No opinion about _intenum_converter yet. (IMO refactoring can always be lazy, i.e. after you have multiple copies of the same code there's time to consider whether you should unify them and what the pros and cons are.)

On Fri, Aug 30, 2013 at 6:55 AM, Eli Bendersky <rep...@bugs.python.org> wrote:

> Eli Bendersky added the comment:
>
> [Thanks, Guido] Attaching patch with SOCK_* constants converted as well. The module globals are currently walked twice, once to extract AF_* and once SOCK_*. This is not a real performance problem, but should we fold it into a single loop that collects two dicts? The code will be less pretty but more efficient.
>
> Another issue is _intenum_converter. Ethan - do you think it belongs in the enum module as a helper function or something of the sort?
>
> --
> Added file: http://bugs.python.org/file31521/socket-intenum-af-type.6.patch
>
> ___
> Python tracker rep...@bugs.python.org
> http://bugs.python.org/issue18720
> ___

-- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18720 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Stefan Krah added the comment: I think I understand now: If you used the strategy from msg196520, of course you get the Gentoo flags. What you really should do is download a release or get a checkout from hg.python.org and build that _without_ using emerge. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18844] allow weights in random.choice
Mark Dickinson added the comment: [Madison May] - Should negative weights cause a ValueError to be raised, or should they be converted to 0s? - Should passing a list full of zeros as the weights arg raise a ValueError or be treated as if no weights arg was passed? Both those seem like clear error conditions to me, though I think it would be fine if the second condition produced a ZeroDivisionError rather than a ValueError. I'm not 100% sold on the feature request. For one thing, the direct implementation is going to be inefficient for repeated sampling, building the table of cumulative sums each time random.choice is called. A more efficient approach for many use-cases would do the precomputation once, returning some kind of 'distribution' object from which samples can be generated. (Walker's aliasing method is one route for doing this efficiently, though there are others.) I agree that this is a commonly needed and commonly requested operation; I'm just not convinced either that an efficient implementation fits well into the random module, or that it makes sense to add an inefficient implementation. -- nosy: +tim.peters ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18844 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
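The precompute-once approach Mark describes can be sketched briefly: build the cumulative-sum table a single time, then each draw costs one bisection. make_weighted_chooser is an illustrative name, not a stdlib API, and Walker's alias method would be an alternative precomputation.

```python
# Weighted choice with precomputed cumulative sums; repeated sampling
# avoids rebuilding the table on every call.
import bisect
import itertools
import random

def make_weighted_chooser(population, weights):
    if any(w < 0 for w in weights):
        raise ValueError("weights must be non-negative")
    cumulative = list(itertools.accumulate(weights))
    total = cumulative[-1]
    if total == 0:
        raise ValueError("at least one weight must be positive")

    def choose():
        # random() < 1.0 keeps the index in range; min() guards against
        # a pathological floating-point rounding edge case.
        i = min(bisect.bisect(cumulative, random.random() * total),
                len(population) - 1)
        return population[i]

    return choose
```

This also makes the two error conditions from the thread explicit: negative weights and an all-zero weight list both raise ValueError at construction time.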
[issue18885] handle EINTR in the stdlib
New submission from Charles-François Natali: As discussed in http://mail.python.org/pipermail/python-dev/2013-August/128204.html, I think that we shouldn't let EINTR leak to Python code: it should be handled properly by the C code, so that users (and the Python part of the stdlib) don't have to worry about this low-level historical nuisance.

For code that doesn't release the GIL, we could simply use this glibc macro:

# define TEMP_FAILURE_RETRY(expression) \
  (__extension__ \
    ({ long int __result; \
       do __result = (long int) (expression); \
       while (__result == -1L && errno == EINTR); \
       __result; }))

Now, I'm not sure about how to best handle this for code that releases the GIL. Basically:

Py_BEGIN_ALLOW_THREADS
pid = waitpid(pid, &status, options);
Py_END_ALLOW_THREADS

should become

begin_handle_eintr:
Py_BEGIN_ALLOW_THREADS
pid = waitpid(pid, &status, options);
Py_END_ALLOW_THREADS
if (pid < 0 && errno == EINTR) {
    if (PyErr_CheckSignals())
        return NULL;
    goto begin_handle_eintr;
}

Should we do this with a macro? If yes, should it be a new one that should be placed around Py_BEGIN_ALLOW_THREADS/Py_END_ALLOW_THREADS (like BEGIN_SELECT_LOOP in selectmodule.c) or could we have a single macro that would do both (i.e. release the GIL / reacquire the GIL, and try again in case of EINTR, unless a signal handler raised an exception)?

From a cursory look, the main files affected would be:
Modules/fcntlmodule.c
Modules/ossaudiodev.c
Modules/posixmodule.c
Modules/selectmodule.c
Modules/signalmodule.c
Modules/socketmodule.c
Modules/syslogmodule.c

-- messages: 196555 nosy: neologix priority: normal severity: normal stage: needs patch status: open title: handle EINTR in the stdlib type: enhancement versions: Python 3.4 ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18885 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
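The retry idiom the proposed C changes implement can be expressed at the Python level as a sketch: retry the call when it fails with EINTR (a signal handler ran and returned normally), and propagate anything else. retry_on_eintr is an illustrative helper, not a stdlib function.

```python
# Python-level illustration of the EINTR retry loop described above.
import errno

def retry_on_eintr(func, *args, **kwargs):
    while True:
        try:
            return func(*args, **kwargs)
        except OSError as exc:
            if exc.errno != errno.EINTR:
                raise
            # EINTR: the handler ran and returned normally, so retry.
```

In the C version, PyErr_CheckSignals() plays the role of letting a handler-raised exception escape instead of retrying.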
[issue18843] Py_FatalError (msg=0x7f0e3b373232 bad leading pad byte) at Python-2.7.5/Python/pythonrun.c:1689
Charles-François Natali added the comment:

2013/8/30 Martin Mokrejs <rep...@bugs.python.org>:
> Per the comment from Charles-François, so you mean that this single-bit change won't be caught by valgrind, right? Why does memtest86+ not detect that? Could python, when compiled with --with-pydebug, also print the physical hardware address of the wrong value? That would be really helpful here! Thanks.

I mean that in the vast majority of cases, a single bit flip is due to a hardware error. That can be due to faulty RAM, electric noise, cosmic rays... Software-induced memory corruptions generally corrupt at least a byte. Do you reproduce the crash systematically, or is it random? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18843 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18885] handle EINTR in the stdlib
Changes by Charles-François Natali cf.nat...@gmail.com: -- nosy: +haypo, pitrou, sbt ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18885 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
R. David Murray added the comment: Python2 doesn't support the limited ABI, so that flag is a noop for 2.7. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Martin Mokrejs added the comment: So I conclude that you want to say that one of the configure flags is wrong? Which one? I can surely report that at Gentoo. I still think the Makefile should be changed so that make does not even try to compile xxlimited.c if -DPy_LIMITED_API=1 is in CFLAGS, regardless of how that happened. http://sources.gentoo.org/cgi-bin/viewvc.cgi/gentoo-x86/dev-lang/python/ Look into the python-3.3.2-r2.ebuild file for what it does, and how it differs from the 2.7.5-r2.ebuild, which can be compiled fine through emerge and those 3 configure arguments I requested. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18883] python-3.3.2-r2: Modules/xxlimited.c:17:error: #error Py_LIMITED_API is incompatible with Py_DEBUG, Py_TRACE_REFS, and Py_REF_DEBUG
Stefan Krah added the comment: Well, these look like Gentoo build flags. Did you or emerge or anything else export CFLAGS in the shell? -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18883 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18884] python-2.7.5-r3: 40 bytes in 1 blocks are definitely lost
Martin Mokrejs added the comment: Why do you think so? My point is that this happens when an import fails. But python is at fault and should handle import errors. -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18884 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18884] python-2.7.5-r3: 40 bytes in 1 blocks are definitely lost
STINNER Victor added the comment: According to the traces, the leak does not come from Python but from matplotlib. Please report the issue to the matplotlib bug tracker. -- resolution: -> invalid status: open -> closed ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18884 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18438] Obsolete url in comment inside decimal module
Stefan Krah added the comment: Wikipedia sounds good. Let's avoid linking directly to free versions. :) -- ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18438 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue18852] site.py does not handle readline.__doc__ being None
Berker Peksag added the comment: Here's a patch with a comment about using pyreadline on Windows. Should the note in the readline documentation[1] be updated? [1] http://docs.python.org/3.4/library/readline.html (See also issue 5845 and msg123703) -- keywords: +patch stage: needs patch -> patch review Added file: http://bugs.python.org/file31522/issue18852.diff ___ Python tracker rep...@bugs.python.org http://bugs.python.org/issue18852 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
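For context, the defensive pattern at issue can be sketched as follows: pyreadline on Windows leaves readline.__doc__ set to None, so code that probes the docstring (as site.py does to detect libedit) must not assume it is a string. The fake module below is a hypothetical stand-in for illustration.

```python
# Guard against a module whose __doc__ is None before searching it.
import types

fake_readline = types.SimpleNamespace(__doc__=None)   # like pyreadline on Windows

doc = fake_readline.__doc__ or ""                     # normalize None to ""
uses_libedit = "libedit" in doc
```

Without the `or ""` guard, `"libedit" in None` raises TypeError, which is the crash the patch avoids.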