Re: Pickling/unpickling top-level functions, classes etc.
OK, I'm +1 on the Won't Fix status. I'm not proficient enough in the way unpickling works, but the fact that you cannot specify any namespace in which class or function names have to be resolved makes it quite clear that dynamic imports + unpickling functions and classes = not possible.

Regards,
Nicolas

2006/3/29, Deron Meranda [EMAIL PROTECTED]:

On 3/29/06, Graham Dumpleton [EMAIL PROTECTED] wrote: Are you okay with: http://issues.apache.org/jira/browse/MODPYTHON-81 "Pickling/unpickling top-level functions defined in published module no longer works in mod_python 3.2" being resolved as Won't Fix and then closed?

I agree that this seems to be something that is just not solvable without causing complete havoc with all the specialized import and reload functionality, or resulting in a solution that is too fragile. It is just a limitation of the pickle mechanism.

This of course doesn't mean that users wouldn't want to pickle these kinds of things. But the burden in those cases should be on them. It may be possible to solve this for class instances (e.g., objects) by subclassing the Unpickler class and substituting a smarter find_class() method. But as for globals, functions, etc., it looks like it may be much harder.

The user may also be able to take advantage of the external object pickling (with persistent ids), but I haven't looked at them too closely.

Regardless, there are lots of alternatives, so I have no problem with mod_python not solving this one (although the mod_python documentation should clearly emphasize these pickling limitations).
-- Deron Meranda
[jira] Resolved: (MODPYTHON-81) Pickling/unpickling top-level functions defined in published module no longer works in mod_python 3.2
[ http://issues.apache.org/jira/browse/MODPYTHON-81?page=all ]

Graham Dumpleton resolved MODPYTHON-81:
---------------------------------------

    Fix Version: (was: 3.3)
     Resolution: Won't Fix

Pickling/unpickling top-level functions defined in published module no longer works in mod_python 3.2
-----------------------------------------------------------------------------------------------------

         Key: MODPYTHON-81
         URL: http://issues.apache.org/jira/browse/MODPYTHON-81
     Project: mod_python
        Type: Bug
  Components: importer
    Versions: 3.2.7
    Reporter: Nicolas Lehuen

See http://modpython.org/pipermail/mod_python/2005-October/019158.html

The problem is that pickling/unpickling top-level functions is done by name, which requires that the module they are defined in can be imported in the usual way, or at least that it is registered in sys.modules. Fixing this in 3.2 alone seems quite difficult. I'd rather try to do this in 3.3, along with the major overhaul of the import system.

--
This message is automatically generated by JIRA.
- If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
- For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] Closed: (MODPYTHON-81) Pickling/unpickling top-level functions defined in published module no longer works in mod_python 3.2
[ http://issues.apache.org/jira/browse/MODPYTHON-81?page=all ]

Graham Dumpleton closed MODPYTHON-81:
-------------------------------------

Pickling/unpickling top-level functions defined in published module no longer works in mod_python 3.2
-----------------------------------------------------------------------------------------------------

         Key: MODPYTHON-81
         URL: http://issues.apache.org/jira/browse/MODPYTHON-81
     Project: mod_python
        Type: Bug
  Components: importer
    Versions: 3.2.7
    Reporter: Nicolas Lehuen

See http://modpython.org/pipermail/mod_python/2005-October/019158.html

The problem is that pickling/unpickling top-level functions is done by name, which requires that the module they are defined in can be imported in the usual way, or at least that it is registered in sys.modules. Fixing this in 3.2 alone seems quite difficult. I'd rather try to do this in 3.3, along with the major overhaul of the import system.
Re: Pickling/unpickling top-level functions, classes etc.
Graham Dumpleton wrote:

Nicolas, are you okay with: http://issues.apache.org/jira/browse/MODPYTHON-81 "Pickling/unpickling top-level functions defined in published module no longer works in mod_python 3.2" being resolved as Won't Fix and then closed?

As I describe in: http://www.dscpl.com.au/articles/modpython-005.html there are going to be various issues with pickling with any new importer which doesn't keep stuff in sys.modules. I don't see any solution for the issue as far as modules managed by the mod_python importer go. It simply means that if you want to pickle stuff like that, it has to be in a module on the standard sys.path and managed by the normal Python module import system.

Anyone else want to comment?

+1 for Won't Fix. Then it's pretty easy for us to just say don't do it. We could offer a sub-optimal solution and tell people it might work, only to waste a lot of time on the mailing list explaining why it doesn't.

Jim
Re: Auto updating of req.finfo when req.filename changed.
On Sun, 26 Mar 2006, Graham Dumpleton wrote: One use for it that I already have is to get around the DirectoryIndex problems in mod_python caused by Apache's use of the ap_internal_fast_redirect() function to implement that feature. The specifics of this particular issue are documented under: http://issues.apache.org/jira/browse/MODPYTHON-146

Could we zoom in on this a little bit? I've read the description, but I'm not quite sure I understand it yet. Is the problem that if I set req.notes['foo'] = 'bar' in a phase prior to fixup, by the time we get to the content handler it will be gone because notes would be overwritten by mod_dir?

Grisha
Re: Auto updating of req.finfo when req.filename changed.
Grisha wrote ..

On Sun, 26 Mar 2006, Graham Dumpleton wrote: One use for it that I already have is to get around the DirectoryIndex problems in mod_python caused by Apache's use of the ap_internal_fast_redirect() function to implement that feature. The specifics of this particular issue are documented under: http://issues.apache.org/jira/browse/MODPYTHON-146

Could we zoom in on this a little bit? I've read the description, but I'm not quite sure I understand it yet. Is the problem that if I set req.notes['foo'] = 'bar' in a phase prior to fixup, by the time we get to the content handler it will be gone because notes would be overwritten by mod_dir?

Fixup phase or earlier actually. In the case of req.notes though, it isn't that the value in req.notes vanishes, it is that it gets duplicated.

Consider a .htaccess file containing:

    AddHandler mod_python .py
    PythonHandler mod_python.publisher
    PythonDebug On
    DirectoryIndex index.py
    PythonFixupHandler _fixup

In _fixup.py in the same directory, have:

    from mod_python import apache
    import time

    def fixuphandler(req):
        time.sleep(0.1)
        req.notes['time'] = str(time.time())
        return apache.OK

In index.py have:

    def index(req):
        return req.notes['time']

When I use the URL:

    http://localhost:8080/~grahamd/fast_redirect/index.py

the result I get is:

    1143667522.23

i.e., a single float value holding the time the request was made. If I now instead access the directory using the URL:

    http://localhost:8080/~grahamd/fast_redirect/

I instead get:

    ['1143667680.57', '1143667680.47']

In other words, instead of getting the single value I now get two values contained in a list. It wouldn't matter if the two values were the same; they would both still be included. Where a content handler was expecting a single string value, it would die when it gets a list.

What is happening is that when the request is made against the directory it runs through the phases up to and including the fixup handler phase.
As a consequence it runs _fixup::fixuphandler() with req.notes['time'] being set to the time at that point.

At the end of the fixup phase a mod_dir handler kicks in and sees that the file type of request_rec->filename, as indicated by request_rec->finfo->filetype, is APR_DIR. As a consequence it will apply the DirectoryIndex directive, looping through the listed files to find a candidate it can redirect the request to. In evaluating a candidate it reapplies the phases up to and including the fixup handler phase on the new candidate filename. This is done so that access and authorisation checks etc. are still performed on the candidate file.

Because it has run the fixup handlers on the candidate file, _fixup::fixuphandler() will be run again, and req.notes will be set once more. At that stage this req.notes is separate from the main one, as the candidate is in effect processed as a sub request of the main request against the directory.

If after checking through the candidates mod_dir finds one that matches, then to avoid having to run the phases up to and including the fixup handler phase on the candidate yet again, it tries to fake a redirect. This is what ap_internal_fast_redirect() is being used for. What that function does is copy details from the request_rec structure of the sub request for the candidate into the request_rec of the main request. When the mod_dir fixup handler returns, the main request then continues on to execute the content handler phase with the details of the sub request.

The problem with this is that rather than simply using req.notes from the sub request, or overlaying the contents from the sub request onto that of the main request, it merges them together. You therefore end up with multiple entries for the 'time' value which was added.
To emphasise the problem, change the fixup handler to be:

    from mod_python import apache

    def fixuphandler(req):
        req.notes['filename'] = req.filename
        return apache.OK

and index.py to:

    def index(req):
        return req.notes['filename']

The result when the URL against the directory is used is:

    ['/Users/grahamd/Sites/fast_redirect/index.py', '/Users/grahamd/Sites/fast_redirect/']

Now it isn't just req.notes that is going to see this merging, as the code in ap_internal_fast_redirect() is:

    r->notes = apr_table_overlay(r->pool, rr->notes, r->notes);
    r->headers_out = apr_table_overlay(r->pool, rr->headers_out,
                                       r->headers_out);
    r->err_headers_out = apr_table_overlay(r->pool, rr->err_headers_out,
                                           r->err_headers_out);
    r->subprocess_env = apr_table_overlay(r->pool, rr->subprocess_env,
                                          r->subprocess_env);

Thus, it also merges output headers and subprocess environment variables. The merging of these could in themselves also cause problems. This isn't the end of the problems though, as ap_internal_fast_redirect() doesn't do anything with: /** Notes on
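Until mod_python works around the merge, a content handler can defend itself against duplicated table entries. A hypothetical helper (first_note() is not a mod_python API, just a sketch assuming the documented behaviour that a table lookup returns a list when a key occurs more than once):

```python
def first_note(notes, key, default=None):
    """Return a single value for `key` even if ap_internal_fast_redirect()'s
    apr_table_overlay() merge has left duplicate entries behind."""
    try:
        value = notes[key]
    except KeyError:
        return default
    if isinstance(value, list):
        # Duplicated entry from the merged sub request: take the first.
        return value[0]
    return value
```

A handler expecting a single string would then call first_note(req.notes, 'time') rather than indexing req.notes directly, and behave the same whether or not the fast redirect happened.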
Re: PythonImport that works for any interpreter.
Graham Dumpleton wrote:

In: http://issues.apache.org/jira/browse/MODPYTHON-117 I describe the idea of having a means of using PythonImport to define a module to be imported into any interpreter that may be created. For some cases where there are a lot of virtual hosts, this may be simpler than having to list a directive for every virtual host explicitly. Is there any interest in such a feature?

If of interest, for a simple implementation the only issue is one of ordering, when for an interpreter there are both imports for all interpreters and interpreter-specific imports. Does one import the modules specified for all interpreters before the interpreter-specific ones, or vice versa? My feeling has been that the modules to be imported in all interpreters should be done first.

Feedback? Am I wasting my time implementing this one?

That's entirely up to you. It's not a feature that I need, so don't do it on my account. ;)

Jim
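The ordering Graham prefers is simple to express. A speculative sketch, not mod_python internals (the function and argument names are invented): modules registered for all interpreters are imported first, then those registered for the specific interpreter.

```python
def ordered_imports(all_interpreter_modules, per_interpreter_modules,
                    interpreter):
    """Return module names in import order: modules listed for every
    interpreter first, then modules listed for this interpreter only."""
    ordered = list(all_interpreter_modules)
    ordered.extend(per_interpreter_modules.get(interpreter, []))
    return ordered
```

With this ordering, a module imported everywhere can set up state that the interpreter-specific modules then rely on, which is presumably why the all-interpreters-first preference feels natural.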
Grouping tests (was: Latest tests)
What I have been doing in a totally unrelated Python project is to create test groups simply by putting them into separate modules. The main test module test.py looks like this:

    ## (test.py)
    import unittest
    from test_something import *
    from test_someother import *
    from test_yetmore import *

    if __name__ == '__main__':
        unittest.main(module=__import__(__name__))
    ##

This works because unittest takes all 'things' that start with 'test' in the provided module and runs them. So anything we bring into our namespace gets run. This also makes it possible to import tests from other projects, and share these tests between projects.

The other test_ modules look much alike:

    ## (test_something.py)
    import unittest
    import test_peer

    class test04MultiPeerSystem(test_peer.BaseTestRealThing):
        def test03DiscoveryC08B20(self):
            """Multiple clients w/ discovery: 8 peers, 20 blocks each"""
            self.runDiscoveryTest(nclients=8, nblocks=20)

    if __name__ == '__main__':
        unittest.main(module=__import__(__name__))
    ##

This makes it very easy to handle test subsets, and run single test suites. Just run

    $ python test.py

to run ALL the tests. To run just a single set, run

    $ python test_something.py

And to run a single test, either of these will do:

    $ python test_something.py test04MultiPeerSystem
    $ python test.py test04MultiPeerSystem

The real power shows when you want to run 4 or 5 test sets, and/or only parts of some test sets. Just create a new main test unit that imports the desired ones, and you're set:

    ## (test_few.py)
    import unittest
    from test_something import *
    from test_someother import TestOnlyThis

    if __name__ == '__main__':
        unittest.main(module=__import__(__name__))
    ##

Because some tests take very long to run (in my vocabulary, long means more than a second), this saves me a lot of time when working on a part of a big project, where I don't need to run all tests all the time.

-- Mike Looijmans
Philips Natlab / Topic Automation

Jim Gallacher wrote: ...
I've been playing with some ideas for a new test framework, using a subclass of unittest.TestLoader to find and configure tests. I want to play around with it for another day or so before sharing, but at this point I'm pretty confident it'll work. Creating a new set of tests could be as simple as:

testsuites/core/simpletest.py:

    from testlib import VirtualHostTest

    class MyTests(VirtualHostTest):
        def test_hello_world(self):
            rsp = self.send_request()
            self.failUnless(rsp == 'test ok')

        def test_goodbye_world(self):
            rsp = self.send_request()
            self.failUnless(rsp == 'test ok')

htdocs/testhandlers/core/simpletest.py:

    from mod_python import apache

    def test_hello_world(req):
        req.write('test ok')
        return apache.OK

    def test_goodbye_world(req):
        req.write('test ok')
        return apache.OK

    $ python testrunner.py

Things like virtual host names and handler directives required for configuration or send_request() are automatically derived from the test class and test method names. It will still be possible to provide custom apache configuration directives in a manner similar to that which we currently use, but for most tests this will not be required.

Usage would look something like this:

Run all the tests:

    $ python testrunner.py

Run one test:

    $ python testrunner.py -t core.simpletest.MyTests.test_hello_world

Run a group of tests (this would load the TestCase subclasses in testsuites/sessions/filesession.py):

    $ python testrunner.py -t sessions.filesession

Jim