Another very useful video - thanks for putting these out! Clever use of
decorators for template support.
While we're on the topic of using web2py features outside of web2py:
I have a package with library code, command line scripts, etc. -- let's call
it "mypackage"
Let's say I also have a web2py app "myapp". (the creative juices are really
flowing this morning!)
I want to be able to use the same DAL definition from:
- myapp web2py controllers
- mypackage command line scripts
- mypackage library functions that may be called by either web2py or the
command line scripts
Are there any generally accepted "best" methods for organizing something
like this? I can think of the following general approaches:
1. - Define my DAL in myapp's models/db.py, as usual
- Within mypackage code, do "from web2py.applications.myapp.models.db import db"
(sketched below).
This has the following issues:
(a.) I need to lug around my website code on machines where I want
mypackage, even if I don't want to run the website.
(b.) If models/db.py is imported from outside web2py (e.g. a command line
script), the contents only get evaluated once per execution. So I would only
ever have a reference to one DB connection, which I think will cause
problems if it is used from multiple threads. (Is this still correct?)
(c.) Minor: I need to add "from gluon.dal import DAL, Field" to the top of
db.py in models. Otherwise, imports from outside web2py will fail since
db.py isn't being evaluated in web2py's context of preloaded gluon imports.
I don't think this causes any problems, does it? It's just redundant when
run within web2py?
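For concreteness, here's roughly what I mean by #1 (untested sketch -- the
table and connection string are just placeholders, the script name is made
up, and it assumes the web2py folder and applications are importable as
packages):

   myapp/models/db.py:

       # Redundant inside web2py, but needed when db.py is imported from
       # outside web2py (issue c. above)
       from gluon.dal import DAL, Field

       db = DAL('sqlite://storage.sqlite')
       db.define_table('thing', Field('name'))

   mypackage/some_script.py:   # hypothetical command line script

       # Issue a.: this drags the whole website checkout along with mypackage
       from web2py.applications.myapp.models.db import db

       rows = db().select(db.thing.ALL)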
2. A variant of #1:

   myapp/models/db.py:

       def get_db():
           # Put this in a function, so it can be evaluated multiple times
           # at runtime
           db = DAL(...)
           db.define_table(...)
           return db

   myapp/models/db_load.py:   # Gets loaded by web2py after db.py?

       # We don't want this in db.py, or we'll have an extra connection
       # hanging around in mypackage code
       db = get_db()

   mypackage/my-script.py:

       from web2py.applications.myapp.models.db import get_db

       def threadedRoutine():
           db = get_db()
           do_stuff_within_thread(db)
I think this takes care of problem 1.b above.
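To make the threading part concrete, the mypackage usage I'm picturing is
something like this (untested; do_stuff_within_thread is just a stand-in,
and the thread count is arbitrary):

       import threading
       from web2py.applications.myapp.models.db import get_db

       def threadedRoutine():
           # Each thread builds its own DAL instance/connection
           db = get_db()
           do_stuff_within_thread(db)
           db.commit()   # commit any pending work before the thread exits

       threads = [threading.Thread(target=threadedRoutine) for _ in range(4)]
       for t in threads:
           t.start()
       for t in threads:
           t.join()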
3. Define my DAL in mypackage, and import from there in the web2py models.
For example:

   mypackage/db.py:

       db = DAL(...)
       db.define_table(...)

   myapp/models/db.py:

       # warning - db only built once per server lifetime?
       from mypackage.db import db
This has the following problems:
- If I make changes to the DAL definition under mypackage/db.py, I think
I'll need to restart the web2py server -- these changes won't be picked up
automatically since they are in an outside module, right?
- If the module is only evaluated once per execution, as with #1.b., you
again have the problem of only one DAL instance, and this will cause
problems when multiple connections/threads are involved. But this time the
problem extends to web2py, not just mypackage! Am I correct in saying this,
or does DAL have any magic that allows multiple threads to work via a single
DAL reference?
But at least I don't need to carry my website code everywhere!
4. Like #3, define the DAL in mypackage and import into web2py models. But
construct and return the database within a get_db() function, as with #2.
Call get_db() from myapp/models/db.py to load a new DAL on each pageview,
and from any mypackage thread that needs to use the db.
I think this is pretty good, except that you still need to restart web2py
when making changes to the DAL. Maybe if models/db.py looked something like:

       # Contents are normally evaluated once per execution, so force
       # mypackage.db to be re-read each time models/db.py is run:
       import mypackage.db
       reload(mypackage.db)
       db = mypackage.db.get_db()

it would work, but I haven't tested it yet.
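Putting the mypackage side of #4 together, I'm picturing mypackage/db.py
looking roughly like this (untested; the table and connection string are
just placeholders), with myapp/models/db.py as above:

   mypackage/db.py:

       from gluon.dal import DAL, Field

       def get_db():
           # Build a fresh DAL instance (and connection) on every call, so
           # each pageview / thread gets its own
           db = DAL('sqlite://storage.sqlite')
           db.define_table('thing', Field('name'))
           return db

mypackage threads would just call mypackage.db.get_db() themselves, as in #2.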
If anybody out there has successfully organized a setup like this or could
offer any suggestions, I'd greatly appreciate it!
Thanks,
Kevin