Michael Bayer wrote:
There are currently two ways to construct SQL expressions with SQLAlchemy without any knowledge of a database connection or implementation.

One is to use the ProxyEngine, which receives a connect() call when you are ready to attach it to a real engine; the connection applies only within the scope of the current thread:

    e = ProxyEngine()
    s = Table('mytable', e, Column()...).select()

    e.connect('someengine://')
    s.execute()

    e.connect('someotherengine://')
    s.execute()

This is the way everyone who needs their app to run before a database is defined has been doing it.

This is similar to ConnectionHub in SQLObject, though that is bound to classes rather than SQL expressions. You can instantiate it multiple times to group sets of things together, though a default singleton is provided as well (sqlobject.sqlhub), again similar to the default ProxyEngine.

Threadlocal proxies cause problems for me in many situations, so I was inclined to move to something more explicit. So there are instances, but you call a method to get a concrete connection object. This adds some typing overhead, but I do like the explicitness of it. That it uses a threadlocal underneath is a detail you don't need to understand, but implicit threadlocal proxies can be really tricky.
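To make the tradeoff concrete, here is a minimal sketch of the thread-local proxy pattern being discussed. This is not the actual ProxyEngine code, just an illustration of the mechanism: the proxy forwards attribute access to a per-thread target, so a connect() in one thread is invisible to every other thread (FakeEngine and ThreadLocalProxy are hypothetical names).

```python
import threading

class ThreadLocalProxy:
    """Illustrative sketch: forwards attribute access to a target
    bound per-thread via connect()."""
    def __init__(self):
        self._local = threading.local()

    def connect(self, target):
        # bind the target for the *current thread only*
        self._local.target = target

    def __getattr__(self, name):
        target = getattr(self._local, 'target', None)
        if target is None:
            raise RuntimeError("no engine connected in this thread")
        return getattr(target, name)

class FakeEngine:
    def __init__(self, name):
        self.name = name
    def execute(self, sql):
        return "%s: %s" % (self.name, sql)

proxy = ThreadLocalProxy()
proxy.connect(FakeEngine("sqlite"))
print(proxy.execute("SELECT 1"))   # sqlite: SELECT 1

# a different thread sees no connection until it calls connect() itself
results = []
def worker():
    try:
        proxy.execute("SELECT 1")
    except RuntimeError as e:
        results.append(str(e))
t = threading.Thread(target=worker)
t.start(); t.join()
print(results[0])   # no engine connected in this thread
```

The trickiness is exactly this: whether a call works depends on which thread you happen to be in, which an explicit "give me a connection" method makes visible at the call site.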

Also, if you create a Table without an explicit engine, it uses a "default" ProxyEngine that you can later connect to databases via the "global_connect()" function, so you can also build up your tables without knowing anything about engines at all.

The other way, which is a little more "underground" and is currently used just in some of the unit tests, is to construct using TableClause and ColumnClause objects, which are the "lexical" component to Table and Column objects:

    t = table('mytable', column('col1'), column('col2'))
    s = t.select()

    print s  # will print the select query with a default compiler

    # compile 's' against a SQLite engine
    e = create_engine('sqlite://')
    c = s.compile(e)

    # or compile 's' against SQLite's compiler
    c = sqlalchemy.databases.sqlite.SQLiteCompiler(s)

    # execute
    c.execute()

    # compile 's' against postgres:
    e = create_engine('postgres://')
    c = s.compile(e)
    # execute
    c.execute()

This was a refactoring I came up with after reading the blog post on SQL-API, to cleanly separate the concepts of "lexical" and "schema" constructs. TableClause and ColumnClause, since they aren't related to any kind of "schema", don't have any "unique-per-tablename" type behavior, and also don't have the benefit of column types or foreign keys (this might change at some point). I have yet to work out a full set of patterns which can "upgrade" these lexical objects to "schema-aware" objects in a straightforward way.
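The lexical/compile split can be sketched in a few deliberately simplified classes (these are hypothetical, not SQLAlchemy's actual implementation): the clause objects know nothing about any database, and the same tree can be handed to different backend compilers for rendering.

```python
# Lexical constructs: pure syntax, no schema, no engine.
class ColumnClause:
    def __init__(self, name):
        self.name = name

class TableClause:
    def __init__(self, name, *columns):
        self.name = name
        self.columns = columns
    def select(self):
        return Select(self)

class Select:
    def __init__(self, table):
        self.table = table

class AnsiCompiler:
    # ANSI-style double-quoted identifiers
    def quote(self, name):
        return '"%s"' % name
    def compile(self, select):
        t = select.table
        cols = ', '.join(self.quote(c.name) for c in t.columns)
        return "SELECT %s FROM %s" % (cols, self.quote(t.name))

class MySQLCompiler(AnsiCompiler):
    # same tree, different rendering: backtick quoting
    def quote(self, name):
        return '`%s`' % name

t = TableClause('mytable', ColumnClause('col1'), ColumnClause('col2'))
s = t.select()
print(AnsiCompiler().compile(s))   # SELECT "col1", "col2" FROM "mytable"
print(MySQLCompiler().compile(s))  # SELECT `col1`, `col2` FROM `mytable`
```

The point is that dialect differences live entirely in the compiler; the expression tree is reusable across backends.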

This is much more like what I'm doing in SQL-API.  You do something like:

>>> t = sqlapi.sql.Table('mytable')
>>> s = sqlapi.sql.Select([t['foo'], t['bar']], t['foo'] > 0)
# Abstract output, suitable for logging or debugging:
>>> print sqlapi.sql.sqlrepr(s)
SELECT mytable.foo, mytable.bar FROM mytable WHERE (mytable.foo > 0)
# Also abstract, using a dummy connection (the dummy is
# paramstyle=qmark, for instance):
>>> print sqlapi.sql.sqlparams(s)
('SELECT mytable.foo, mytable.bar FROM mytable WHERE (mytable.foo > ?)', (0,))
# this is clearly a bit of a roundabout way to instantiate:
>>> wrapper = sqlapi.connect.wrapper.ConnectionWrapper.from_uri('sqlite:/:cwd:/foo.db')
>>> wrapper.execute(s)

sqlapi.sql is just syntax abstraction, nothing higher level. It doesn't know what columns a table has, and certainly never what types columns are. I'm doing that inside the SQLObject class. One bit of introspection that does exist is determining what tables are in an expression, so that FROM can be automatically determined. I find this makes joins fairly easy to abstract.
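That table-collecting introspection amounts to a walk over the expression tree. Here is a hedged sketch of the idea using made-up classes (not sqlapi's real API): every Column knows its Table, so a recursive visit gathers the tables an expression touches, and FROM can be derived from that.

```python
# Hypothetical expression classes, just to illustrate the tree walk.
class Table:
    def __init__(self, name):
        self.name = name
    def __getitem__(self, col):
        return Column(self, col)

class Column:
    def __init__(self, table, name):
        self.table = table
        self.name = name
    def __gt__(self, other):
        return BinOp(self, '>', other)

class BinOp:
    def __init__(self, left, op, right):
        self.left, self.op, self.right = left, op, right

def tables_in(expr):
    """Recursively collect the names of tables used in an expression."""
    if isinstance(expr, Column):
        return {expr.table.name}
    if isinstance(expr, BinOp):
        found = set()
        for side in (expr.left, expr.right):
            found |= tables_in(side)
        return found
    return set()  # literals contribute no tables

t = Table('mytable')
u = Table('other')
expr = BinOp(t['foo'] > 0, 'AND', BinOp(u['id'], '=', t['foo']))
print(sorted(tables_in(expr)))   # ['mytable', 'other']
```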

Additionally, SQL expressions are passed the object described in the interfaces file (http://sqlobject.org/sqlapi/module-sqlapi.interfaces.html) and are supposed to use that to translate pieces as appropriate. So "standard" SQL (or potentially some superset) is used as a base, and then backends can translate them. This includes things like translating the parameters, so you'd use datetime.datetime everywhere and backends that didn't like that type would translate it to whatever they did like.
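The parameter-translation part of that can be sketched as a per-backend hook (all names here are hypothetical, not the interface file's actual methods): code builds parameters with standard types like datetime.datetime, and a backend whose driver dislikes a type converts it on the way out.

```python
import datetime

class DefaultBackend:
    def translate_param(self, value):
        return value  # most drivers take datetimes natively

class StringlyBackend(DefaultBackend):
    # a backend whose driver wants ISO strings instead of datetime objects
    def translate_param(self, value):
        if isinstance(value, datetime.datetime):
            return value.isoformat(sep=' ')
        return value

params = (datetime.datetime(2006, 3, 1, 12, 30), 42)
translated = tuple(StringlyBackend().translate_param(p) for p in params)
print(translated)   # ('2006-03-01 12:30:00', 42)
```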

I'm not terribly happy about how that second part works, though. But one positive part is that the portability layer doesn't need to know anything about the SQL abstraction layer.

--
Ian Bicking  /  [EMAIL PROTECTED]  /  http://blog.ianbicking.org


_______________________________________________
Sqlalchemy-users mailing list
Sqlalchemy-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/sqlalchemy-users
