Right, this would be a global "type policy" object that the Dialects would consult to determine the appropriate conversion to make. Right now that's done by convention, this would make things more explicit and allow global overrides of type conversion policy.

It makes sense to leave the actual conversion to the Dialects themselves, because there may be some db-specific hacks needed, or it may be possible for some DBAPI modules to be programmed to convert automatically according to the policy (I think that's possible with psycopg's adaptation layer, for instance). Maybe the policy object could expose a library of standard conversion routines to keep things more or less standardized between the Dialects.

As to what would go in the Dialect and what would go in the global Policy object, I guess that's still to be determined. I would suggest a good rule of thumb: keep the really DB-specific stuff in the Dialect, and reserve the global object for the "across the board" type of things.
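A minimal sketch of that division of labor; TypePolicy and its methods are hypothetical names here, not existing SQLAlchemy API:

from decimal import Decimal

class TypePolicy(object):
    """Hypothetical global policy object that Dialects would consult."""

    def __init__(self, numeric_type=Decimal, string_type=str):
        self.numeric_type = numeric_type
        self.string_type = string_type

    # a small library of standard conversion routines, so Dialects that
    # can't push the policy down into the DBAPI still behave consistently
    def convert_numeric(self, value):
        if value is None:
            return None
        return self.numeric_type(str(value))

    def convert_string(self, value):
        if value is None:
            return None
        return self.string_type(value)

# inside a Dialect, the db-specific fetching stays put; only the final
# coercion is delegated to the shared policy:
#     value = self.policy.convert_numeric(raw_value)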

Rick




On 6/4/06, Michael Bayer <[EMAIL PROTECTED]> wrote:
You can override types at the column level, even if you use reflection:

table = Table('sometable', meta,
    Column('somecolumn', MyType),
    autoload=True
)

which will reflect every column except "somecolumn". So we have that right now.
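For illustration, a MyType like that could be built on TypeDecorator. The process_bind_param / process_result_value hooks shown here are the API of later SQLAlchemy releases, so treat this as a sketch of the idea rather than period-accurate code:

from decimal import Decimal
from sqlalchemy import types

class MyType(types.TypeDecorator):
    """A numeric column that always hands the application a Decimal."""
    impl = types.Numeric

    def process_bind_param(self, value, dialect):
        # outgoing: pass through; the DBAPI copes with Decimal/float/str
        return value

    def process_result_value(self, value, dialect):
        # incoming: normalize whatever the DBAPI chose to return
        return None if value is None else Decimal(str(value))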

I usually see the Dialect as the "thing between the python and database worlds". It's where you determine the exact "language" you're going to speak with the database, and it has options on its behavior. And for this typing thing, I think we need a way to specify behaviors "across the board" in addition to per-column, since otherwise people have to have explicit types all over the place and it dilutes the usefulness of reflection.

In reality, type-based decisions are always ultimately made at the "metadata" level, since they're determined by TypeEngine objects attached to Column objects. But those objects just need some place to get global information about what behavior they should have.
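Something like this hypothetical lookup, in other words; type_map is not a real Dialect attribute, it just marks where the global information would be consulted:

class TypeEngine(object):
    def dialect_impl(self, dialect):
        # hypothetical: check a dialect-level map of global overrides
        # before falling back to this type's own default behavior
        override = getattr(dialect, 'type_map', {}).get(type(self))
        return override() if override is not None else self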

That global info could live in an actual MetaData object, but then how do we decide which options go at the MetaData level and which ones go in the Dialect (i.e. create_engine)? Users would have to know which options go where, which might seem kind of arbitrary. Also, I would worry that MetaData would just become another Engine again, with database behavior undesirably bound to a schema description.


On Jun 3, 2006, at 11:15 PM, Rick Morrison wrote:

I'm feeling it more on the MetaData level.

You're looking at two interfaces that any SA user is going to be concerned with:

The first is at the database level: those types are necessarily fixed, either by an existing schema or by the type support or restrictions of the database itself.

The second is at the resultset / mapped-object level. This is where the user is going to want some control over the data types, to fit their idea of how to work with the data.

What's between these two worlds is the MetaData. Right now, the conversion is implicit and controlled by the DB engine (or maybe more exactly, by the DBAPI interface). But during that conversion, the metadata for the column is available, right? What if, instead of the implicit conversion, there were some sort of pluggable adaptation mechanism that used reasonable defaults but allowed some sort of override?

This way, whether you fetched a row from PG, or Sqlite, or whatever, you'd know you could morph it into what you need.
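A rough sketch of that pluggable mechanism; the names (DEFAULT_ADAPTERS, adapt_row) and the string type keys are invented for illustration:

from decimal import Decimal

# reasonable defaults, keyed by the column's metadata type
DEFAULT_ADAPTERS = {
    'NUMERIC': lambda v: Decimal(str(v)),
    'VARCHAR': lambda v: v,
}

def adapt_row(row, column_types, overrides=None):
    """Convert one raw DBAPI row using column metadata, honoring overrides."""
    adapters = dict(DEFAULT_ADAPTERS)
    adapters.update(overrides or {})
    identity = lambda v: v
    return tuple(
        None if value is None else adapters.get(ctype, identity)(value)
        for value, ctype in zip(row, column_types)
    )

# the same raw row, morphed two different ways:
#   adapt_row((1.5, 'x'), ['NUMERIC', 'VARCHAR'])                      -> (Decimal('1.5'), 'x')
#   adapt_row((1.5, 'x'), ['NUMERIC', 'VARCHAR'], {'NUMERIC': float}) -> (1.5, 'x')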

Rick


On 6/3/06, Michael Bayer <[EMAIL PROTECTED]> wrote:
Right, unless they do the "override column type" thing, which is inconvenient if they want Decimal all over the place.

I'm thinking that with this, along with the various unicode requests we've had, some real engine/dialect-level "this is the official map of types I'd like you to use" kind of thing needs to be created, something more generic than having flags like "convert_unicode" and such. But then it would also have to be easy. Not sure. Here are some ideas:

Explicit keywords:

x = create_engine(url, string_type=Unicode, numeric_type=Decimal)

Translate Python types to SQL types (I think this is problematic):

x = create_engine(url, types = {str : Unicode, float : Decimal})

Translate generic SA types to more specific SA types:

x = create_engine(url, types = {String: Unicode, Numeric: Decimal})

Just list out the more specific types, which "know" how to replace their more generic types. I sort of like this one, but I'm not sure if it works:

x = create_engine(url, use_types = [Unicode, Decimal])
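That last idea could lean on the type hierarchy itself, since Unicode subclasses String; resolve_type here is a hypothetical helper, not real API:

from sqlalchemy.types import String, Unicode

def resolve_type(generic, use_types):
    """Pick the most specific replacement listed for a generic type."""
    for specific in use_types:
        if issubclass(specific, generic):
            return specific
    return generic

# reflection yields String; the engine quietly swaps in Unicode:
#   resolve_type(String, use_types=[Unicode])  -> Unicode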




On Jun 3, 2006, at 4:55 PM, Rick Morrison wrote:

Or maybe a different column type, so that the user can control what comes back. Of course, that wouldn't work with table reflection....

Rick


On 6/3/06, Michael Bayer <[EMAIL PROTECTED]> wrote:
For this one, I would think "float" would be the default. Then what do we do for people who specifically want a Decimal? Sounds like another flag-like situation, like "convert_unicode".
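That flag could look something like the following; this is a standalone sketch, not the real Numeric type, and asdecimal is an invented name here:

from decimal import Decimal

class Numeric(object):
    """Sketch of a flag-controlled numeric type."""

    def __init__(self, asdecimal=False):
        self.asdecimal = asdecimal

    def convert_result_value(self, value):
        if value is None:
            return None
        # default stays float; the flag opts a column into Decimal
        return Decimal(str(value)) if self.asdecimal else float(value)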

On Jun 3, 2006, at 2:16 PM, Rick Morrison wrote:


On 6/3/06, Michael Bayer <[EMAIL PROTECTED]> wrote:

The Numeric types in SQLAlchemy don't modify the incoming or outgoing return type of bind parameters at all.

Maybe they should. Letting the DBAPI driver "do what it wants" for type conversion breaks the database independence that SA purports to provide, no?
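That divergence is easy to see with the DBAPIs alone: sqlite3 hands back a float for a NUMERIC column, while psycopg would return a decimal.Decimal for the equivalent Postgres column:

import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE t (x NUMERIC)')
conn.execute('INSERT INTO t VALUES (1.1)')
value = conn.execute('SELECT x FROM t').fetchone()[0]
print(type(value))  # <class 'float'>; psycopg would give decimal.Decimal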

Rick







