Hi,

It is my understanding that when one adds an object instance to a
session, SQLAlchemy checks whether it already holds that same object,
based on its primary key. This becomes apparent when one adds the same
instance twice: the second flush() does not execute any SQL.

My issue occurs when two object instances, created in different parts
of my code, are filled with the same values: from a SQL point of view
they are the same object (given the unique constraints), but from a
Python point of view they are two different objects (instances). In
SQL we guard against duplicates through unique constraints (and/or
primary keys), but SQLAlchemy only seems to check for the existence of
_instance_key in the identity map. Does this mean SQLAlchemy has no
knowledge of the unique constraints in the table definition? And if
so, is there a way to change this behavior without querying the
database? I was expecting to find something along the lines of merge()
or save_or_update(), perhaps in combination with a custom __cmp__-like
method on the class?
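
To make that last idea concrete, this is roughly what I was imagining:
value-based equality defined on the class (via __eq__/__hash__, or a
__cmp__ along the same lines), which the session could then somehow
consult on add() or merge(). To be clear, the session actually
honoring such methods is purely my wish here, not behavior I have
found documented anywhere:

class MyObject(object):
    def __init__(self, name, value):
        self.name = name
        self.value = value

    # Hypothetical: equality based on the columns in the unique
    # constraint (name, value), which I hoped the session could use
    # in addition to (or instead of) the identity map / _instance_key.
    def __eq__(self, other):
        return isinstance(other, MyObject) and \
            (self.name, self.value) == (other.name, other.value)

    def __hash__(self):
        return hash((self.name, self.value))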

Consider the following (over)simplified example:

from sqlalchemy import *
from sqlalchemy.orm import *

engine = create_engine('sqlite:///:memory:', echo=True)
metadata = MetaData()

Session = sessionmaker(autoflush=False, bind=engine)
session = Session()

class MyObject(object):
    def __init__(self, name, value):
        self.name = name
        self.value = value

    def __repr__(self):
        return "<MyObject(name=%s, value=%s)>" % \
            (self.name, self.value)

myObjectsTable = Table("myobjects", metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(25)),
    Column('value', String(25)),
    UniqueConstraint('name', 'value'),
)

metadata.create_all(engine)

mapper(MyObject, myObjectsTable)

myObject1 = MyObject('myObject1', 'value1')
myObject2 = MyObject('myObject2', 'value2')
session.add(myObject1)
session.add(myObject2)
session.flush() # writes to DB as expected
session.add(myObject1)
session.flush() # does nothing, as expected
myObject3 = MyObject('myObject1', 'value1')
session.add(myObject3)
session.flush() # raises an SQL IntegrityError, as expected

Of course the last session.flush() will not raise the exception
without the UniqueConstraint in the table definition, and the database
will then simply contain a duplicate entry under a different primary
key. One could argue that a composite primary key would be more
appropriate here, but a) this is a simplified example, b) that would
be messy in a relational database, and c) I think it would be logical
for SQLAlchemy to have knowledge of (and actually mirror) the
constraints in the database. But perhaps there are valid arguments
against this.
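
For completeness, the only way I currently see to guard against this
on the Python side is to query for the unique columns first, along
the lines of the snippet below, but that is exactly the extra
round-trip to the database I was hoping to avoid:

# Current workaround (which I'd like to avoid): look the row up by
# its unique columns before deciding whether to add a new instance.
existing = session.query(MyObject).filter_by(
    name='myObject1', value='value1').first()
if existing is None:
    session.add(MyObject('myObject1', 'value1'))
session.flush()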

Regards,

Arn Vollebregt