On 4/23/15 5:15 PM, Jeffrey Yan wrote:
I have a couple of tables that I want to reflect. The first is a data
table where one of the columns is a foreign key to the second table.
If I used SQLAlchemy's declarative_base, a query might look something
like this:
session.query(Client.name, Suburb.label).join(Suburb)
# In the Client class there is an attribute:
#     suburb_id = Column(Integer, ForeignKey(Suburb.id))
However, this foreign key is not specified in the schema (we're using
postgres 9.2), but we know that all the columns that look like
something_id are foreign keys, so I've been defining them that way in
SQLAlchemy.
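(For reference, the manual setup being described presumably looks
roughly like the sketch below; Client, Suburb, name, label and
suburb_id come from the query above, everything else is assumed:)

from sqlalchemy import Column, Integer, String, ForeignKey
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Suburb(Base):
    __tablename__ = 'suburb'
    id = Column(Integer, primary_key=True)
    label = Column(String)

class Client(Base):
    __tablename__ = 'client_table_1'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    # the "something_id" naming convention, turned into a real
    # ForeignKey by hand
    suburb_id = Column(Integer, ForeignKey(Suburb.id))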
My problem is that although we have a fixed number of property tables
(suburb, country, join_date, ...), each data table (per client) can
have a different set of columns.

This hasn't been much of a problem so far, since we only have a few
/types/ of client data tables, so the combinations have been limited.
However, I'd like to cater for changes in the future.

If I reflect the table using SQLAlchemy, the resultant Table will not
have the ForeignKey columns that it would have if I had defined it
manually. Is there a way to add these in after reflection?
You can do this by instrumenting the process of the Table being built
up. It's weird enough that I had to write a demo to verify it works,
so here it is:
from sqlalchemy import create_engine
from sqlalchemy import event
from sqlalchemy import MetaData, ForeignKey, Column
import re
e = create_engine("postgresql://scott:tiger@localhost/test")
conn = e.connect()
trans = conn.begin()
conn.execute("""
create table foo (id integer primary key)
""")
conn.execute("""
create table bar (id integer primary key, foo_id integer)
""")
metadata = MetaData()
@event.listens_for(Column, "before_parent_attach")
def associate_fk(column, table):
    # if you want to limit the event's scope; a good idea,
    # else this will take place for Column objects everywhere
    if table.metadata is not metadata:
        return

    m = re.match(r"(.+)_id", column.name)
    if m:
        tname = m.group(1)
        column.append_foreign_key(ForeignKey('%s.id' % tname))

metadata.reflect(conn)
foo = metadata.tables['foo']
bar = metadata.tables['bar']
assert bar.c.foo_id.references(foo.c.id)
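As a quick follow-up to the demo, once the ForeignKey is attached the
join condition between the reflected tables can be derived
automatically (same foo / bar tables and connection as above; the
list-style select() is the 1.x form):

from sqlalchemy import select

# the join condition is worked out from bar.foo_id -> foo.id;
# this should render roughly: bar JOIN foo ON foo.id = bar.foo_id
print(bar.join(foo))

# and the reflected tables behave as usual in queries
conn.execute("insert into foo (id) values (1)")
conn.execute("insert into bar (id, foo_id) values (10, 1)")
stmt = select([bar.c.id, foo.c.id]).select_from(bar.join(foo))
print(conn.execute(stmt).fetchall())   # e.g. [(10, 1)]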
Or is my only option to use reflected tables and explicit join
conditions? Something like:
client_table_1 = Table('client_table_1', metadata, autoload=True,
                       autoload_with=engine, schema='client_1')

# Explicit joins only from now on
session.query(client_table_1.c.name, Suburb.label).join(
    Suburb, client_table_1.c.suburb_id == Suburb.id)
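For what it's worth, with the before_parent_attach listener above in
place at reflection time, client_table_1.c.suburb_id should come back
carrying its ForeignKey, so the ON clause can be derived rather than
spelled out. A rough sketch, assuming Suburb is a declarative class
whose Base shares the MetaData the client tables are reflected into
(so that ForeignKey('suburb.id') resolves):

# the Core-level join derives its ON clause from the foreign key
joined = client_table_1.join(Suburb.__table__)
session.query(client_table_1.c.name, Suburb.label).select_from(joined)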