My progress so far: I set MetaData.schema before importing my declarative objects (which associates ALL my tables with that schema), and then filter down to the tables that I know belong to that particular schema. However, that only puts a limited subset of the tables into the generated version script, and I want all modifications in one script.
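A minimal sketch of that setup, under stated assumptions: the schema name "client1", the Widget model, and the CLIENT_TABLES whitelist are all placeholders, not part of the real application.

```python
from sqlalchemy import Column, Integer, MetaData
from sqlalchemy.orm import declarative_base

# Set the schema on the shared MetaData *before* any model classes are
# defined, so every Table created through the declarative base inherits it.
metadata = MetaData(schema="client1")  # "client1" is a placeholder name
Base = declarative_base(metadata=metadata)

class Widget(Base):  # stand-in model; the real models are assumed
    __tablename__ = "widget"
    id = Column(Integer, primary_key=True)

# Hypothetical whitelist of tables known to belong to this schema,
# used to filter what autogenerate is allowed to touch.
CLIENT_TABLES = {"widget"}

def include_object(obj, name, type_, reflected, compare_to):
    """Alembic include_object hook: keep only whitelisted tables."""
    if type_ == "table":
        return name in CLIENT_TABLES
    return True  # let non-table objects (indexes, etc.) through
```

The limitation described above follows directly: because MetaData.schema is a single value and include_object filters against one whitelist, each autogenerate run only sees that one schema's slice of the model.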
On Tuesday, January 14, 2014 3:09:55 PM UTC+1, Dimitris Theodorou wrote:
> Hi,
>
> Also posted this question at
> http://stackoverflow.com/questions/21109218/alembic-support-for-multiple-postgres-schemas,
> not sure what the best place to continue is.
>
> My problem is the following: I have one SQLAlchemy model with no schema
> specifications. In my database, though, I duplicate several tables over
> multiple schemas, which correspond to different application users. Each
> schema contains a subset of the SQLAlchemy model's tables. The schema is
> set at application run time to the proper value based on the logged-in
> user, for example with session.execute("SET search_path TO client1,shared").
> I also have a shared schema that contains some tables; the rest of the
> tables are duplicated over multiple schemas.
>
> I want to use Alembic's --autogenerate to migrate all my schemas. The
> default --autogenerate behavior detects multiple schemas that do not
> exist in the model, and ends up deleting those schemas and re-creating
> every table in the default schema.
>
> I would really like to use --autogenerate, though, with the proper
> plumbing to set the schemas correctly. Any suggestions on if/how
> Alembic's API can do this?
>
> Regards,
> Dimitris
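For what it's worth, the direction this seems to point in is Alembic's multi-schema support: passing include_schemas=True to context.configure so autogenerate compares across schemas, plus an include_object hook so it only manages the schemas we own and does not try to drop the other tenants' copies. A sketch of the relevant env.py fragment, with assumed schema names "shared" and "client1" and the engine setup elided:

```python
# env.py fragment (untested sketch; schema names are assumptions)
from alembic import context

TENANT_SCHEMAS = {"shared", "client1"}

def include_object(obj, name, type_, reflected, compare_to):
    # Only consider tables in schemas we manage, so autogenerate does
    # not emit drops for tables it finds in other tenants' schemas.
    if type_ == "table":
        return obj.schema in TENANT_SCHEMAS
    return True

def run_migrations_online():
    connectable = ...  # engine setup elided, as in a stock env.py
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            include_schemas=True,   # reflect and compare across schemas
            include_object=include_object,
        )
        with context.begin_transaction():
            context.run_migrations()
```

With include_schemas=True the model's tables must carry explicit schema qualifiers (or the MetaData.schema trick from the reply above) for the comparison to line up, which is exactly the plumbing being discussed.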
