Hi everyone. I took another stab at my chemical database program with SQLAlchemy 0.2.2 and got the foreign-character and other issues straightened out. My question now is: what does it mean that a DynamicMetaData can be bound to multiple engines? Does that mean simultaneously, or switching from one to another?

I've got an API class with high-level methods for all database queries, and a concrete API_SQLAlchemy subclass. The URL is passed to the constructor, and the engine is saved as self._engine. The trick is that there may be two instances active in one application, one for MySQL and one for SQLite, either to copy data from one database to the other or to compare the two databases. I've got my metadata and tables in global variables, and at the beginning of every method I connect to the instance's engine, like so:
    META = DynamicMetaData()
    CHEMICALS = Table("Chemical", META, Column(...), ...)
    SEARCH_NAME = Table("SearchName", META, Column(...), ...)

    class API_SQLAlchemy(API):
        def __init__(self, chemicals, url, **kw):
            self._engine = engine = create_engine(url, **kw)
            if chemicals is not None:
                # Create a new db, overwriting any previous one.
                META.connect(engine)
                META.drop_all()
                META.create_all()
                self._add_chemicals(chemicals)

        def _add_chemicals(self, chemicals):
            META.connect(self._engine)
            insert = CHEMICALS.insert()
            for chem in chemicals:
                # Make a dictionary 'dic' from object 'chem'.
                # Not using SQLAlchemy's ORM yet.
                insert.execute(dic)

It all works, and the application is single-threaded, so there's no concurrency. I'm just wondering whether it's safe to use a global META variable this way when two instances are active. Does META.connect() just change the connection, like moving a power cord from one outlet to another? Or does it accumulate multiple connections simultaneously? Would I have to worry about two databases being updated at once if two different engines are connected to the same metadata at the same time? I tried to trace through the SQLAlchemy source, but the chain of method calls and subclasses was so deep I couldn't figure it out.

Is there a way to keep the unbound metadata in the global variable and then copy it to a bound metadata in the constructor? Then I could have self._meta instead of self._engine. I see how to copy a single table to another metadata, but not how to clone the whole metadata. Or is there a better way to do this?

By the way, the reason I'm inserting records one by one, rather than making a list of all 6112 records and inserting them at once, is that I get a "max allowable packet size exceeded" exception from MySQL. Should SQLAlchemy perhaps chunk the request into reasonably sized packets? I could do it myself with, say, 20 or 50 records at a time, but that means doing busywork the library could handle.
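To make the "clone the metadata per instance" idea concrete, here is a plain-Python sketch of the shape I'm after, with no SQLAlchemy calls at all. MetaRegistry, TableDef, and bound_copy are made-up names standing in for MetaData, Table, and whatever cloning mechanism might exist; I don't know whether SQLAlchemy 0.2 offers anything like this.

```python
import copy

class TableDef:
    """Stand-in for a Table: a name plus a back-reference to its registry."""
    def __init__(self, name, meta):
        self.name = name
        self.metadata = meta
        meta.tables[name] = self

class MetaRegistry:
    """Stand-in for MetaData: holds table definitions and at most one engine."""
    def __init__(self):
        self.tables = {}
        self.engine = None

    def bound_copy(self, engine):
        # Deep-copy the table definitions so each API instance gets its
        # own registry bound to its own engine; the global stays unbound.
        clone = copy.deepcopy(self)
        clone.engine = engine
        return clone

# One global, unbound registry of table definitions...
META = MetaRegistry()
TableDef("Chemical", META)
TableDef("SearchName", META)

# ...and one bound copy per API instance.
mysql_meta = META.bound_copy("mysql-engine")
sqlite_meta = META.bound_copy("sqlite-engine")
```

With that shape, each instance would hold self._meta instead of self._engine, and the two instances could never step on each other's connection.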
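For reference, the manual chunking workaround I mean would look something like this. The chunk size of 50 is arbitrary, and passing a list of dictionaries to execute() for a multi-row insert is my assumption about the call style; the helper names are my own.

```python
def chunked(seq, size):
    """Yield successive slices of 'seq', each no longer than 'size'."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]

def add_chemicals_chunked(insert, rows, chunk_size=50):
    # Each execute() call receives at most chunk_size row-dictionaries,
    # so MySQL never sees more than one chunk per packet.
    for batch in chunked(rows, chunk_size):
        insert.execute(batch)
```

It works, but it's exactly the kind of busywork I'd rather the library handled for me.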
--
Mike Orr <[EMAIL PROTECTED]>

_______________________________________________
Sqlalchemy-users mailing list
Sqlalchemy-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/sqlalchemy-users