Hello.

I have a somewhat unusual scenario.
In my case I need to be able to work with over 1,500 remote databases, and
I cannot change that fact.

Reflecting all of them is not feasible, because it consumes too much time
and too many resources, so I need to generate models for them instead.

Of course this process also takes a long time, and the cost depends on the
dialect's inspector implementation. For example, getting the column
definitions of a single table emits one query on the MySQL dialect, but
four on PostgreSQL.
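One way I have been thinking about reducing that per-startup reflection cost (a rough sketch, assuming SQLAlchemy; the `load_metadata` helper, cache path, and URLs are my own illustrative names) is to reflect each schema once and pickle the resulting MetaData, so later runs skip the inspector queries entirely:

```python
import os
import pickle

from sqlalchemy import create_engine, MetaData


def load_metadata(url, cache_path):
    """Return MetaData for the database at `url`, using a pickle cache.

    Illustrative helper: on the first call it pays the full inspector
    cost of MetaData.reflect(); afterwards it loads the pickled result
    with no database round-trips at all.
    """
    if os.path.exists(cache_path):
        with open(cache_path, "rb") as f:
            return pickle.load(f)  # no inspector queries needed

    engine = create_engine(url)
    metadata = MetaData()
    metadata.reflect(bind=engine)  # pays the reflection cost once

    with open(cache_path, "wb") as f:
        pickle.dump(metadata, f)
    return metadata
```

Table and MetaData objects are picklable, so this should work for caching, though I am not sure whether loading 1,500 pickles is actually cheaper than loading generated model code.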

Even when I dump all of them into declarative base models, it is still a
huge amount of data that needs to be parsed and loaded.

So my question: is it possible to share table/column definitions across
different database models, to reduce the amount of resources used?
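What I have in mind is something like the following (a minimal sketch, assuming the remote databases share the same schema; the `User` model, URLs, and the `get_session` helper are illustrative): define the model classes exactly once, and bind a Session to whichever engine is needed at runtime, so there is one set of Table/Column objects no matter how many databases exist.

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class User(Base):
    """Defined exactly once, reused against every remote database."""
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    name = Column(String(50))


# One Engine per remote database, created lazily on first use.
engines = {}


def get_session(url):
    """Return a Session bound to the engine for the given database URL."""
    if url not in engines:
        engines[url] = create_engine(url)
    return Session(bind=engines[url])
```

That way the cost of the model definitions is paid once, and only the engine/connection state is per-database. I am just not sure whether this approach breaks down when the schemas differ slightly between databases.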

If anyone has an idea of how to organize/optimize the model structure for
such an application, please share. :)


Best regards.




