I hope this is the right place for this... I need to load large files into 
my database. As I understand it, I can do this in one of two ways with 
SQLAlchemy Core: 1) Bring the data into Python and write it out with 
batched INSERTs (Core's *execute* with a list of rows; *add* is the ORM's 
equivalent) or, alternatively, 2) Use SQLAlchemy to issue a command 
telling the DB to run its native bulk loader against my file. I would 
expect this second approach to be faster, to require less code, and to 
avoid problems such as holding too much in memory at one time. However, 
it is DB-vendor-specific (i.e. the command I send to a MySQL DB will 
differ from the one I send to a Postgres DB). 
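
For concreteness, here is a minimal sketch of option 1, assuming the 
input is a CSV file with a header row whose column names match the 
table's (the table, file name, connection URL, and chunk size are all 
made up for illustration):

    import csv
    from sqlalchemy import Column, MetaData, String, Table, create_engine

    # Hypothetical table and connection URL.
    engine = create_engine("postgresql://user:pass@localhost/mydb")
    metadata = MetaData()
    measurements = Table(
        "measurements", metadata,
        Column("sensor", String),
        Column("value", String),
    )

    CHUNK = 10000  # rows per INSERT batch, to cap memory use
    with engine.begin() as conn, open("big_file.csv", newline="") as f:
        batch = []
        for row in csv.DictReader(f):  # dict keys must match column names
            batch.append(row)
            if len(batch) >= CHUNK:
                conn.execute(measurements.insert(), batch)  # executemany
                batch = []
        if batch:
            conn.execute(measurements.insert(), batch)

Chunking keeps memory bounded, but each row still makes a round trip 
through Python, which is why I'd expect the native loaders to win.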

So, 

   - Do I properly understand SQLAlchemy's capabilities here or am I 
   missing something?
   - If I do have this right, is generic access to native bulk loaders 
   something that is on the roadmap for a future release? (The kind of 
   vendor-specific dispatch I'd otherwise write myself is sketched below 
   this list.)

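And here is a sketch of what I mean by option 2, dispatching on the 
dialect name. The table and file names are again hypothetical; the 
Postgres branch relies on psycopg2's *copy_expert*, and the MySQL branch 
requires LOCAL INFILE to be enabled on both the server and the driver:

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql://user:pass@localhost/mydb")

    with engine.begin() as conn:
        if engine.dialect.name == "postgresql":
            # Stream the file through Postgres's native COPY via the
            # raw DBAPI (psycopg2) connection underneath SQLAlchemy.
            raw = conn.connection
            with raw.cursor() as cur, open("big_file.csv") as f:
                cur.copy_expert(
                    "COPY measurements FROM STDIN "
                    "WITH (FORMAT csv, HEADER true)",
                    f,
                )
        elif engine.dialect.name == "mysql":
            # MySQL's native loader; needs local_infile enabled in the
            # server settings and in the driver's connect arguments.
            conn.execute(text(
                "LOAD DATA LOCAL INFILE 'big_file.csv' "
                "INTO TABLE measurements "
                "FIELDS TERMINATED BY ',' IGNORE 1 LINES"
            ))

So the two SQL commands (and even the mechanism, COPY-over-stdin vs. a 
plain statement) differ per backend, which is the part I was hoping 
SQLAlchemy might abstract.
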
Sorry if this isn't the right place for this.
Thanks!
Ben
