Mike,
Thanks very much. I thought that might be the case, but I was hoping to avoid 
even chunking and get pure streaming. I'll go with that approach.
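For anyone finding this later, a minimal sketch of the chunked approach: batch the generator into small lists with itertools.islice and run one Core executemany per batch, so only one chunk of rows is in memory at a time. Table, column, and generator names here are hypothetical stand-ins (and I'm using engine.begin() rather than the older engine.execute() so the sketch runs on current SQLAlchemy as well):

```python
import itertools
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Hypothetical table standing in for MyTable.__table__; SQLite in-memory
# is used only so the sketch is self-contained.
engine = create_engine("sqlite://")
metadata = MetaData()
my_table = Table(
    "my_table", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String),
)
metadata.create_all(engine)

def my_dict_generator():
    # Stand-in for rows parsed lazily from a large .csv file.
    for i in range(10):
        yield {"name": "row-%d" % i}

def chunks(iterable, size):
    # Yield lists of at most `size` items drawn from a generator,
    # without ever materializing the whole generator.
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch

for batch in chunks(my_dict_generator(), 3):
    # Each execute() is a multi-row executemany over one small chunk,
    # so memory use is bounded by the chunk size, not the file size.
    with engine.begin() as conn:
        conn.execute(my_table.insert(), batch)
```

The chunk size is a tunable trade-off between memory use and per-statement overhead; a few thousand rows per chunk is a common starting point.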
FYI, I think I was thrown off by stream_results 
<http://docs.sqlalchemy.org/en/latest/core/connections.html#sqlalchemy.engine.Connection.execution_options.params.stream_results>.
I was thinking it might be possible to use a generator to stream a huge 
insert, but I believe I misunderstood stream_results, which seems to be 
about having the driver stream large SELECT results instead of buffering 
entire result sets.
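To illustrate the distinction, here is a minimal sketch of what stream_results actually does: it asks the DBAPI driver to fetch SELECT rows incrementally (e.g. via a server-side cursor on psycopg2) rather than buffering the whole result set. It has no effect on INSERTs. SQLite is used here only so the sketch is self-contained; on that driver the option is effectively a no-op:

```python
from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")

# Set up a small table to select from.
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE t (x INTEGER)"))
    conn.execute(text("INSERT INTO t (x) VALUES (1), (2), (3)"))

with engine.connect() as conn:
    # stream_results=True requests incremental fetching from the driver;
    # with a driver like psycopg2 this uses a server-side cursor so the
    # client never holds the full result set in memory.
    result = conn.execution_options(stream_results=True).execute(
        text("SELECT x FROM t ORDER BY x")
    )
    values = [row[0] for row in result]  # rows arrive as you iterate
```

So the option addresses memory on the read side only; for large writes, chunking the input is still the way to bound memory.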
Thanks again.
Andy 

On Thursday, June 16, 2016 at 12:32:08 PM UTC-4, Andy Crain wrote:
>
> Hi,
>
> I'm attempting to do a bulk insert from a large .csv file. I've read 
> through the various options at 
> http://docs.sqlalchemy.org/en/latest/faq/performance.html#i-m-inserting-400-000-rows-with-the-orm-and-it-s-really-slow, 
> and I would like to perform a bulk insert using Core, along the lines of:
>
> engine.execute(MyTable.__table__.insert(), [...])
>
> Instead of a list, though, I'd like to use a generator, as I'm very memory 
> constrained. When I try something like this:
>
> engine.execute(MyTable.__table__.insert(), (d for d in my_dict_generator()))
>
> ...I get "AttributeError: 'list' object has no attribute 'keys'".
>
> Is it not possible to use a generator in lieu of a list here?
>
> Thanks,
> Andy 
>

-- 
You received this message because you are subscribed to the Google Groups 
"sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/sqlalchemy.
For more options, visit https://groups.google.com/d/optout.
