On 2012-10-22, at 17:37, Jeremy Evans wrote:
> On Monday, October 22, 2012 12:28:35 PM UTC-7, François Beausoleil wrote:
> Hi!
>
> I have a large result set which I'm streaming to CSV files. I know I could
> use #sql to get the SQL and run that through psql to output plain old CSV
> files. I'm trying to avoid that, one reason being that at the point where I
> know the query, I have no access to the username / password.
>
> Relevant parts are:
>
> require "sequel"
> require "csv"
>
> io = File.open("output.csv", "w")
> DB = Sequel.connect("postgres://localhost/db")
> ds = DB[:large_table].filter(...) # millions of rows are returned
> CSV(io, :headers => false, :col_sep => ",") do |csv|
>   ds.each do |row|
>     values = [] # some code to do transformations
>     csv << values
>   end
> end
>
> I naively assumed #each didn't load the whole result set in memory. Is there
> a way to get 'streaming' results? I'm using PostgreSQL 9.1 from Ruby 1.9.2
> and Sequel 3.39.0, sequel_pg 1.6.0 and pg 0.14.1.
>
> Sequel 3.28+ supports Database#copy_table using the postgres adapter with the
> pg driver, which can be used to create CSV files from datasets very quickly,
> and it does stream results from the server.
Hmm, this is Database#copy_table, not Dataset#copy_table. I specifically need a
Dataset. I've gone down the
system("psql -c 'COPY ( SELECT * FROM ... ) TO STDOUT' > /path/to/outfile ...")
route, so I'm good for now.
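For anyone finding this thread later, a minimal sketch of that psql COPY route (the query text and output path are illustrative; in the real case the SELECT text comes from Dataset#sql):

```ruby
# Wrap a SELECT statement in a server-side COPY ... TO STDOUT command,
# so psql streams the rows out as CSV without Ruby ever holding the
# full result set in memory.
def copy_command(select_sql)
  "COPY (#{select_sql}) TO STDOUT WITH (FORMAT csv)"
end

# Illustrative query; the real one would be ds.sql from the dataset.
select_sql = "SELECT id, name FROM large_table WHERE active"
cmd = copy_command(select_sql)

# Then shell out, redirecting psql's stdout to the outfile, e.g.:
#   system("psql", "db", "-c", cmd, :out => "output.csv")
```

The WITH (FORMAT csv) option is PostgreSQL 9.0+ syntax, so it applies to the 9.1 server mentioned above.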
Thanks for pointers everyone!
François
--
You received this message because you are subscribed to the Google Groups
"sequel-talk" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/sequel-talk?hl=en.