Hi!

I have a large result set which I'm streaming to CSV files. I know I could use 
#sql to get the SQL, run that through psql, and output plain old CSV files. 
I'm trying to avoid that, one reason being that at the point where I know the 
query, I have no access to the username / password.

Relevant parts are:

require "csv"
require "sequel"

DB = Sequel.connect("postgres://localhost/db")
ds = DB[:large_table].filter(...) # millions of rows are returned

io = File.open("output.csv", "w")
CSV(io, :headers => false, :col_sep => ",") do |csv|
  ds.each do |row|
    values = [] # some code to do transformations
    csv << values
  end
end
io.close

I naively assumed #each didn't load the whole result set into memory. Is there 
a way to get 'streaming' results? I'm using PostgreSQL 9.1 from Ruby 1.9.2, 
with Sequel 3.39.0, sequel_pg 1.6.0 and pg 0.14.1.
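For context, the pattern I'm after is row-at-a-time iteration feeding the CSV writer, so memory stays flat regardless of result-set size. Here's a sketch of what I mean, with a stand-in enumerator in place of the real dataset (in the real script that line would be something like the postgres adapter's `ds.use_cursor` — my guess from the docs, so treat that call as an assumption):

```ruby
require "csv"
require "stringio"

# Stand-in for the dataset: yields one row at a time, never building
# the full result set in memory. In the real script this would be
# DB[:large_table].filter(...).use_cursor (assumption: the postgres
# adapter's cursor support fetches rows in batches server-side).
rows = Enumerator.new do |y|
  3.times { |i| y << { :id => i, :name => "row#{i}" } }
end

out = StringIO.new # stands in for File.open("output.csv", "w")
CSV(out, :headers => false, :col_sep => ",") do |csv|
  rows.each do |row|
    values = [row[:id], row[:name]] # transformations would go here
    csv << values
  end
end
```

Each row is transformed and written as soon as it's yielded, so only one row is live at a time.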

Thanks!
François

-- 
You received this message because you are subscribed to the Google Groups 
"sequel-talk" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/sequel-talk?hl=en.