On Monday, October 22, 2012 12:28:35 PM UTC-7, François Beausoleil wrote:
>
> Hi!
>
> I have a large result set which I'm streaming to CSV files. I know I could
> use #sql to get the SQL and run that through psql to output plain old CSV
> files. I'm trying to avoid that, one of the reasons being that at the point
> where I know the query, I have no access to the username / password.
>
> Relevant parts are:
>
> io = File.open("output.csv", "w")
> DB = Sequel.connect("postgres://localhost/db")
> ds = DB[:large_table].filter(...) # millions of rows are returned
> CSV(io, :headers => false, :col_sep => ",") do |csv|
>   ds.each do |row|
>     values = [] # some code to do transformations
>     csv << values
>   end
> end
>
> I naively assumed #each didn't load the whole result set in memory. Is
> there a way to get 'streaming' results? I'm using PostgreSQL 9.1 from Ruby
> 1.9.2 and Sequel 3.39.0, sequel_pg 1.6.0 and pg 0.14.1.
>
Sequel 3.28+ supports Database#copy_table in the postgres adapter when using
the pg driver. It can be used to create CSV files from datasets very quickly,
and it does stream the results from the server rather than loading them all
into memory.
Thanks,
Jeremy
--
You received this message because you are subscribed to the Google Groups
"sequel-talk" group.
To view this discussion on the web visit
https://groups.google.com/d/msg/sequel-talk/-/LdHZ-2g7cN8J.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/sequel-talk?hl=en.