On Friday 10 July 2009, gsterndale wrote:
> I agree 100% that using FasterCSV and ActiveRecord to pull the data
> is much more portable (and elegant). In fact, that's how I'm doing it
> now. However, this is a huge dataset that is causing server timeouts
> and hogs memory. I'm investigating csv generation in Postgres as it
> takes a fraction of the time and resources because each object isn't
> getting instantiated.
>
> Any thoughts?

If your dataset is truly huge, an approach that relies on first 
extracting the entire dataset and then serving it is not going to work 
reliably. Even if it doesn't fail outright due to a timeout, it is 
still going to strain your memory considerably.

As far as I can see, there are two ways around this.

If the export doesn't have to be current to this very instant, set up a 
background job that exports it regularly to a file and leave the serving 
to the web server.
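A minimal sketch of such a job (the `each_order` data source here is a 
stand-in I made up; in the real app you'd iterate with something like 
Order.find_each so rows are fetched in batches and memory stays flat):

```ruby
require 'csv'   # FasterCSV became the standard CSV library in Ruby 1.9

# Hypothetical data source standing in for a batched ActiveRecord scan,
# e.g. Order.find_each(:batch_size => 1000) { |o| ... }
def each_order
  [[1, 'alice', '12.50'], [2, 'bob', '7.25']].each { |row| yield row }
end

EXPORT_PATH = 'orders.csv'
TMP_PATH    = EXPORT_PATH + '.tmp'

# Write to a temp file first, then rename, so the web server never
# serves a half-written export while the job is running.
CSV.open(TMP_PATH, 'w') do |csv|
  csv << %w[id customer total]
  each_order { |row| csv << row }
end
File.rename(TMP_PATH, EXPORT_PATH)
```

Run it from cron or your background-job runner of choice, and point the 
download link at the finished file.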

If that doesn't meet your needs, consider streaming the data. Have a 
look at the send_data (controller) method. It may even be sensible to 
handle this in Rails Metal, i.e. as a Rack endpoint. I haven't tried any 
of this myself, but the following link might be helpful:

http://amberbit.com/blog/2009/04/15/ruby-flv-pseudostreaming-implemented-using-sinatra-and-rack-evil-useful-for-rails-too/
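The Rack route is less magic than it sounds: a Rack response body only 
has to respond to each, so you can yield the CSV a row at a time and 
never hold the whole export in memory. A rough sketch (untested against 
a real server; the class name and the in-memory rows are my own 
invention, and in the app the rows would come from a batched 
ActiveRecord scan):

```ruby
require 'csv'

# A minimal Rack-style app: any object responding to call(env).
# Because the body responds to each, a Rack-compatible server can
# flush every yielded chunk to the client as it is produced instead
# of buffering the entire export first.
class CsvStream
  def initialize(rows)
    @rows = rows  # in the app: an enumerator over find_each batches
  end

  def call(env)
    headers = {
      'Content-Type'        => 'text/csv',
      'Content-Disposition' => 'attachment; filename="export.csv"'
    }
    body = Enumerator.new do |yielder|
      yielder << CSV.generate_line(%w[id name])
      @rows.each { |row| yielder << CSV.generate_line(row) }
    end
    [200, headers, body]
  end
end

# Exercising it directly, without a server:
status, headers, body = CsvStream.new([[1, 'alice'], [2, 'bob']]).call({})
chunks = []
body.each { |chunk| chunks << chunk }
```

send_data, by contrast, wants the whole string up front, so for a truly 
huge export the enumerable-body approach is the one that keeps memory 
bounded.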

Michael

-- 
Michael Schuerig
mailto:[email protected]
http://www.schuerig.de/michael/


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Ruby 
on Rails: Talk" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at 
http://groups.google.com/group/rubyonrails-talk?hl=en
-~----------~----~----~----~------~----~------~--~---
