Quoting Samnang <[email protected]>:
> Hi all,
> 
> Currently, I'm developing a Rails app that does heavy XML generation
> for a RESTful web service. My XML representation uses the Nokogiri
> gem to produce output in the format the client expects. The problem
> is that the data is quite big: around 50,000 records to pull out of a
> table with millions of rows. Testing on my local machine, it takes
> about 20 minutes to get a response from the request.
> 
> Do you have any ideas on how to optimize this? If we drop
> ActiveRecord and use plain SQL statements to pull out the data for
> generating the XML, would performance be much faster?
> 

Using SQL and libxml2 (via the libxml-ruby gem) directly instead of ActiveRecord
and Nokogiri (which itself wraps libxml2) will cut the run time.  I would guess
between 2x and 10x, if the code is written with speed in mind.  The trade-off is
that your code will be bigger and uglier.
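To make the "bigger and uglier but faster" shape concrete, here is a minimal
sketch of the streaming approach: skip ActiveRecord object instantiation and
DOM building entirely, and write escaped XML strings straight to an IO as you
walk the rows.  The ROWS constant, table layout, and element names below are
hypothetical stand-ins for whatever a raw query (e.g. via
ActiveRecord::Base.connection.select_all) would return in your schema; this
uses only the Ruby standard library rather than libxml-ruby, so treat it as an
illustration of the technique, not a drop-in implementation.

```ruby
require 'cgi'       # stdlib; CGI.escapeHTML also covers XML's special characters
require 'stringio'

# Hypothetical rows, standing in for the hash rows a raw SQL query returns.
ROWS = [
  { 'id' => 1, 'name' => 'Alice & Bob' },
  { 'id' => 2, 'name' => '<admin>' }
].freeze

# Stream the document to an IO instead of building a DOM: each row becomes a
# few small string writes, so memory stays flat no matter how many records
# the query returns.
def write_records_xml(io, rows)
  io << %(<?xml version="1.0" encoding="UTF-8"?>\n)
  io << "<records>\n"
  rows.each do |row|
    io << %(  <record id="#{row['id']}">)
    io << "<name>#{CGI.escapeHTML(row['name'].to_s)}</name>"
    io << "</record>\n"
  end
  io << "</records>\n"
end

out = StringIO.new
write_records_xml(out, ROWS)
puts out.string
```

In a Rails controller you would pass the response stream (or a file handle)
instead of a StringIO, so the 50,000 records never all sit in memory at once.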

What's cheaper, computer time or programmer time?  How many times will this
generation be run?  And are there elapsed-time constraints (e.g., an excellent
24-hour weather forecast that takes 28 hours to generate isn't useful)?

Jeffrey

-- 
You received this message because you are subscribed to the Google Groups "Ruby 
on Rails: Talk" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/rubyonrails-talk?hl=en.
