The other issue I've had with eager loading is when I want to run a
rake task that computes over tens of thousands or hundreds of
thousands of records.
Loading them all at once could easily blow my memory.  Instead it's
better to eager-load in 'batches' or 'pages.'

I've dealt with this by monkey-patching in the following:

  module DataMapper
    class Collection
      def batch_each(size = 100, &block)
        start = 0
        loop do
          n = all(:offset => start, :limit => size).each { |x| block.call(x) }.size
          break if n < size  # Ruby's begin..end while is its run-at-least-once loop, but break reads fine here
          start += size
        end
      end
    end
  end

It's really pleasing to me because all(:offset => ..., :limit => ...)
scopes off of the prior query, and batch_each is as easy to call as
each.
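To illustrate the control flow, here is the same batching loop in plain Ruby, with an array slice standing in for all(:offset => ..., :limit => ...) (the fake_all lambda is a stand-in of mine, not part of DataMapper):

```ruby
# Plain-Ruby sketch of the batch loop: fetch a page, process it, and stop
# when a page comes back shorter than the requested size.
records  = (1..250).to_a
fake_all = lambda { |offset, limit| records[offset, limit] || [] }

seen  = []
size  = 100
start = 0
loop do
  page = fake_all.call(start, size)
  page.each { |x| seen << x }
  break if page.size < size  # a short (or empty) page means we're done
  start += size
end

seen.size  # => 250, processed in pages of 100
```

With a real Collection the short final page still gets processed before the break, so no records are skipped.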

At some point I might get around to writing an each_with_index
version, but that would require writing my own
"each_with_offsetted_index" for Enumerable that takes a starting
value.
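For what it's worth, Ruby 1.9+ ships Enumerator#with_index, which accepts a starting offset, so no custom Enumerable method is needed. A hedged sketch (batch_each_with_index is my hypothetical name, and the array slice stands in for the real all(:offset => ..., :limit => ...) call):

```ruby
# Sketch: thread the running offset into each page's indices via
# Enumerator#with_index(offset), so indices continue across page boundaries.
def batch_each_with_index(records, size = 100)
  start = 0
  loop do
    page = records[start, size] || []
    page.each.with_index(start) { |x, i| yield(x, i) }
    break if page.size < size
    start += size
  end
end

pairs = []
batch_each_with_index(("a".."e").to_a, 2) { |x, i| pairs << [x, i] }
pairs  # => [["a", 0], ["b", 1], ["c", 2], ["d", 3], ["e", 4]]
```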

-Gary
You received this message because you are subscribed to the Google Groups 
"DataMapper" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/datamapper?hl=en