On Tuesday, July 4, 2017 at 6:15:09 AM UTC-7, Aryk Grosz wrote:
> I meant for eager. In the case of eager, you would prefetch, say, 100
> records at a time, then get the associations for those using foreign
> keys. So you would still have, let's say, the main query + 2
> associations = 3 total queries vs. potentially N queries, right?
No, it's main query + query per association per page (instead of per
association per row). It does reduce the number of queries, but you lose
the properties of a normal eager load, where all associated rows are
represented by a single object, and I'm pretty sure it would break more
advanced eager loading cases.

> So I'm guessing you have to kind of just preload them with as_hash and
> manually append it into your paged_each call

You can't preload with as_hash without knowing which records are going
to be selected by the main query. You could potentially use as_hash on
the entire associated dataset, but that may be a much more expensive
query, and it is not something that could be done by the library.

> I suppose you can do something like
>
>   Model.order(:a).enum_for(:paged_each, options).each_slice...
>
> and then for each slice go and get the associations, but wouldn't it
> just be easier to call all in the paged_each code (assuming you are
> using eager)?

We could give people an :all option to paged_each, but it's a potential
footgun. People just need to understand that streaming/paging and eager
loading are fundamentally not compatible. You can use the each_slice
workaround if you want to (a rough sketch is appended below), but that's
not something I want to encourage.

Thanks,
Jeremy
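A minimal sketch of that each_slice workaround, assuming hypothetical
Album and Artist models where Album is many_to_one :artist via an
artist_id foreign key (model names, columns, and batch size are only for
illustration):

  # Stream the main query in pages, then issue one query per association
  # per slice and fill the association cache manually.
  Album.order(:id).enum_for(:paged_each, rows_per_fetch: 100).each_slice(100) do |albums|
    # One query for the associated artists of this slice, keyed by primary key.
    artists = Artist.where(id: albums.map(&:artist_id).compact.uniq).as_hash(:id)

    albums.each do |album|
      # Populate the association cache so album.artist does not run another query.
      album.associations[:artist] = artists[album.artist_id]
      # ... process album ...
    end
  end

This keeps it to roughly one query per association per slice, but as
noted above it is not equivalent to a real eager load and is not
something the library does for you.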
