Hey all,

I've been thinking about an optimization problem that I'm only vaguely
familiar with, and I'm not quite sure how to get started on it.

I've got an application in which each record might be associated with
thousands of others through a join model. A has_many :through situation. The
join models are important in and of themselves, but often I just want to
grab all the associated objects, and that's starting to put a real burden
on the database.
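
For the sake of discussion, here's roughly the shape of the thing (Article,
Tagging, and Tag are placeholder names, not my actual models):

  class Article < ActiveRecord::Base
    has_many :taggings
    has_many :tags, :through => :taggings
  end

  class Tagging < ActiveRecord::Base
    belongs_to :article
    belongs_to :tag
  end

  class Tag < ActiveRecord::Base
    has_many :taggings
    has_many :articles, :through => :taggings
  end

  # Grabbing the associated objects goes through the join table every time:
  #   SELECT tags.* FROM tags INNER JOIN taggings ON ...
  article.tags

With thousands of join rows per record, that join gets expensive fast.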

I'm tentatively thinking that denormalization would help me out here. But
that sort of thing is approaching the limits of my database knowledge. The
question comes down to this: say you want to cache the primary keys of
thousands of associated objects. What would that look like in your schema?
In your queries?
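
The best I can come up with on my own is a serialized column of ids,
something like the sketch below (same placeholder models; tag_ids_cache is
a column name I just made up). Is this the kind of thing people mean?

  class AddTagIdsCacheToArticles < ActiveRecord::Migration
    def self.up
      add_column :articles, :tag_ids_cache, :text
    end

    def self.down
      remove_column :articles, :tag_ids_cache
    end
  end

  class Article < ActiveRecord::Base
    has_many :taggings
    has_many :tags, :through => :taggings

    # Store a plain Ruby array of tag ids in the text column.
    serialize :tag_ids_cache, Array

    # Rebuild the cached id list from the join table.
    def refresh_tag_ids_cache!
      update_attribute(:tag_ids_cache, taggings.map(&:tag_id))
    end

    # Read path: skip the join and fetch by primary key.
    def cached_tags
      Tag.find(tag_ids_cache || [])
    end
  end

  class Tagging < ActiveRecord::Base
    belongs_to :article
    belongs_to :tag

    # Keep the denormalized column in sync with the join rows.
    after_save    :refresh_article_cache
    after_destroy :refresh_article_cache

    private

    def refresh_article_cache
      article.refresh_tag_ids_cache!
    end
  end

Reads would skip the join entirely, at the cost of keeping the column in
sync from the join model's callbacks. No idea if that trade-off is the
right one, which is partly what I'm asking.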

Like I said, database noob here, so let's have the noob explanation.
Pointers to books or tutorials are welcome too. I'm also open to alternate
caching strategies (this information doesn't necessarily have to persist),
but denormalization is something I'd like to know more about in general.
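
For the non-persistent angle, I picture something like memcached via
Rails.cache, along these lines (again, hypothetical names and cache keys):

  class Article < ActiveRecord::Base
    has_many :taggings
    has_many :tags, :through => :taggings

    # Fetch the id list from the cache, computing it on a miss.
    def cached_tag_ids
      Rails.cache.fetch("article/#{id}/tag_ids") do
        taggings.map(&:tag_id)
      end
    end

    # Call this whenever a tagging is created or destroyed.
    def expire_tag_ids_cache
      Rails.cache.delete("article/#{id}/tag_ids")
    end
  end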

-- 
Nick Zadrozny • beyondthepath.com