So, the advantage really comes into play when you are retrieving hundreds
or even thousands of keys in one page request. For example,
on the dealnews.com front page, we show 100-200 articles. Now, we don't
use memcached for that (whole other philosophy in use there), but I can
use it as an example. Consider this code:
// get array of keys for front page articles
$art_keys = $memcache->get("front_page_articles");
// PHP object takes an array to do multi gets
$articles = $memcache->get($art_keys);
// now I have a whole array of articles in one call.
So, even with multiple memcached servers, I only need one call to each
server for a whole set of keys. Otherwise, I would have to grab lots of
keys one at a time and that would be slower. Much slower.
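To make the fill-on-miss part concrete, here is a minimal sketch of that
pattern, assuming the PECL Memcache extension and a hypothetical
load_article_from_db() helper (not something from my real code):

```php
<?php
// Minimal sketch: multi-get, then fill in any cache misses.
// Assumes a memcached server on localhost and a hypothetical
// load_article_from_db() function standing in for the real data layer.
$memcache = new Memcache();
$memcache->addServer('127.0.0.1', 11211);

// One key holds the list of article keys for the front page.
$art_keys = $memcache->get('front_page_articles');

// Passing an array of keys does a multi-get: one request per
// server holds all of that server's keys, instead of one request
// per key.
$articles = $memcache->get($art_keys);

// Any key missing from the result was a cache miss; fetch it from
// the database and write it back so the next request finds it.
foreach ($art_keys as $key) {
    if (!isset($articles[$key])) {
        $articles[$key] = load_article_from_db($key);
        $memcache->set($key, $articles[$key], 0, 300); // cache 5 minutes
    }
}
```

The ordering matters: you do the single multi-get first, then loop only
over the misses, so the common case (everything cached) is one round trip
per server.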
Brian.
--------
http://brian.moonspot.net/
On 8/21/09 12:10 PM, Bill Moseley wrote:
I've often seen comments about using multi-gets for better
performance. First, I'm a bit curious if/why it's a big performance gain.
Obviously, not all keys will be on the same memcached server in a
multi-get request. Can someone give me a short explanation or
point me to where to learn more?
I'm perhaps more curious how people use multiple gets -- or really the
design of a web application that supports this.
I've tended to cache at a pretty low level -- that is, if I have a method
in the data model to fetch object "foo", I tend to check the cache and, if
it's not found, fetch from the database and save to memcached. As a
result we can end up with a lot of separate memcached get requests for a
single web-request. This is less true as the app becomes more
web2.0-ish, but still happens.
So, I'm curious about the design of an application that supports
gathering up a number of (perhaps unrelated) cache requests and fetch
all at once. Then fill in the cache where there are cache-misses. Am I
misunderstanding how people use this feature?
--
Bill Moseley
[email protected] <mailto:[email protected]>