Well, if you are talking about the MySQL query cache, it is not a safe alternative for a query that takes a very long time to execute. The query cache uses a global lock that can really cause nightmares for your DB: the lock is taken during every single query against the table(s) AND every write against the table(s).

The real issue comes down to having such a slow query on a site in the first place. Caching isn't the solution for an unoptimized or poorly written query.

On Aug 5, 2009, at 10:45 AM, Michael Wieher wrote:


Most databases have a caching mechanism... if your page requires
processing before display, consider caching slow-load pages in a
database and writing an algorithm that invalidates this cache only
when its supporting data changes... at the moment of invalidation it
should be easy to re-cache the page internally, thus making everyone
happy, I hope.
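A minimal sketch of that invalidate-and-rebuild idea, with a plain dict standing in for the real cache store (memcached, a DB table, etc.) and a hypothetical `render_page` as the slow page builder:

```python
# Cache-on-invalidation sketch: the cache is rebuilt the moment the
# underlying data changes, so readers never trigger the slow render.
cache = {}

def render_page(page_id, data):
    # Imagine expensive queries and templating here.
    return f"<html>page {page_id}: {data}</html>"

def get_page(page_id, data):
    # Readers only ever hit the cache; a miss falls back to rendering.
    if page_id not in cache:
        cache[page_id] = render_page(page_id, data)
    return cache[page_id]

def on_data_change(page_id, new_data):
    # Invalidate and immediately re-render, so the next reader
    # still gets an instant cache hit.
    cache[page_id] = render_page(page_id, new_data)
```

The key property is that the rebuild happens at write time, at a moment you control, rather than on a random visitor's request.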



On Wed, Aug 5, 2009 at 12:43 PM, Joseph Engo<[email protected]> wrote:

Blah, missed this message before I responded :D But ya, totally agreed here. If your site requires memcache in order to run, you are in serious trouble. My general rule is that no query on a site should take more than 0.2 seconds to run. Even that is a long time to wait on a single query.

On Aug 5, 2009, at 10:36 AM, dormando wrote:


What happens when we release 1.4.1?

Your site really should be able to tolerate at least the brief loss of a
single memcached instance.

It's definitely good to update the cache asynchronously where possible, and
all of this stuff helps (a lot of it is detailed in the FAQ too). I'd just
really like to push the point that these techniques help reduce common
load. They can even help a lot in emergencies, when you find a page taking
50 seconds to load for no good reason.

However, relying on it will lead to disaster. It allows you to create sites which cannot recover on their own after blips in cache, upgrades,
hardware failure, etc.

If you have objects that really do take umpteen billions of milliseconds to load, you should really consider rendering data (or just the really slow components of it) into summary tables. I like using memcached to take queries that run in a few tens of milliseconds or less, and make them typically run in 0.1ms. If they take 20+ seconds something is horribly
wrong. Even 1 full second is something you should be taking very
seriously.
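The pattern described here, memcached in front of queries that are already fast, can be sketched as a small read-through helper. `FakeClient` below is a dict-backed stand-in for a real memcached client, and `run_query` is a hypothetical DB call:

```python
class FakeClient:
    """Dict-backed stand-in for a memcached client (get/set only)."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def set(self, key, value, ttl=0):
        self.store[key] = value

def cached_query(mc, key, run_query, ttl=300):
    # Serve from cache; on a miss, run the (already fast) query
    # and cache the result for ttl seconds.
    value = mc.get(key)
    if value is None:
        value = run_query()
        mc.set(key, value, ttl)
    return value
```

With a real client, `FakeClient` would be replaced by e.g. python-memcached's `memcache.Client(['127.0.0.1:11211'])`, whose `get`/`set` have the same shape. Note that a legitimate `None` result would be treated as a miss here; real code might wrap values to distinguish the two.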

-Dormando

On Wed, 5 Aug 2009, Edward Goldberg wrote:


I use the term "Cache Warmer" for tasks like this, where I keep a key
(or keys) warm all of the time.

The best way to keep the load on the DB constant is to Warm the Cache
at a well defined rate.

If you wait for a "request" to update the cache, then you may lose
control over the exact moment of the load.

The idea is to make a list of the items that need to be warmed, and
then do that list at a safe rate and time.
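That warm list can be sketched as a simple rate-limited loop. The names here are illustrative: `warm_items` maps cache keys to hypothetical compute functions (the slow queries), and a plain dict stands in for the cache client:

```python
import time

def warm_cache(cache, warm_items, rate_per_sec=2.0):
    # Warm each key at a controlled, well-defined rate so the DB sees
    # a flat, predictable load instead of a burst.
    delay = 1.0 / rate_per_sec
    for key, compute in warm_items.items():
        cache[key] = compute()   # with memcached: mc.set(key, compute(), ttl)
        time.sleep(delay)        # keeps the refresh rate bounded
```

Running this from a scheduler at a fixed cadence keeps every key warm while the DB load stays constant, which is exactly the point of warming at a well-defined rate.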

Very long TTL values can be unsafe.  Stale values can lead to
unexpected code errors days later.

Edward M. Goldberg
http://myCloudWatcher.com/


On Wed, Aug 5, 2009 at 9:44 AM, dormando<[email protected]> wrote:

Also consider optimizing the page so it doesn't take 20 seconds to render
- memcached should help you under load, and magnify capacity, but
shouldn't be used as a crutch for poor design.

-Dormando

On Wed, 5 Aug 2009, Adam Lee wrote:

Run a cron job that executes the query and updates the cache at an
interval
shorter than the expiration time for the cached item.
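A sketch of that cron job, with illustrative names (`run_slow_query`, the `slow_report` key); the one invariant that matters is that the TTL comfortably exceeds the cron interval, so the entry never expires between refreshes:

```python
# refresh_cache.py -- run from cron more often than the TTL, e.g.:
#   */5 * * * * /usr/bin/python /path/to/refresh_cache.py

CRON_INTERVAL = 5 * 60          # cron fires every 5 minutes
TTL = CRON_INTERVAL * 3         # 15-minute TTL survives a missed run

def refresh(cache_set, run_slow_query):
    # cache_set(key, value, ttl) is the client's set(); with
    # python-memcached that would be mc.set(key, value, time=TTL).
    cache_set("slow_report", run_slow_query(), TTL)
```

Because the cron job always rewrites the key before it can expire, visitors only ever see cache hits and never pay the 20-second query themselves.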

On Wed, Aug 5, 2009 at 11:38 AM, Haes <[email protected]> wrote:


Hi,

I'm using memcached together with Django to speed up database queries. One of my Django views (pages) uses a query that takes over 20 seconds. Normally this query hits the memcached cache and the data is served almost instantly. The problem is that when the cache expires, the next person accessing this page has to wait about 20 seconds
for it to load, which is not really acceptable for me.

Is there a way to update the memcached data before it times out, so
that this query always hits the cache?

Thanks for any hints.

Cheers.




--
awl





