On one of the projects I worked on, a rogue architect created and used
a SoftReference caching scheme without considering the consequences.
Unfortunately, his scheme was rapidly adopted by quite a few coders
before I caught the trend and could raise a red flag.  Naturally, it
created quite a bit of havoc during performance testing under load.

Essentially, the problem is in the garbage collection.  As the
SoftReference cache accumulates more objects, garbage collection runs
slower (more objects to scan) and more frequently (less free heap
memory).  Since garbage collection stops ALL threads, the application
grinds down until there is finally not enough room to allocate an
object.  At that point all objects held through SoftReferences are
released, which essentially invalidates the entire cache, and the
application has to rebuild it from scratch.
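
For anyone who has not seen the pattern, here is a rough sketch of the
kind of SoftReference cache I am talking about (the class name and the
use of Hashtable are made up for illustration; this is not the actual
code from that project):

import java.lang.ref.SoftReference;
import java.util.Hashtable;

// Sketch: each value is held only through a SoftReference, so the
// garbage collector is free to clear any entry when memory gets tight.
public class SoftCache {

    private final Hashtable cache = new Hashtable();  // key -> SoftReference

    public synchronized void put(Object key, Object value) {
        cache.put(key, new SoftReference(value));
    }

    public synchronized Object get(Object key) {
        SoftReference ref = (SoftReference) cache.get(key);
        if (ref == null) {
            return null;               // never cached
        }
        Object value = ref.get();
        if (value == null) {
            cache.remove(key);         // the collector already cleared it
        }
        return value;
    }
}

The trap is that every entry lives at the mercy of the collector, so
under memory pressure the whole cache can be cleared in a single GC
cycle, exactly when the application needs it most.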

I remember seeing LRU mentioned in the documentation.  However, I
believe that only described the order in which the objects are
released.  I think there is something you can do to recover some of the
objects before they are destroyed...  I did not get into that too far,
since the garbage collection behavior reduced the value of the whole
idea.
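
For what it's worth, I believe the hook is a ReferenceQueue.  You
cannot get an object back once the collector has cleared its
SoftReference, but if you register the references with a queue you can
at least find out which entries were cleared and drop the stale map
entries.  A rough, untested sketch (names are made up):

import java.lang.ref.ReferenceQueue;
import java.lang.ref.SoftReference;
import java.util.Hashtable;

// Sketch: register each SoftReference with a ReferenceQueue so that,
// after the collector clears entries, we can find out which keys are
// gone and remove the stale map entries.
public class NotifyingSoftCache {

    // SoftReference subclass that remembers its key, so a cleared
    // reference pulled off the queue can be mapped back to its entry.
    private static class Entry extends SoftReference {
        final Object key;
        Entry(Object key, Object value, ReferenceQueue queue) {
            super(value, queue);
            this.key = key;
        }
    }

    private final Hashtable cache = new Hashtable();
    private final ReferenceQueue queue = new ReferenceQueue();

    public synchronized void put(Object key, Object value) {
        drainClearedEntries();
        cache.put(key, new Entry(key, value, queue));
    }

    public synchronized Object get(Object key) {
        drainClearedEntries();
        Entry e = (Entry) cache.get(key);
        return (e == null) ? null : e.get();   // null if cleared by the GC
    }

    // Remove map entries whose referents the collector already cleared.
    private void drainClearedEntries() {
        Entry cleared;
        while ((cleared = (Entry) queue.poll()) != null) {
            if (cache.get(cleared.key) == cleared) {
                cache.remove(cleared.key);
            }
        }
    }
}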

Phillip.

--- James Stauffer <[EMAIL PROTECTED]> wrote:
> We use SoftReferences so the JVM manages it.  I assume that it is
> close to LRU.
>  
> 
> James Stauffer 
> 
> -----Original Message-----
> From: Greg Nudelman [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, March 04, 2003 3:35 PM
> To: jdjlist
> Subject: [jdjlist] Re: Caching
> 
> 
> 
> Thanks a bunch Phillip! I'm looking carefully at this very nice
> implementation by the Turbine folks.  
> 
> I've also come across an Enhydra LRU cache manager class that uses a
> linked list as a sort of heap structure to keep track of the least
> recently used objects.  I was curious why they chose a linked list: I
> would think some kind of Java "heap" implementation would give the LRU
> element in O(1) and perform inserts in O(log2(n)).  However, these guys
> keep the linked-list node pointers in the cache hashtable itself, along
> with the head and tail pointers, so every lookup and re-order operation
> becomes O(1).  Very nice LRU implementation.
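> 
> For comparison, here is a rough sketch of the same constant-time idea
> using java.util.LinkedHashMap in access order (JDK 1.4); the class name
> and the size limit are made up for illustration:
> 
> import java.util.LinkedHashMap;
> import java.util.Map;
> 
> // Rough sketch: a LinkedHashMap in access order keeps the hash lookup
> // and the doubly-linked LRU ordering in one structure, so get() and
> // put() are both constant-time, much like the Enhydra class below.
> public class SimpleLRUCache extends LinkedHashMap {
> 
>     private final int maxEntries;
> 
>     public SimpleLRUCache(int maxEntries) {
>         super(16, 0.75f, true);   // true = order by access, not insertion
>         this.maxEntries = maxEntries;
>     }
> 
>     // Called on every put(); returning true evicts the eldest
>     // (least recently used) entry.
>     protected boolean removeEldestEntry(Map.Entry eldest) {
>         return size() > maxEntries;
>     }
> }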
> 
> Greg 
> 
> 
> /*
>  * Enhydra Java Application Server Project
>  *
>  * The contents of this file are subject to the Enhydra Public License
>  * Version 1.1 (the "License"); you may not use this file except in
>  * compliance with the License. You may obtain a copy of the License on
>  * the Enhydra web site ( http://www.enhydra.org/ ).
>  *
>  * Software distributed under the License is distributed on an "AS IS"
>  * basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See
>  * the License for the specific terms governing rights and limitations
>  * under the License.
>  *
>  * The Initial Developer of the Enhydra Application Server is Lutris
>  * Technologies, Inc. The Enhydra Application Server and portions created
>  * by Lutris Technologies, Inc. are Copyright Lutris Technologies, Inc.
>  * All Rights Reserved.
>  *
>  * Contributor(s):
>  *
>  * $Id: LRUCache.java,v 1.7.12.1 2000/10/19 17:58:53 jasona Exp $
>  */ 
> 
> 
> 
> 
> package com.lutris.util; 
> 
> import java.util.Hashtable; 
> 
> /** 
>  *   This class implements a fixed size Least-Recently-Used cache. 
>  *   The cache has a maximum size specified when it is created. When 
>  *   an item is added to the cache, if the cache is already at the 
>  *   maximum size the least recently used item is deleted, then the 
>  *   new item is added. <P> 
>  * 
>  *   The only two methods that count as "using" the object (for the 
>  *   least-recently-used algorithm) are <CODE>add()</CODE> and 
>  *   <CODE>get()</CODE>. <P> 
>  * 
>  *   The items cached are referred to by keys, just like a Hashtable. 
>  * 
>  * @author Andy John 
>  * @version $Revision: 1.7.12.1 $ 
>  */ 
> public class LRUCache { 
> 
>     /* 
>      *  Maximum allowed size of the cache. 
>      */ 
>     private int maxSize; 
> 
>     /* 
>      *  The current size of the cache. 
>      */ 
>     private int currentSize; 
> 
>     /* 
>      *  This hashtable stores all the items, and does all the searching. 
>      */ 
>     private Hashtable cache; 
> 
>     /* 
>      *  A linked list of these structs defines the ordering from 
>      *  newest to oldest. The nodes are stored in the hashtable, so 
>      *  that constant-time access can locate a node. 
>      */ 
>     private class Node { 
>         Node prev, next; 
>         Object key; 
>         Object item; 
>         int size; 
>     } 
> 
>     /* 
>      *  The linked list. This is ONLY for keeping an ordering. All 
>      *  searching is done via the hashtable. All accesses to this 
>      *  linked list are constant-time. 
>      */ 
>     private Node head, tail; 
> 
>     /** 
>      *   Create a new, empty cache. 
>      * 
>      * @param maxSize 
>      *  The maximum size of the items allowed to be in the cache. 
>      *  Since the size of an item defaults to 1, this is normally the 
>      *  number of items allowed in the cache. 
>      *  When this limit is reached, the "oldest" items are deleted to 
>      *  make room for the new item, so the total size of the cache 
>      *  (the sum of the sizes of all the items it contains) 
>      *  does not exceed the specified limit. 
>      */ 
>     public LRUCache(int maxSize) { 
>         this.maxSize = maxSize; 
>         clear(); 
>     } 
> 
>     /** 
>      *   Returns the total size of the items in the cache. 
>      *   If all the items were added with the default size of 1, 
>      *   this is the number of items in the cache. 
>      * 
>      * @return 
>      *  The sum of the sizes of the items in the cache. 
>      */ 
>     public synchronized int getSize() { 
>         return currentSize; 
>     } 
> 
>     /** 
>      *   Returns the maximum number of items allowed in the cache. 
>      * 
>      * @return 
>      *  The maximum number of items allowed in the cache. 
>      */ 
>     public synchronized int getMaxSize() { 
>         return maxSize; 
>     } 
> 
>     /** 
>      *   Add an object to the cache, with a default size of 1. 
>      *   If the cache is full, the oldest items will be deleted to make room. 
>      *   The item becomes the "newest" item. 
>      * 
>      * @param key 
>      *   The key used to refer to the object when calling <CODE>get()</CODE>. 
>      * @param item 
>      *   The item to add to the cache. 
>      */ 
>     public synchronized void add(Object key, Object item) { 
>         add(key, item , 1); 
>     } 
> 
>     /** 
>      *   Add an object to the cache. 
>      *   If the cache is full, the oldest items will be deleted to make room. 
>      *   The item becomes the "newest" item. 
>      * 
>      * @param key 
>      *   The key used to refer to the object when calling <CODE>get()</CODE>. 
>      * @param item 
>      *   The item to add to the cache. 
>      * @param size 
>      *   How large is the object? This counts against the maximum size 
>      *   specified when the cache was created. 
>      */ 
>     public synchronized void add(Object key, Object item, int size) {
> 
>         // Throw out old one if reusing a key. 
>         if (cache.containsKey(key)) 
>             remove(key); 
> 
=== message truncated ===

