Yes, by using read/write locks. I didn't use ConcurrentHashMap after all,
due to its lack of LRU support.
Br's,
marius
On Apr 12, 9:07 pm, Timothy Perrett timo...@getintheloop.eu wrote:
Interesting marius - I haven't checked out the code yet, but did you
manage to work around the thread-safety issues?
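For readers following along, the approach marius describes - an LRU map guarded by read/write locks, since ConcurrentHashMap has no eviction order - might look roughly like this. This is a hypothetical sketch, not the actual lift code; the class and method names are invented:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Hypothetical sketch only -- not the actual lift implementation.
class RwLockLruCache<K, V> {
    private final Map<K, V> backing;
    private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();

    RwLockLruCache(final int maxEntries) {
        // accessOrder = true gives LRU iteration order;
        // removeEldestEntry evicts once the map exceeds maxEntries.
        this.backing = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxEntries;
            }
        };
    }

    V get(K key) {
        // With accessOrder = true, get() reorders entries, so even
        // lookups must take the write lock.
        lock.writeLock().lock();
        try {
            return backing.get(key);
        } finally {
            lock.writeLock().unlock();
        }
    }

    void put(K key, V value) {
        lock.writeLock().lock();
        try {
            backing.put(key, value);
        } finally {
            lock.writeLock().unlock();
        }
    }

    int size() {
        // Pure reads that don't touch recency can share the read lock.
        lock.readLock().lock();
        try {
            return backing.size();
        } finally {
            lock.readLock().unlock();
        }
    }
}
```

One subtlety worth noting: with an access-ordered LinkedHashMap, `get()` itself mutates the map's internal order, so the read lock only pays off for operations like `size()` that don't touch recency.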
Just checked out the code, Marius... this is good stuff!
So one could use the SoftReferenceCache generically as a thread-safe
replacement for KeyedCache, right? The template cache is pretty sweet
also.
Cheers, Tim
I just committed a SoftReferenceCache implementation in utils and it
is used by InMemoryCache. So far, in testing, it looks pretty good. The
point of a SoftReferenceMap is probably obvious: to prevent (as much as
possible) OOME. This is fine IMHO because it's more important to have
the application
I committed it, of course, in the wip-marius-template-cache branch.
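As an illustration of the SoftReference idea marius mentions - letting the GC reclaim cached values under memory pressure instead of hitting OOME - a minimal sketch might look like this. The names here are invented, not the actual SoftReferenceCache in utils:

```java
import java.lang.ref.SoftReference;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical illustration -- names do not match the actual lift code.
class SoftCache<K, V> {
    private final ConcurrentHashMap<K, SoftReference<V>> map =
        new ConcurrentHashMap<>();

    void put(K key, V value) {
        // The GC is free to clear a SoftReference under memory
        // pressure, which is what keeps the cache from causing OOME.
        map.put(key, new SoftReference<>(value));
    }

    V get(K key) {
        SoftReference<V> ref = map.get(key);
        if (ref == null) return null;
        V value = ref.get();                     // null if the GC reclaimed it
        if (value == null) map.remove(key, ref); // drop the stale entry
        return value;
    }
}
```

The trade-off is exactly the one described above: under memory pressure entries silently vanish and must be recomputed, which is usually acceptable for a template cache.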
Good :) ... I was also thinking about a flushable caching mechanism. So
far the InMemoryCache is more for exemplification, as it is not yet
thread-safe. It is based on an LRU cache, but I'm also thinking of
combining the ConcurrentHashMap approach with LRU ... I was also
thinking of a SoftReference
Wow, Derek, you must be watching GitHub like a hawk, haha ;-)
Just to bring an off-list convo between myself and Marius onto the list: are
we looking at having some generic caching infrastructure in lift? This would
be great re the localization / translation stuff I'm working on, which
currently
Just taken a look over the code - looks pretty cool!
I like your ideas for ConcurrentHashMap - all sounds pretty awesome...
Regarding the use of EHCache, I reckon as long as we provide a hook
mechanism into the cache system, then sure, we should let people worry
about those issues in their specific
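The hook mechanism Tim describes might amount to something like a small cache interface that lift codes against, with EHCache or anything else pluggable behind it. This is a sketch with invented names, not an actual lift API:

```java
import java.util.concurrent.ConcurrentHashMap;

// Sketch with invented names -- not an actual lift API.
// The framework would depend only on the interface; users could supply
// an EHCache-backed implementation without lift knowing about EHCache.
interface TemplateCache<K, V> {
    V get(K key);
    void put(K key, V value);
    void remove(K key);
}

// A trivial default provider backed by a ConcurrentHashMap.
class MapTemplateCache<K, V> implements TemplateCache<K, V> {
    private final ConcurrentHashMap<K, V> map = new ConcurrentHashMap<>();

    public V get(K key) { return map.get(key); }
    public void put(K key, V value) { map.put(key, value); }
    public void remove(K key) { map.remove(key); }
}
```

With a seam like this, concerns such as distributed invalidation or disk overflow stay in the provider, which is the "let people worry about those issues" part.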
True, very true. I know DPP is generally against caching, but we all
recognise the need for caching in a production environment. Rather
than asking if we should re-invent the wheel with a specific cache mech
within lift, perhaps my question is this:
Is LRU and the KeyedCache abstraction
Sounds pretty awesome to me, Marius - looking forward to your thoughts.
Cheers, Tim
On 05/04/2009 22:22, marius d. marius.dan...@gmail.com wrote:
These are valid questions. The LRU KeyedCache utilizes
org.apache.commons.collections.map.LRUMap, which is not thread-safe. So
anyone can use them and
I agree. Thread safety would be nice, and should be easily achievable with
some existing code. The beauty of traits is that we can get these orthogonal
behaviors through composition.
Derek