Named queries and the getObject* methods will/can use the object cache.
Normal queries should refresh.
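To illustrate the distinction (this is a generic sketch, not actual Cayenne API -- the class and method names here are hypothetical): a cached read may hand back a stale object, while "refreshing" is essentially re-running the loader and overwriting the cached copy, which is what repeating the query does.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical cache-aside sketch (not Cayenne code): get() serves cached
// values, refresh() always re-runs the loader and replaces the cached copy.
public class RefreshingCache<K, V> {
    private final Map<K, V> cache = new HashMap<>();
    private final Function<K, V> loader; // stands in for the database query

    public RefreshingCache(Function<K, V> loader) {
        this.loader = loader;
    }

    public V get(K key) { // cached read: may return stale data
        return cache.computeIfAbsent(key, loader);
    }

    public V refresh(K key) { // forced read: always hits the loader
        V fresh = loader.apply(key);
        cache.put(key, fresh);
        return fresh;
    }
}
```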

I'm not totally clear on the object cache workings.  I know I've had
things live in it for weeks before (that actually helped us out with a
production problem -- we could get the old data and print it so we
didn't lose it, but I'd argue that was a bad design decision on my
part).  There was talk about adding new cache options in 3.0, but I
don't believe you are using that branch?  (I think Andrus may have
added LRU functionality or similar.)
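For what "LRU functionality" means in practice, here's a minimal sketch (again, not Cayenne's actual implementation) using an access-ordered LinkedHashMap: once the cache holds more than its cap, the least recently used entry is evicted.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical LRU sketch (not Cayenne code): an access-ordered map that
// drops the least recently used entry once it exceeds maxEntries.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true); // true = order entries by access, not insertion
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries; // evict when the cap is exceeded
    }

    public static void main(String[] args) {
        LruCache<Integer, String> cache = new LruCache<>(2);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.get(1);      // touching 1 makes it most recently used
        cache.put(3, "c"); // exceeds the cap; entry 2 is evicted
        System.out.println(cache.keySet()); // prints [1, 3]
    }
}
```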

/dev/mrg


On 11/15/06, Tomi NA <[EMAIL PROTECTED]> wrote:
2006/11/15, Michael Gentry <[EMAIL PROTECTED]>:

> DB and it worked well.  As far as things to pay attention to ... I'd
> use optimistic locking (which I'd recommend for a web-based
> application, too) and refresh data you think needs to be refreshed.
> Refreshing data is really an application/workflow specific detail, so
> it is hard to generalize.  Some things are obvious, like you can cache
> the State and Zip Code tables, for example.  Other things you'll have
> to evaluate depending on your situation.  It hardly ever hurts to
> refresh data from common queries, unless the query is really slow.

As far as refresh is concerned...
Cayenne looks to the database for every query by default, correct? If
an object changes after performQuery or getObjectByID returns, how
would I go about refreshing the cache?
Simply repeat getObjectByID or performQuery?
So far I've taken the paranoid approach, so everything comes straight
from the database (i.e., I haven't paid any attention to the cache).
I'll look into the possibilities of caching some tables which
shouldn't change.

How does the "max. number of objects" fit into the story? What happens
if I run a query which returns more than "max. number of objects"
objects? (I've tried it, but I can't explain the result.)

t.n.a.
