Thanks Ard, Marcel.

RE the comments on memory usage.......we're seeing some pretty aggressive
heap usage with lots of these queries going on - the heap can consume 50 to
100MB every few seconds before GC'ing back down. We've tuned the GC, but
are there any memory settings we should be looking at to tune Jackrabbit for
queries touching large node sets?

We've already looked into this in a bit of detail in the thread on
JackRabbit Caching: BundleCache vs ItemManager vs CacheManager
http://mail-archives.apache.org/mod_mbox/jackrabbit-users/200807.mbox/%3C90a
8d1c00807300458tabfbf6am5dab26204ac12...@mail.gmail.com%3e.

Regards,
Shaun





-----Original Message-----
From: Ard Schrijvers [mailto:a.schrijv...@onehippo.com] 
Sent: 26 January 2009 09:25
To: users@jackrabbit.apache.org
Subject: RE: Is using jcr:path in a query a performance bottleneck?

Hello Shaun,

Regarding avoiding path constraint queries taking long: you might
(will) also win a lot by making the where clause only a little more
specific. So, if you keep your path constraint but, at the same
time, add a mixin for the category to your 'category nodes' like '/sport'
or '/entertainment', you can add element(*, myns:somemix) after your
path. This reduces the number of hits before the path constraints have
to be accounted for, reducing the number of path constraint lookups a
lot.
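To illustrate, the change would look something like the sketch below (the mixin name myns:somemix follows the example above; the @myns:state property constraint is made up for illustration):

```
Before:  /jcr:root/sport//*[@myns:state = 'published']
After:   /jcr:root/sport//element(*, myns:somemix)[@myns:state = 'published']
```

In the second form, Lucene can narrow the candidate set using the mixin's node type index before any per-hit path checks are done, which is where the savings come from.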

> 
> this will probably be faster than a query with a path 
> constraint, though the memory consumption will stay high. 
> lucene loads all values of the property to sort by into 
> memory. depending on the size of the value this results in 
> quite significant memory usage.

If you only sort on date, the memory usage for 200,000 nodes shouldn't
be too high. Upgrading to the latest released core should also reduce
memory usage a lot for queries using an 'order by', certainly when you
sort on 'sparse' properties (properties that only exist on a subset of
the nodes).
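For reference, such a sorted query would look something like this sketch (the myns:somemix mixin and myns:date property names are assumptions, not real identifiers from this thread):

```
//element(*, myns:somemix) order by @myns:date descending
```

Only the values of @myns:date for the matched nodes need to be held in memory for the sort, which is why a single date property over 200,000 nodes stays cheap.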

Regards Ard

> 
> regards
>  marcel
> 

