I gather you are referring to saving the node that holds all the references as
properties?

i.e.:

NodeA ---> NodeX *

Where there are 100,000 NodeX instances. NodeX.getReferences() would then
return just the one NodeA (in this case).
So when you save NodeA you are saving all the references, and I would expect
that to be expensive. However, when you save a NodeX, that should be
reasonably fast, right?
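
In JCR terms that picture might look roughly like this (node names are made up
and this is only a sketch of the reference direction, not your actual setup):

import javax.jcr.Node;
import javax.jcr.PropertyIterator;
import javax.jcr.RepositoryException;
import javax.jcr.Session;

public class ReferenceSketch {

    // NodeA ---> NodeX: nodeA holds a REFERENCE property pointing at nodeX
    public static void link(Session session) throws RepositoryException {
        Node root = session.getRootNode();
        Node nodeA = root.addNode("nodeA");
        Node nodeX = root.addNode("nodeX");
        nodeX.addMixin("mix:referenceable"); // the target of a REFERENCE must be referenceable
        session.save();

        nodeA.setProperty("ref", nodeX);     // the forward reference
        session.save();

        // Back-references: getReferences() on the target returns the
        // REFERENCE properties that point at it (here just nodeA/ref).
        PropertyIterator refs = nodeX.getReferences();
        while (refs.hasNext()) {
            System.out.println(refs.nextProperty().getParent().getPath());
        }
    }
}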

If that is the case, could you try to "refactor" your repository (is that
the right word? ;) so that a child node of NodeA holds all the references?
Then, if you are making a simple field change to NodeA (not the references),
you could just save NodeA and not its child.
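
Something along these lines (again just a sketch with invented names, not
tested against your repository):

import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;

public class RefHolderSketch {

    // Keep the heavy reference list on a dedicated child node, so that a
    // simple field change on nodeA does not touch the reference data.
    public static void example(Session session, Node nodeX) throws RepositoryException {
        Node nodeA = session.getRootNode().addNode("nodeA");
        Node refs = nodeA.addNode("references"); // child node owning the references
        refs.setProperty("ref", nodeX);          // nodeX assumed to be mix:referenceable
        session.save();

        // Later: a simple field change on nodeA only
        nodeA.setProperty("title", "new value");
        nodeA.save(); // persists the pending changes under nodeA; the child's
                      // reference data was not modified, so it is not rewritten
    }
}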

Just some thoughts... (I am still learning myself, so this would help me to
know).

On 9/11/06, Christoph Kiehl <[EMAIL PROTECTED]> wrote:

Hi,

I'm experiencing an extreme performance/storage problem with references. In my
case I have a node (document) that references another node (state) by
reference. This state node is referenced by about 100,000 document nodes
(still growing to about 1,000,000). As NodeReferences get persisted on every
change, there is a lot of data written to the database for each added
reference.
What is the best way to handle this number of references? Is there a better
way than saving the state UUID as a string property and hence giving up
referential integrity?

Cheers,
Christoph
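
(For what it's worth, the string-property alternative Christoph mentions would
look roughly like this; names are invented, and as he notes it gives up
referential integrity:)

import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;

public class StringUuidSketch {

    // Store the state node's UUID as a plain STRING property instead of a
    // REFERENCE, so Jackrabbit keeps no back-reference for it.
    public static void link(Session session, Node document, Node state) throws RepositoryException {
        document.setProperty("stateUuid", state.getUUID()); // state must be mix:referenceable
        session.save();
    }

    // Lookup still works, but a dangling UUID is now possible.
    public static Node resolve(Session session, Node document) throws RepositoryException {
        return session.getNodeByUUID(document.getProperty("stateUuid").getString());
    }
}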

