Hi Christiaan,

On Jun 18, 2007, at 1:30 AM, Christiaan des Bouvrie wrote:
Hi David and Matthew,

Thanks for the response. Regarding the design issues, I agree that it is a Java problem. In JDO this problem becomes more evident: if I want to delete an object in Java, I need to set all references to null. In JDO I need to set all references to null and actually delete the object.
We actually have changed the specification in this regard, so that if you are using a mapping in which a single database artifact (e.g. primary key, foreign key) is used to describe two or more concepts, the deletion of an instance causes the cleanup of other instances that refer to that deleted instance. That is, if you delete an Employee, the collection of employees in the Department, the collection of Employees in the Projects, and the Insurance instances all get either deleted (if dependent) or their collections or references are removed.
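The cleanup described here can be sketched in plain Java, without a JDO runtime. The Employee, Department, Project, and Insurance names follow the example above; the field names and the deleteEmployee helper are hypothetical, standing in for what a provider would do internally.

```java
import java.util.*;

// Plain-Java sketch (no JDO runtime) of the cleanup the spec now requires on
// delete: unlink the Employee from every referring collection, and delete
// dependent instances (here, the Insurance record).
class CascadeSketch {
    static class Insurance { String carrier; Insurance(String c) { carrier = c; } }
    static class Employee {
        String name; Insurance insurance;
        Employee(String n, Insurance i) { name = n; insurance = i; }
    }
    static class Department { Set<Employee> employees = new HashSet<>(); }
    static class Project { Set<Employee> members = new HashSet<>(); }

    // Mimics provider-side cleanup before the Employee itself is removed.
    static void deleteEmployee(Employee e, Department dept, List<Project> projects,
                               Set<Insurance> insuranceTable, Set<Employee> employeeTable) {
        dept.employees.remove(e);                       // reference removed from Department
        for (Project p : projects) p.members.remove(e); // ...and from each Project
        insuranceTable.remove(e.insurance);             // dependent instance: deleted outright
        employeeTable.remove(e);                        // finally the Employee itself
    }
}
```

In a real provider the same walk would be driven by the mapping metadata rather than hard-coded field accesses.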
In the case of object databases, in which there are multiple artifacts, the issue is a bit trickier: there is no standard way to describe the relationship that should be deleted. But such metadata could be added to assist in this regard.
I noticed that a lot of beginners with JDO assume that when deleting the objects the other part, setting references to null, is done automatically, which is not the case. In any case it is something which needs to be dealt with, and it is now left to the developer to do it. What I do now is that for each object I determine which objects are referencing it. This usually comes down to executing queries where I check whether a certain attribute is or contains the object. I figured with all the available JDO metadata this could be something that can be done by the framework? Moreover, this code easily breaks on refactoring. That would also be nicely solved if the JDO metadata is used.
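A metadata-driven lookup along these lines can be approximated in plain Java, with reflection standing in for the JDO metadata. The name getAllReferencingObjects is Christiaan's; everything else below is a hypothetical sketch, and notably it survives field renames, which addresses the refactoring concern.

```java
import java.lang.reflect.Field;
import java.util.*;

// Hypothetical sketch: scan candidate objects' fields (a stand-in for JDO
// metadata) and collect every object that references the target, either
// through a direct reference field or through a Collection field.
class ReferenceFinder {
    static List<Object> getAllReferencingObjects(Object target, Collection<?> candidates)
            throws IllegalAccessException {
        List<Object> referencing = new ArrayList<>();
        for (Object c : candidates) {
            for (Field f : c.getClass().getDeclaredFields()) {
                f.setAccessible(true);
                Object value = f.get(c);
                boolean refs = value == target
                        || (value instanceof Collection && ((Collection<?>) value).contains(target));
                if (refs) { referencing.add(c); break; }  // one hit per candidate is enough
            }
        }
        return referencing;
    }

    // Tiny demo types for the sketch.
    static class Holder { Object ref; Holder(Object r) { ref = r; } }
    static class Bag { List<Object> items = new ArrayList<>(); }
}
```

A real implementation would consult the persistence metadata instead of scanning every field, and would issue queries rather than iterate in memory.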
Do all of your examples use delete as the cause of the issue? Would it be better to describe the delete semantics instead of having the user worry about it?
Craig
Regarding the performance, I am not sure what makes this solution slower than the current solution, doing it yourself, that is, if my assumption that it can be done with the JDO metadata is correct. Collecting the metadata is probably a fraction of the actual time to delete the object in the datastore. So the performance decrease will be caused by querying attributes which could possibly contain the object according to the metadata, but which you as a developer know is not the case. I would say if performance becomes a problem in that case you can always go back to doing it yourself, or provide some extra metadata to getAllReferencingObjects so these classes can be skipped.

This is also the approach I currently take. I first use my generic delete algorithm to recursively determine what needs to be deleted and what references need to be updated. If I notice or know this will be slow due to a large collection of objects, I will switch to a delete-by-query and optimize the number of objects that need to be updated. But again, it is hardly ever considered slow due to the getAllReferencingObjects(), but more due to the actual deleting of large collections of objects using pm.deletePersistentAll().

Kind regards,
Christiaan
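The collect-first, delete-in-bulk approach described above might be sketched like this in plain Java. The dependents map stands in for metadata-derived relationships, and all names are hypothetical; the resulting set is what would then be handed to a single bulk call such as pm.deletePersistentAll.

```java
import java.util.*;

// Hypothetical two-phase delete, phase 1: walk dependent references from the
// root to build the full set of objects to delete, handling cycles. Phase 2
// (not shown) would unlink remaining references and issue one bulk delete.
class RecursiveDelete {
    /** dependents maps an object to the objects that must be deleted with it. */
    static Set<Object> collectDeletions(Object root, Map<Object, List<Object>> dependents) {
        Set<Object> toDelete = new LinkedHashSet<>();
        Deque<Object> work = new ArrayDeque<>();
        work.push(root);
        while (!work.isEmpty()) {
            Object o = work.pop();
            if (toDelete.add(o)) {  // add() returns false on revisit: cycles are safe
                for (Object dep : dependents.getOrDefault(o, Collections.emptyList()))
                    work.push(dep);
            }
        }
        return toDelete;
    }
}
```

Batching the result into one delete call is what keeps the datastore round-trips, rather than the metadata walk, as the dominant cost.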
Craig Russell
Architect, Sun Java Enterprise System http://java.sun.com/products/jdo
408 276-5638 mailto:[EMAIL PROTECTED]
P.S. A good JDO? O, Gasp!