Hi,

our system uses Jackrabbit 2.6.5 with a MySQL database data store. The Jackrabbit DB
schema is about 300 GB in size, most of it in the data store. When we run the Jackrabbit
data store garbage collector, it takes almost 3 days, and running it has a significant
impact on application performance.
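
For reference, this is roughly how we invoke the GC today (a minimal sketch; the
cast to SessionImpl, the helper class name and the 10 ms sleep value are only
illustrative):

import javax.jcr.RepositoryException;
import javax.jcr.Session;

import org.apache.jackrabbit.api.management.DataStoreGarbageCollector;
import org.apache.jackrabbit.core.SessionImpl;

public class DataStoreGc {

    /** Full mark-and-sweep data store GC on an already opened (admin) session. */
    public static void runGc(Session session) throws RepositoryException {
        DataStoreGarbageCollector gc =
                ((SessionImpl) session).createDataStoreGarbageCollector();
        try {
            // Optional throttle: sleep a few ms after each scanned node so the scan
            // puts less load on the live application (the GC itself takes longer).
            gc.setSleepBetweenNodes(10);

            // mark(): iterate over all nodes and record which data store entries
            // are still referenced by a binary property.
            gc.mark();

            // sweep(): delete the data store entries that were not marked.
            gc.sweep();
        } finally {
            gc.close();
        }
    }
}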

Could you please advise what options we have?

Is there some way to split the GC so that it does not have to iterate through the
whole data store in one pass? When the GC does not finish completely, we cannot run
the data store cleanup (sweep), because we cannot be sure what has been scanned and
what has not.

Or is there any other GC implementation?


Thank you very much.

Vlastimil


