On 19.10.2020 at 08:37, Tanvi Shah wrote:
Hi,

We have started the blob garbage collection, and the S3 datastore size is more than 2 TB.

We are facing memory issues while executing the GC, even though we have given the
application 11 GB of memory.


The code is:

final MarkSweepGarbageCollector gc = documentNodeStore.createBlobGarbageCollector(
        seconds, repository.toString(), wb,
        new DefaultStatisticsProvider(Executors.newScheduledThreadPool(1)));

gc.collectGarbage(markOnly);
final OperationsStatsMBean stats = gc.getOperationStats();
log.info("number deleted: " + stats.numDeleted()
        + ", size deleted: " + stats.sizeDeleted());

The exceptions are:
...

It might be necessary to set the JDBC fetch size, so that the driver streams rows
instead of buffering the whole result set in memory (see
<https://stackoverflow.com/questions/3682614/how-to-read-all-rows-from-huge-table>,
around
<https://github.com/apache/jackrabbit-oak/blob/5af43cb29fb6cfc226eac6f89bf4985d7ce6d89f/oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/rdb/RDBDocumentStoreJDBC.java#L590>).
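For reference, the pattern the linked answer describes looks roughly like the
sketch below. This is untested and not the actual Oak code; the table name, column
name, and batch size are placeholders, and disabling auto-commit is a
PostgreSQL-specific requirement for cursor-based streaming:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class FetchSizeSketch {

    // Iterate over a huge table without materializing all rows in memory.
    static long countRows(Connection con, String table) throws SQLException {
        // PostgreSQL only streams results when auto-commit is off.
        con.setAutoCommit(false);
        long count = 0;
        try (Statement st = con.createStatement()) {
            // Hint to the driver: fetch 1000 rows per round trip
            // instead of the whole result set at once.
            st.setFetchSize(1000);
            try (ResultSet rs = st.executeQuery("SELECT ID FROM " + table)) {
                while (rs.next()) {
                    count++;
                }
            }
        }
        return count;
    }
}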

Sorry, I can't test that myself right now.


Best regards, Julian
