mreutegg commented on code in PR #920:
URL: https://github.com/apache/jackrabbit-oak/pull/920#discussion_r1206826064
##########
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/mongo/MongoDocumentStore.java:
##########
@@ -1282,6 +1373,86 @@ private Map<String, NodeDocument> getCachedNodes(Set<String> keys) {
         return nodes;
     }
+    @NotNull
+    private <T extends Document> Map<UpdateOp, T> bulkModify(final Collection<T> collection, final List<UpdateOp> updateOps,
+                                                             final Map<String, T> oldDocs) {
+        Map<String, UpdateOp> bulkOperations = createMap(updateOps);
+        Set<String> lackingDocs = difference(bulkOperations.keySet(), oldDocs.keySet());
+        oldDocs.putAll(findDocuments(collection, lackingDocs));
+
+        CacheChangesTracker tracker = null;
+        if (collection == NODES) {
+            tracker = nodesCache.registerTracker(bulkOperations.keySet());
+        }
+
+        try {
+            final BulkRequestResult bulkResult = sendBulkRequest(collection, bulkOperations.values(), oldDocs, false);
+            final Set<String> potentiallyUpdatedDocsSet = difference(bulkOperations.keySet(), bulkResult.failedUpdates);
+
+            // fetch all the docs which haven't failed, they might have passed
+            final Map<String, T> updatedDocsMap = findDocuments(collection, potentiallyUpdatedDocsSet);
Review Comment:
Isn't this rather expensive? Even in the happy case, this will read all
modified documents again from MongoDB, right?
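
To illustrate the concern: when the old document is already available in `oldDocs`, the post-update state can in principle be derived locally by applying the `UpdateOp` to a copy, rather than re-reading the document from MongoDB. The sketch below is a minimal illustration of that idea and is not code from the PR; the class `LocalApplySketch` and method `applyLocally` are hypothetical names, and it assumes the existing `oak-store-document` helpers `Collection.newDocument`, `Document.deepCopy`, `Document.seal` and `UpdateUtils.applyChanges` behave as their names suggest. It also sidesteps conditional updates, where only the server knows whether the condition held.

```java
// Hypothetical sketch (not from the PR): derive the post-update document
// locally from the cached old document instead of re-reading it from MongoDB.
import org.apache.jackrabbit.oak.plugins.document.Collection;
import org.apache.jackrabbit.oak.plugins.document.Document;
import org.apache.jackrabbit.oak.plugins.document.DocumentStore;
import org.apache.jackrabbit.oak.plugins.document.UpdateOp;
import org.apache.jackrabbit.oak.plugins.document.UpdateUtils;

final class LocalApplySketch {

    /**
     * Returns a copy of {@code oldDoc} with {@code op} applied. Only meant for
     * documents whose bulk write is known to have succeeded, and for ops
     * without conditions that the local copy cannot evaluate.
     */
    static <T extends Document> T applyLocally(DocumentStore store,
                                               Collection<T> collection,
                                               T oldDoc,
                                               UpdateOp op) {
        T newDoc = collection.newDocument(store); // fresh document instance
        oldDoc.deepCopy(newDoc);                  // start from the known old state
        UpdateUtils.applyChanges(newDoc, op);     // replay the update locally
        newDoc.seal();                            // make it immutable for caching
        return newDoc;
    }
}
```

Whether something along these lines is actually safe in the bulk path (for example with concurrent modifications that only show up in modCount) is the author's call; the sketch only illustrates why the extra findDocuments round trip looks avoidable in the happy case.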