thomasmueller commented on code in PR #646:
URL: https://github.com/apache/jackrabbit-oak/pull/646#discussion_r936801356


##########
oak-store-document/src/main/java/org/apache/jackrabbit/oak/plugins/document/mongo/MongoDocumentStore.java:
##########
@@ -1411,7 +1411,7 @@ public <T extends Document> void prefetch(Collection<T> 
collection,
         List<String> resultKeys = new ArrayList<>(keys.size());
         CacheChangesTracker tracker = null;
         if (collection == Collection.NODES) {
-            tracker = nodesCache.registerTracker(keys);
+            tracker = nodesCache.registerTracker(new HashSet<>(keys));

Review Comment:
   Is this to remove duplicates? Using HashSet will re-order the entries, but I guess that's fine.
   
   Possibly it makes sense to sort the entries so that MongoDB can retrieve them faster (a MongoDB-specific optimization); I could imagine that MongoDB doesn't do that itself. Or so that compression over the wire works better (I _think_ LZ4 is used?)
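   To illustrate the trade-offs discussed above, here is a small sketch (not Oak code; the sample keys are made up) comparing the three collection choices: `HashSet` as used in the patch deduplicates but loses order, `LinkedHashSet` deduplicates while preserving first-seen order, and `TreeSet` deduplicates and sorts, which is the variant that might help server-side retrieval or wire compression:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.TreeSet;

public class KeyDedupDemo {
    public static void main(String[] args) {
        // Hypothetical document keys; real Oak keys look like "depth:/path".
        List<String> keys = Arrays.asList("2:/a/x", "1:/a", "2:/a/x", "2:/a/y");

        // LinkedHashSet: removes duplicates, keeps first-seen order.
        List<String> ordered = new ArrayList<>(new LinkedHashSet<>(keys));

        // TreeSet: removes duplicates AND sorts lexicographically,
        // the option speculated about in the comment above.
        List<String> sorted = new ArrayList<>(new TreeSet<>(keys));

        System.out.println(ordered); // [2:/a/x, 1:/a, 2:/a/y]
        System.out.println(sorted);  // [1:/a, 2:/a/x, 2:/a/y]
    }
}
```

   Note that `new HashSet<>(keys)` in the patch makes no ordering guarantee at all, so if sorted retrieval turns out to matter, `TreeSet` would give it for free at the same deduplication cost.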



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
