magibney commented on code in PR #2076:
URL: https://github.com/apache/solr/pull/2076#discussion_r1394780552


##########
solr/core/src/java/org/apache/solr/cluster/placement/plugins/OrderedNodePlacementPlugin.java:
##########
@@ -515,6 +521,7 @@ public final void removeReplica(Replica replica) {
           });
       if (hasReplica.get()) {
         removeProjectedReplicaWeights(replica);
+        allReplicas.remove(replica);

Review Comment:
   Now that we're separately tracking replicas in a top-level map (and assuming 
that this is all accessed by a single thread), I think this method could be 
simplified to:
   ```java
   if (allReplicas.remove(replica)) {
     // [the nested removal logic]
   }
   ```
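   To make the suggestion concrete, here is a minimal, self-contained sketch of that shape (using `String` as a stand-in for Solr's `Replica`, and a comment in place of the nested removal logic, since both live in the real class). The point is that `Set.remove()` already returns whether the element was present, so the separate `hasReplica` bookkeeping becomes unnecessary:
   ```java
   import java.util.HashSet;
   import java.util.Set;

   class WeightedNodeSketch {
     // flattened top-level tracking of all replicas on this node
     private final Set<String> allReplicas = new HashSet<>();

     void addReplica(String replica) {
       allReplicas.add(replica);
     }

     boolean removeReplica(String replica) {
       // Set.remove() returns true only if the replica was actually present,
       // so it can gate the removal logic directly (single-threaded access assumed)
       if (allReplicas.remove(replica)) {
         // [the nested removal logic, e.g. removeProjectedReplicaWeights(replica)]
         return true;
       }
       return false;
     }
   }
   ```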



##########
solr/core/src/java/org/apache/solr/cluster/placement/plugins/OrderedNodePlacementPlugin.java:
##########
@@ -404,20 +404,25 @@ public abstract static class WeightedNode implements Comparable<WeightedNode> {
     private final Node node;
     private final Map<String, Map<String, Set<Replica>>> replicas;
 
+    // a flattened list of all replicas, as computing it from the map could be costly
+    private final Set<Replica> allReplicas;
+
     public WeightedNode(Node node) {
       this.node = node;
       this.replicas = new HashMap<>();
+      this.allReplicas = new HashSet<>();
     }
 
     public Node getNode() {
       return node;
     }
 
     public Set<Replica> getAllReplicasOnNode() {
-      return replicas.values().stream()
-          .flatMap(shard -> shard.values().stream())
-          .flatMap(Collection::stream)
-          .collect(Collectors.toSet());
+      return new HashSet<>(allReplicas);

Review Comment:
   based on how this is being used currently, there's no need to wrap this in a new HashSet. The method is public, so you never know who might call it and what their assumptions might be; but you are replacing `Collectors.toSet()`, which specifically makes no guarantees about mutability/immutability, so I'd be inclined to just do `Collections.unmodifiableSet(allReplicas)`, taking this opportunity to make the API more conservative without breaking any backcompat.
   
   Also, if the motivating concern is "many replicas", the cost of instantiating a new HashSet that we don't really need is not worth it.
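   
   A small sketch of the proposed return style (again with `String` standing in for `Replica`): `Collections.unmodifiableSet` wraps the backing set in an O(1) view rather than copying it, and any caller that tries to mutate the result gets an `UnsupportedOperationException`:
   ```java
   import java.util.Collections;
   import java.util.HashSet;
   import java.util.Set;

   class UnmodifiableViewDemo {
     private final Set<String> allReplicas = new HashSet<>();

     void addReplica(String replica) {
       allReplicas.add(replica);
     }

     Set<String> getAllReplicasOnNode() {
       // read-through view of allReplicas: no per-call copy, and callers
       // cannot mutate the node's internal state through it
       return Collections.unmodifiableSet(allReplicas);
     }
   }
   ```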



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

