huaxiangsun commented on a change in pull request #3724:
URL: https://github.com/apache/hbase/pull/3724#discussion_r736784079



##########
File path: 
hbase-balancer/src/main/java/org/apache/hadoop/hbase/master/balancer/DoubleArrayCost.java
##########
@@ -66,17 +66,21 @@ void applyCostsChange(Consumer<double[]> consumer) {
   }
 
   private static double computeCost(double[] stats) {
+    if (stats == null || stats.length == 0) {
+      return 0;
+    }
     double totalCost = 0;
     double total = getSum(stats);
 
     double count = stats.length;
     double mean = total / count;
-
     for (int i = 0; i < stats.length; i++) {
       double n = stats[i];
       double diff = (mean - n) * (mean - n);
       totalCost += diff;
     }
+    // No need to compute standard deviation with division by cluster size when scaling.
+    totalCost = Math.sqrt(totalCost);

Review comment:
      A bit confused: totalCost is the sum of squared deviations from the mean, so what is the square root of that sum?
   If there is no need to compute the standard deviation, should computing it be avoided in the first place?
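
   To make the question concrete, here is a small standalone sketch (not HBase code; the class and method names are made up for illustration). It mirrors the patched computeCost, which returns the square root of the sum of squared deviations, and compares it with the population standard deviation, which divides by the count before taking the square root. The two differ only by a factor of sqrt(n), which is presumably what "no need ... when scaling" refers to:

```java
// Sketch only: illustrates the quantity computed by the patch vs. the
// population standard deviation. Names here are hypothetical.
public class CostSketch {
  // Mirrors the patched computeCost: sqrt of the sum of squared deviations.
  static double computeCost(double[] stats) {
    if (stats == null || stats.length == 0) {
      return 0;
    }
    double total = 0;
    for (double n : stats) {
      total += n;
    }
    double mean = total / stats.length;
    double totalCost = 0;
    for (double n : stats) {
      double diff = mean - n;
      totalCost += diff * diff;
    }
    return Math.sqrt(totalCost);
  }

  // Population standard deviation: divide by the count before the sqrt.
  static double stdDev(double[] stats) {
    return computeCost(stats) / Math.sqrt(stats.length);
  }

  public static void main(String[] args) {
    double[] stats = {1, 2, 3, 4};
    double cost = computeCost(stats);
    double sd = stdDev(stats);
    // computeCost == stdDev * sqrt(n): they differ only by a factor that
    // depends on the array length (i.e. the cluster size).
    System.out.println(Math.abs(cost - sd * Math.sqrt(stats.length)) < 1e-9);
  }
}
```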
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

