ayushtkn commented on code in PR #6928:
URL: https://github.com/apache/hadoop/pull/6928#discussion_r1769491905
##########
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/erasurecode/ErasureCodingWorker.java:
##########
@@ -172,4 +175,19 @@ public void shutDown() {
   public float getXmitWeight() {
     return xmitWeight;
   }
+
+  public void setStripedReconstructionPoolSize(int size) {
+    Preconditions.checkArgument(size > 0,
+        DFS_DN_EC_RECONSTRUCTION_THREADS_KEY + " should be greater than 0");
+    this.stripedReconstructionPool.setCorePoolSize(size);
+    this.stripedReconstructionPool.setMaximumPoolSize(size);
+  }
+
+  @VisibleForTesting
+  public int getStripedReconstructionPoolSize() {
+    int poolSize = this.stripedReconstructionPool.getCorePoolSize();
+    Preconditions.checkArgument(poolSize ==
+        this.stripedReconstructionPool.getMaximumPoolSize(),
+        "The maximum pool size should be equal to core pool size");
Review Comment:
I think this isn't required; it looks like it is validating the ``ThreadPoolExecutor`` itself, which isn't a Hadoop thing and isn't needed within the scope of this change either.
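
For illustration only (not code from the PR), a minimal sketch of how the getter could read if the cross-check is dropped, assuming `stripedReconstructionPool` is only resized through the setter above, which updates both bounds together:

```java
  // Sketch only: setStripedReconstructionPoolSize() sets the core and maximum
  // pool sizes to the same value, so core == max holds by construction and the
  // getter can simply report the core pool size.
  @VisibleForTesting
  public int getStripedReconstructionPoolSize() {
    return this.stripedReconstructionPool.getCorePoolSize();
  }
```

If the equality still needs to be asserted somewhere, a unit test comparing `getCorePoolSize()` and `getMaximumPoolSize()` on the executor would cover it without putting a `Preconditions` check on the production read path.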
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]