prakharjain09 commented on a change in pull request #28370:
URL: https://github.com/apache/spark/pull/28370#discussion_r416781804



##########
File path: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
##########
@@ -1829,7 +1895,52 @@ private[spark] class BlockManager(
     data.dispose()
   }
 
+  /**
+   * Class to handle block manager decommissioning retries.
+   * It creates a Thread to retry offloading all RDD cache blocks.
+   */
+  private class BlockManagerDecommissionManager(conf: SparkConf) {
+    @volatile private var stopped = false
+    private val blockReplicationThread = new Thread {
+      override def run(): Unit = {
+        while (blockManagerDecommissioning && !stopped) {
+          try {
+            logDebug("Attempting to replicate all cached RDD blocks")
+            decommissionRddCacheBlocks()
+            logInfo("Attempt to replicate all cached blocks done")
+            val sleepInterval = conf.get(
+              config.STORAGE_DECOMMISSION_REPLICATION_REATTEMPT_INTERVAL)
+            Thread.sleep(sleepInterval)
+          } catch {
+            case _: InterruptedException =>
+              // no-op
+            case NonFatal(e) =>
+              logError("Error occurred while trying to " +
+                "replicate cached RDD blocks for block manager 
decommissioning", e)
+          }
+        }
+      }
+    }
+    blockReplicationThread.setDaemon(true)
+    blockReplicationThread.setName("block-replication-thread")
+
+    def start(): Unit = {
+      logInfo("Starting block replication thread")
+      blockReplicationThread.start()
+    }
+
+    def stop(): Unit = {
+      if (!stopped) {
+        stopped = true
+        logInfo("Stopping block replication thread")
+        blockReplicationThread.interrupt()
+        blockReplicationThread.join()

Review comment:
       Yeah - but none of the tests failing in the Jenkins build are ones written in this PR. So that means all of those tests must be running with the storage-decommissioning flag disabled? (See the sketch after the links below for the gating I'm assuming.)
   
   1. https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/121988/testReport/
   2. https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/121917/testReport/
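
   A minimal, self-contained sketch of that assumption (the config key `spark.storage.decommission.enabled` and the `startIfEnabled` helper are my illustrations, not taken from this diff):

```scala
import org.apache.spark.SparkConf

// Sketch only: mirrors the retry thread above, but makes the flag gating
// explicit. Suites running with the default (false) never start the thread.
class DecommissionGateSketch(conf: SparkConf) {
  @volatile private var stopped = false

  private val blockReplicationThread = new Thread("block-replication-thread") {
    override def run(): Unit = {
      while (!stopped) {
        // Placeholder for decommissionRddCacheBlocks() plus the sleep interval.
        Thread.sleep(1000)
      }
    }
  }
  blockReplicationThread.setDaemon(true)

  // Hypothetical helper: only start the thread when the feature flag is on.
  def startIfEnabled(): Unit = {
    if (conf.getBoolean("spark.storage.decommission.enabled", defaultValue = false)) {
      blockReplicationThread.start()
    }
  }

  def stop(): Unit = {
    stopped = true
    blockReplicationThread.interrupt()
    blockReplicationThread.join()
  }
}
```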
   
   Is the same Spark application going to be used across all these tests? I was assuming that a new Spark application would be created and destroyed for my specific tests (since BlockManagerDecommissionSuite creates a new SparkContext as part of each test). A sketch of the per-test pattern I have in mind is below.
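
   As a hedged sketch (this is not the actual BlockManagerDecommissionSuite code, and the config key is again my assumption):

```scala
import org.apache.spark.{LocalSparkContext, SparkConf, SparkContext, SparkFunSuite}

// Sketch only: each test builds its own SparkContext with the flag enabled;
// the LocalSparkContext trait stops it after the test, so nothing leaks into
// other suites that run with the flag at its default.
class DecommissionPerTestSketchSuite extends SparkFunSuite with LocalSparkContext {
  test("decommission test gets a fresh Spark application") {
    val conf = new SparkConf()
      .setMaster("local-cluster[2, 1, 1024]")
      .setAppName("decommission-sketch")
      .set("spark.storage.decommission.enabled", "true") // assumed config key
    sc = new SparkContext(conf) // assigned so LocalSparkContext can clean it up
    assert(sc.getConf.getBoolean("spark.storage.decommission.enabled", false))
  }
}
```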



