shahidki31 commented on a change in pull request #24398:
[SPARK-27468][Core][WEBUI] BlockUpdate replication event shouldn't overwrite storage level description in the UI
URL: https://github.com/apache/spark/pull/24398#discussion_r281041842
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/status/AppStatusListener.scala
 ##########
 @@ -917,8 +917,24 @@ private[spark] class AppStatusListener(
     // Update the block entry in the RDD info, keeping track of the deltas above so that we
     // can update the executor information too.
     liveRDDs.get(block.rddId).foreach { rdd =>
+
       if (updatedStorageLevel.isDefined) {
-        rdd.setStorageLevel(updatedStorageLevel.get)
 +        // Replicated block update events will have `storageLevel.replication = 1`.
 +        // To avoid a block replication event overwriting the storage level in the store,
 +        // we need to check whether the event is a block replication event or not.
 +        // The default value of `storageInfo.replication` is 1, and hence if
 +        // `storageLevel.replication = 2`, the replicated events won't overwrite the value in the store.
 +        val storageInfo = rdd.storageInfo
 +        val isReplicatedBlockUpdateEvent = storageLevel.replication < storageInfo.replication &&
 
 Review comment:
   Hi, this line checks whether the storageLevel is valid or not.
   https://github.com/apache/spark/blob/d9bcacf94b93fe76542b5c1fd852559075ef6faa/core/src/main/scala/org/apache/spark/status/AppStatusListener.scala#L916-L920
   If it is not valid, then `updatedStorageLevel` will be `None`, so execution won't reach this line (L-928).
   Thanks
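
   For reference, a minimal sketch of the validity guard described above, not the actual Spark code (the helper name and object wrapper are assumptions for illustration; the real logic is at the linked lines in AppStatusListener.scala):

   ```scala
   import org.apache.spark.storage.StorageLevel

   object StorageLevelGuardSketch {
     // Hypothetical helper mirroring the validity check at the linked lines:
     // an invalid storage level yields None, so the `isReplicatedBlockUpdateEvent`
     // comparison added in this diff is never evaluated for such events.
     def updatedStorageLevelFor(level: StorageLevel): Option[String] = {
       if (level.isValid) {
         Some(level.description) // e.g. "Memory Deserialized 2x Replicated"
       } else {
         None
       }
     }
   }
   ```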
   
