warrenzhu25 commented on PR #37603:
URL: https://github.com/apache/spark/pull/37603#issuecomment-1230879241

   > This PR seems to have two themes. Do you think you can spin off this 
counting part first, @warrenzhu25 ?
   > 
   > ```scala
   > + private[storage] val numDeletedShuffles = new AtomicInteger(0)
   > ```
   > 
   > ```scala
   > + numDeletedShuffles.incrementAndGet()
   > ```
   > 
   > ```scala
   > + numDeletedShuffles.foreach { s =>
   > +   assert(bmDecomManager.numDeletedShuffles.get() == s)
   > + }
   > ```
   > 
   > ```scala
   > +    validateDecommissionTimestampsOnManager(
   > +       bmDecomManager,
   > +       fail = false,
   > +       numDeletedShuffles = Some(1))
   > ```
   
   I added `numDeletedShuffles` here for testing purposes. In the 
current implementation, both deleted and failed shuffles are counted as 
migrated, so there is no way to distinguish between them in a test case. 
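   To illustrate the point, here is a minimal, hypothetical sketch (the object and method names are illustrative, not the PR's actual code) of why a dedicated `AtomicInteger` counter lets a test tell deleted shuffles apart from failed ones when both fold into the same "migrated" total:

   ```scala
   import java.util.concurrent.atomic.AtomicInteger

   object ShuffleCounterSketch {
     // Both deleted and failed shuffles end up in this aggregate counter,
     // so it alone cannot distinguish the two outcomes.
     private val numMigratedShuffles = new AtomicInteger(0)
     // A separate counter tracks only the deleted shuffles,
     // mirroring the `numDeletedShuffles` field proposed in the PR.
     private val numDeletedShuffles = new AtomicInteger(0)

     def recordDeleted(): Unit = {
       numDeletedShuffles.incrementAndGet()
       numMigratedShuffles.incrementAndGet()
     }

     def recordFailed(): Unit = {
       // A failed shuffle is still counted as migrated,
       // but leaves the deleted counter untouched.
       numMigratedShuffles.incrementAndGet()
     }

     def deleted: Int = numDeletedShuffles.get()
     def migrated: Int = numMigratedShuffles.get()
   }
   ```

   With one deleted and one failed shuffle, `migrated` reads 2 for both paths, while `deleted` reads 1, which is exactly the distinction the test assertions need.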


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
