MaxGekk commented on a change in pull request #24586: [SPARK-27682][CORE][GRAPHX][MLLIB] Replace use of collections and methods that will be removed in Scala 2.13 with work-alikes URL: https://github.com/apache/spark/pull/24586#discussion_r283111395
##########
File path: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
##########

```diff
@@ -468,21 +468,20 @@ private[spark] class DAGScheduler(
   /** Find ancestor shuffle dependencies that are not registered in shuffleToMapStage yet */
   private def getMissingAncestorShuffleDependencies(
-      rdd: RDD[_]): ArrayStack[ShuffleDependency[_, _, _]] = {
-    val ancestors = new ArrayStack[ShuffleDependency[_, _, _]]
+      rdd: RDD[_]): ListBuffer[ShuffleDependency[_, _, _]] = {
+    val ancestors = new ListBuffer[ShuffleDependency[_, _, _]]
     val visited = new HashSet[RDD[_]]
     // We are manually maintaining a stack here to prevent StackOverflowError
     // caused by recursively visiting
-    val waitingForVisit = new ArrayStack[RDD[_]]
-    waitingForVisit.push(rdd)
+    val waitingForVisit = ListBuffer[RDD[_]](rdd)
     while (waitingForVisit.nonEmpty) {
-      val toVisit = waitingForVisit.pop()
+      val toVisit = waitingForVisit.remove(0)
       if (!visited(toVisit)) {
         visited += toVisit
         getShuffleDependencies(toVisit).foreach { shuffleDep =>
           if (!shuffleIdToMapStage.contains(shuffleDep.shuffleId)) {
-            ancestors.push(shuffleDep)
-            waitingForVisit.push(shuffleDep.rdd)
+            ancestors.+=:(shuffleDep)
```

Review comment: You can write:
```scala
shuffleDep +=: ancestors
```
For reference, any operator ending in `:` is right-associative in Scala, so the method is invoked on the right-hand operand (`ancestors`).
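The equivalence the comment describes can be checked with a small, self-contained sketch. It uses `Int` elements rather than `ShuffleDependency` purely for brevity, and also exercises `remove(0)`, which the diff uses to mimic the front-of-buffer "pop" that `ArrayStack.pop()` previously provided:

```scala
import scala.collection.mutable.ListBuffer

object PrependDemo {
  def main(args: Array[String]): Unit = {
    val ancestors = ListBuffer[Int](2, 3)

    // An operator ending in ':' is right-associative, so the method is
    // invoked on the right-hand operand. The two lines below are equivalent:
    //   ancestors.+=:(1)
    //   1 +=: ancestors
    1 +=: ancestors
    assert(ancestors == ListBuffer(1, 2, 3))

    // remove(0) takes from the front, matching the stack (LIFO) discipline
    // that waitingForVisit.pop() had before the ArrayStack removal.
    val head = ancestors.remove(0)
    assert(head == 1)

    println(ancestors)  // ListBuffer(2, 3)
  }
}
```

Note that both `remove(0)` and `+=:` on a `ListBuffer` touch only the head of the underlying linked list, so this work-alike keeps the O(1) push/pop behavior of the stack it replaces.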