dongjoon-hyun commented on a change in pull request #24113: [MINOR][CORE] Replace Scala forkjoin package to Java bundled one
URL: https://github.com/apache/spark/pull/24113#discussion_r266267196
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/util/ThreadUtils.scala
 ##########
 @@ -181,17 +180,17 @@ private[spark] object ThreadUtils {
   }
 
   /**
-   * Construct a new Scala ForkJoinPool with a specified max parallelism and name prefix.
+   * Construct a new ForkJoinPool with a specified max parallelism and name prefix.
    */
-  def newForkJoinPool(prefix: String, maxThreadNumber: Int): SForkJoinPool = {
+  def newForkJoinPool(prefix: String, maxThreadNumber: Int): ForkJoinPool = {
 
 Review comment:
   Is this okay for Scala 2.11, too? We still need to support Scala 2.11, don't we?
   
   According to SPARK-13398, this was a hack for Scala 2.11: we couldn't use Java's ForkJoinPool directly in Scala 2.11 because it uses an ExecutionContext that reports the system parallelism.
   
   Hi, @holdenk. Could you give us your opinion on this? Is the old issue resolved now? Is it safe to revert the old SPARK-13398 workaround?
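   For context, the Java-bundled `java.util.concurrent.ForkJoinPool` accepts a custom `ForkJoinWorkerThreadFactory`, which is how a thread-name prefix can be applied; this is the general shape of what `ThreadUtils.newForkJoinPool` does after this change. A minimal Java sketch follows (the class name and structure here are illustrative, not the actual Spark code):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.ForkJoinWorkerThread;
import java.util.concurrent.atomic.AtomicInteger;

public class NamedForkJoinPool {

    // Build a java.util.concurrent.ForkJoinPool whose worker threads are
    // named "<prefix>-<n>", using the default factory to create threads
    // and then renaming them. Illustrative sketch only.
    public static ForkJoinPool newForkJoinPool(String prefix, int maxThreadNumber) {
        final AtomicInteger counter = new AtomicInteger(0);
        ForkJoinPool.ForkJoinWorkerThreadFactory factory = pool -> {
            ForkJoinWorkerThread t =
                ForkJoinPool.defaultForkJoinWorkerThreadFactory.newThread(pool);
            t.setName(prefix + "-" + counter.incrementAndGet());
            return t;
        };
        // parallelism, factory, uncaught-exception handler, asyncMode
        return new ForkJoinPool(maxThreadNumber, factory, null, false);
    }

    public static void main(String[] args) throws Exception {
        ForkJoinPool pool = newForkJoinPool("demo", 4);
        Callable<String> getName = () -> Thread.currentThread().getName();
        // The worker thread's name should carry the "demo-" prefix.
        System.out.println(pool.submit(getName).get());
        pool.shutdown();
    }
}
```

   Because only the Java API is involved, this approach should behave the same regardless of the Scala version, which is what the question above is about.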

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
