GitHub user zsxwing commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22770#discussion_r227056168
  
    --- Diff: core/src/main/scala/org/apache/spark/api/python/PythonWorkerFactory.scala ---
    @@ -31,15 +32,15 @@ import org.apache.spark.security.SocketAuthHelper
     import org.apache.spark.util.{RedirectThread, Utils}
     
     private[spark] class PythonWorkerFactory(pythonExec: String, envVars: Map[String, String])
    -  extends Logging {
    +  extends Logging { self =>
     
       import PythonWorkerFactory._
     
       // Because forking processes from Java is expensive, we prefer to launch a single Python daemon,
       // pyspark/daemon.py (by default) and tell it to fork new workers for our tasks. This daemon
       // currently only works on UNIX-based systems now because it uses signals for child management,
       // so we can also fall back to launching workers, pyspark/worker.py (by default) directly.
    -  val useDaemon = {
    +  private val useDaemon = {
    --- End diff ---
    
    Fixing these since I'm touching a lot of fields in this file.
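
For readers unfamiliar with the `self =>` syntax added on the `+` line above: it declares an alias for the enclosing instance, so nested classes or threads defined inside `PythonWorkerFactory` can refer to the outer object without writing `PythonWorkerFactory.this`. Below is a minimal, self-contained sketch of that pattern; the class and field names are illustrative only and are not taken from the PR.

    // Sketch of the `self =>` alias pattern (illustrative names, not from the PR).
    class Outer { self =>

      private val label = "outer"

      // Inside the anonymous class, `this` refers to the Runnable;
      // `self` still denotes the enclosing Outer instance.
      private val task = new Runnable {
        override def run(): Unit = println(self.label)
      }

      def start(): Unit = {
        val t = new Thread(task)
        t.start()
        t.join()
      }
    }

    object Demo extends App {
      new Outer().start()   // prints "outer"
    }

The `private val useDaemon` change is a separate visibility cleanup; the author's remark above suggests these field-visibility fixes are being made opportunistically while many fields in this file are already being touched.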

