GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/19782

    [SPARK-22554][PYTHON] Add a config to control if PySpark should use daemon 
or not

    ## What changes were proposed in this pull request?
    
    This PR proposes to add a flag to control whether PySpark should use a 
daemon or not.
    
    SparkR already has such a flag, useDaemon:
    
https://github.com/apache/spark/blob/478fbc866fbfdb4439788583281863ecea14e8af/core/src/main/scala/org/apache/spark/api/r/RRunner.scala#L362
    
    It'd be great to have this flag in PySpark too. It makes it easier to test 
Windows-specific issues.
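    
    For illustration, a flag like this would typically be set as a Spark 
configuration property at submit time. This is only a sketch: the property 
name "spark.python.use.daemon" below is an assumption (mirroring SparkR's 
useDaemon), not confirmed by this PR description.
    
    # Hypothetical usage: launch a PySpark app with the daemon disabled,
    # so Python workers are forked directly instead of via pyspark/daemon.py.
    # The config key "spark.python.use.daemon" is assumed, not verified.
    ./bin/spark-submit \
      --conf spark.python.use.daemon=false \
      my_app.py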
    
    ## How was this patch tested?
    
    Manually tested.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark use-daemon-flag

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19782.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19782
    
----
commit f41698e330c517830a90309a022b072ea6406dcb
Author: hyukjinkwon <gurwls...@gmail.com>
Date:   2017-11-19T05:10:19Z

    Add a config to control if PySpark should use daemon or not

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org