Hyukjin Kwon created SPARK-22554:
------------------------------------

             Summary: Add a config to control if PySpark should use daemon or not
                 Key: SPARK-22554
                 URL: https://issues.apache.org/jira/browse/SPARK-22554
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
    Affects Versions: 2.3.0
            Reporter: Hyukjin Kwon
            Priority: Trivial


Actually, SparkR already has a flag for {{useDaemon}}:

https://github.com/apache/spark/blob/478fbc866fbfdb4439788583281863ecea14e8af/core/src/main/scala/org/apache/spark/api/r/RRunner.scala#L362

It'd be great if we had this flag too. It would make it easier to test Windows-specific issues.

This is also partly for running Python coverage without extra code changes. I know a hacky way to run this:

https://github.com/apache/spark/pull/19630#issuecomment-345490662
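As a rough illustration, the flag could work the same way SparkR's {{useDaemon}} does: always off on Windows (the daemon relies on fork()), otherwise governed by a config entry. This is only a sketch; the key name {{spark.python.use.daemon}} and the helper below are assumptions for illustration, not the final implementation:

```python
# Hedged sketch of a useDaemon-style check for PySpark.
# The config key "spark.python.use.daemon" is an assumed name, not the
# final one; conf is modeled as a plain dict of string settings.
import sys

def use_daemon(conf: dict) -> bool:
    """Return True if the PySpark worker daemon should be used.

    The daemon relies on fork(), which is unavailable on Windows, so it
    is always disabled there; otherwise the (hypothetical) flag decides,
    defaulting to True to preserve current behavior.
    """
    if sys.platform == "win32":
        return False
    return conf.get("spark.python.use.daemon", "true").lower() == "true"

print(use_daemon({"spark.python.use.daemon": "false"}))  # False: daemon disabled
```

Defaulting to true keeps existing deployments unchanged, while setting the flag to false would exercise the non-daemon worker path on any platform, which is exactly what the Windows testing and coverage use cases need.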




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
