Github user jiangxb1987 commented on a diff in the pull request:
https://github.com/apache/spark/pull/21567#discussion_r196431134
--- Diff: core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala ---
@@ -354,7 +355,8 @@ private[spark] abstract class BasePythonRunner[IN, OUT](
extends Thread(s"Worker Monitor for $pythonExec") {
    /** How long to wait before killing the python worker if a task cannot be interrupted. */
-   private val taskKillTimeout = env.conf.getTimeAsMs("spark.python.task.killTimeout", "2s")
+   private val taskKillTimeoutMs = env.conf.getTimeAsSeconds("spark.python.task.killTimeout",
--- End diff ---
The conf is not documented, but I think it's designed to accept values like 1.5s.
cc @zsxwing
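
For context, a minimal sketch of the difference between the two SparkConf accessors the hunk switches between. The standalone demo object and the direct SparkConf.set call are illustrative only (the PR reads the value from env.conf inside the monitor thread); only getTimeAsMs and getTimeAsSeconds are taken from the diff above.

    import org.apache.spark.SparkConf

    object TimeConfSketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical setup: set the conf explicitly instead of reading it
        // from a running SparkEnv as BasePythonRunner does.
        val conf = new SparkConf()
          .set("spark.python.task.killTimeout", "2s")

        // getTimeAsMs keeps millisecond granularity: "2s" -> 2000.
        val asMs = conf.getTimeAsMs("spark.python.task.killTimeout", "2s")

        // getTimeAsSeconds returns whole seconds: "2s" -> 2; any sub-second
        // component of the configured value does not survive a
        // seconds-granularity read.
        val asSeconds = conf.getTimeAsSeconds("spark.python.task.killTimeout", "2s")

        println(s"asMs = $asMs, asSeconds = $asSeconds")
      }
    }

Running this with spark-core on the classpath prints asMs = 2000 and asSeconds = 2, which is the granularity difference the renamed taskKillTimeoutMs / getTimeAsSeconds combination raises.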
---