ueshin opened a new pull request, #52749:
URL: https://github.com/apache/spark/pull/52749
### What changes were proposed in this pull request?
Uses a different error message when the Python worker is killed on idle timeout.
### Why are the changes needed?
Currently, the error message shown when the Python worker is killed on idle timeout is the same as the one shown when the
Python worker crashes.
```py
>>> from pyspark.sql.functions import udf
>>> import time
>>>
>>> @udf
... def f(x):
...     time.sleep(2)
...     return str(x)
...
>>> spark.conf.set("spark.sql.execution.pyspark.udf.idleTimeoutSeconds", "1s")
>>> spark.conf.set("spark.sql.execution.pyspark.udf.killOnIdleTimeout", "true")
>>>
>>> spark.range(1).select(f("id")).show()
25/10/27 16:31:16 WARN PythonUDFWithNamedArgumentsRunner: Idle timeout reached for Python worker (timeout: 1 seconds). No data received from the worker process: handle.map(_.isAlive) = Some(true), channel.isConnected = true, channel.isBlocking = false, selector.isOpen = true, selectionKey.isValid = true, selectionKey.interestOps = 1, hasInputs = false
25/10/27 16:31:16 WARN PythonUDFWithNamedArgumentsRunner: Terminating Python worker process due to idle timeout (timeout: 1 seconds)
25/10/27 16:31:16 ERROR Executor: Exception in task 15.0 in stage 0.0 (TID 15)
org.apache.spark.SparkException: Python worker exited unexpectedly (crashed). Consider setting 'spark.sql.execution.pyspark.udf.faulthandler.enabled' or 'spark.python.worker.faulthandler.enabled' configuration to 'true' for the better Python traceback.
...
```
It should show a different message to distinguish the cause:
```py
25/10/27 16:34:55 WARN PythonUDFWithNamedArgumentsRunner: Idle timeout reached for Python worker (timeout: 1 seconds). No data received from the worker process: handle.map(_.isAlive) = Some(true), channel.isConnected = true, channel.isBlocking = false, selector.isOpen = true, selectionKey.isValid = true, selectionKey.interestOps = 1, hasInputs = false
25/10/27 16:34:55 WARN PythonUDFWithNamedArgumentsRunner: Terminating Python worker process due to idle timeout (timeout: 1 seconds)
25/10/27 16:34:55 ERROR Executor: Exception in task 15.0 in stage 0.0 (TID 15)
org.apache.spark.SparkException: Python worker process terminated due to idle timeout (timeout: 1 seconds)
...
```
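For reference, here is a minimal sketch of how user code could distinguish the idle-timeout termination from a genuine worker crash once this change is in place. It reuses the `spark` session, the UDF `f`, and the configurations from the reproduction above; matching on the message string is only an illustration based on the expected output, not an API introduced by this PR.
```py
# Sketch only: assumes `spark` and `f` from the reproduction above, and that
# the SparkException surfaced to Python contains "idle timeout" as shown in
# the expected output.
try:
    spark.range(1).select(f("id")).show()
except Exception as e:
    if "idle timeout" in str(e):
        print("Python worker was terminated due to idle timeout")
    else:
        raise
```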
### Does this PR introduce _any_ user-facing change?
Yes, the error message shown when the Python worker is killed on idle timeout is now different.
### How was this patch tested?
Modified the related tests.
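As an illustration only, the following is a hedged sketch of the kind of assertion such a test might make; the class and method names are hypothetical and are not the actual tests modified in this PR.
```py
# Hypothetical test sketch; names are illustrative, not the tests changed here.
import time
import unittest

from pyspark.sql import SparkSession
from pyspark.sql.functions import udf


class IdleTimeoutErrorMessageTest(unittest.TestCase):
    def test_kill_on_idle_timeout_message(self):
        spark = SparkSession.builder.master("local[1]").getOrCreate()
        spark.conf.set("spark.sql.execution.pyspark.udf.idleTimeoutSeconds", "1s")
        spark.conf.set("spark.sql.execution.pyspark.udf.killOnIdleTimeout", "true")

        @udf
        def f(x):
            time.sleep(2)
            return str(x)

        # Assumes the task failure propagates to the driver as an exception
        # carrying the new idle-timeout message rather than the crash message.
        with self.assertRaises(Exception) as cm:
            spark.range(1).select(f("id")).show()
        self.assertIn("idle timeout", str(cm.exception))
```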
### Was this patch authored or co-authored using generative AI tooling?
No.