[
https://issues.apache.org/jira/browse/SPARK-37004?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon resolved SPARK-37004.
----------------------------------
Fix Version/s: 3.2.1, 3.3.0
Assignee: Hyukjin Kwon
Resolution: Fixed
Fixed in https://github.com/apache/spark/pull/34814
> Job cancellation causes py4j errors on Jupyter due to pinned thread mode
> ------------------------------------------------------------------------
>
> Key: SPARK-37004
> URL: https://issues.apache.org/jira/browse/SPARK-37004
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 3.2.0
> Reporter: Xiangrui Meng
> Assignee: Hyukjin Kwon
> Priority: Blocker
> Fix For: 3.2.1, 3.3.0
>
> Attachments: pinned.ipynb
>
>
> Spark 3.2.0 turned on py4j pinned thread mode by default (SPARK-35303).
> However, in a Jupyter notebook, after I cancel (interrupt) a long-running
> Spark job, the next Spark command fails with py4j errors. See the
> attached notebook for a repro.
> I cannot reproduce the issue after turning off pinned thread mode.
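A minimal sketch of the workaround the reporter describes, assuming the standard `PYSPARK_PIN_THREAD` environment variable is used to disable pinned thread mode. It must be set before the JVM gateway is launched, so the SparkSession lines are left commented to keep the sketch self-contained:

```python
import os

# Workaround sketch: disable py4j pinned thread mode, which Spark 3.2.0
# enables by default (SPARK-35303). PYSPARK_PIN_THREAD must be set in the
# environment before the py4j gateway / JVM starts, i.e. before any
# SparkContext or SparkSession is created.
os.environ["PYSPARK_PIN_THREAD"] = "false"

# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
# With pinned thread mode off, interrupting a long-running job in Jupyter
# should not leave subsequent Spark commands failing with py4j errors.
```

Note this only sidesteps the bug; the actual fix in the linked PR lands in 3.2.1 and 3.3.0.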
--
This message was sent by Atlassian Jira
(v8.20.1#820001)