Sergey created SPARK-34674:
------------------------------
Summary: Spark app on k8s doesn't terminate without a call to the
sparkContext.stop() method
Key: SPARK-34674
URL: https://issues.apache.org/jira/browse/SPARK-34674
Project: Spark
Issue Type: Bug
Components: Kubernetes
Affects Versions: 3.1.1
Reporter: Sergey
Hello!
I have run into a problem: if I don't call sparkContext.stop() explicitly, the
Spark driver process doesn't terminate even after its main method has completed.
This behaviour differs from Spark on YARN, where manually stopping the
SparkContext is not required.
It looks like the problem is caused by non-daemon threads, which keep the
driver JVM process from exiting.
If I don't call sparkContext.stop(), I see at least two such non-daemon threads:
{code:java}
Thread[OkHttp kubernetes.default.svc,5,main]
Thread[OkHttp kubernetes.default.svc Writer,5,main]
{code}
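As a workaround I wrap the job in try/finally so that stop() is always called. A minimal sketch of what I mean (the app name and job body below are just placeholders, not my real application):
{code:java}
import org.apache.spark.sql.SparkSession;

public class ExampleApp {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("example-app")   // placeholder name
                .getOrCreate();
        try {
            // ... the actual job; a trivial action here just for illustration ...
            spark.range(10).count();
        } finally {
            // Without this explicit stop() the driver JVM on k8s keeps running,
            // apparently because of the non-daemon OkHttp threads listed above.
            spark.stop();
        }
    }
}
{code}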
Could you please tell me whether it is possible to solve this problem?
I am using the Docker image from the official spark-3.1.1 hadoop3.2 release.