GitHub user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19845
@imatiach-msft, ah, I think it's not about SparkContext but SparkSession
(instantiating SparkSession(...) directly, to be clear), which seems to cause
multiple Hive clients when Hive support is enabled.
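For context, here's a minimal sketch of the two construction paths (the variable names are illustrative, and the note about multiple Hive clients reflects the observation above rather than a documented guarantee):

```python
from pyspark import SparkContext
from pyspark.sql import SparkSession

sc = SparkContext.getOrCreate()

# Calling the constructor directly builds a brand-new session each time,
# which appears to spin up a separate Hive client per session when Hive
# support is enabled.
spark_a = SparkSession(sc)
spark_b = SparkSession(sc)  # another session

# The builder reuses an existing session instead of creating a new one.
spark_c = SparkSession.builder.getOrCreate()
```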
I think most of PySpark's unit tests do not enable Hive support by
default. Also, I double-checked by running this multiple times, and it seems
fine after this fix as well.
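To be concrete, a test that actually needs Hive support would opt in explicitly through the builder; this is a hedged sketch, and the `master`/`appName` values are assumptions:

```python
from pyspark.sql import SparkSession

# Most PySpark unit tests skip enableHiveSupport() and use a plain
# session; enabling it is an explicit opt-in like this.
spark = (SparkSession.builder
         .master("local[4]")     # assumed local test master
         .appName("hive-test")   # illustrative app name
         .enableHiveSupport()
         .getOrCreate())
```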
Let me keep my eyes on the Jenkins tests for this. Basically, the tests are
run in parallel on Jenkins.