GitHub user pan3793 added a comment to the discussion: Connecting PySpark with 
Hive tables

One obvious issue: you are using a Scala 2.12 build, 
`kyuubi-extension-spark-jdbc-dialect_2.12-1.10.2.jar`, while Spark 4.0 only 
supports Scala 2.13. However, I don't think this is what causes the NPE, and 
the single-line error message does not help much. (I'm not familiar with 
PySpark, so I may not be able to provide further guidance for diagnosis.)
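One quick way to spot this kind of mismatch is to compare the Scala binary-version suffix in the jar name (the `_2.12` part) against the Scala version your Spark distribution was built with. The helper below is a hypothetical sketch, not part of Kyuubi or Spark:

```python
import re
from typing import Optional

def scala_binary_version(jar_name: str) -> Optional[str]:
    """Extract the Scala binary-version suffix (e.g. '2.12') from a jar name.

    Artifact names built with sbt/Maven cross-versioning look like
    '<artifact>_<scala-binary-version>-<artifact-version>.jar'.
    """
    m = re.search(r"_(\d+\.\d+)-", jar_name)
    return m.group(1) if m else None

jar = "kyuubi-extension-spark-jdbc-dialect_2.12-1.10.2.jar"
required = "2.13"  # Spark 4.0 ships Scala 2.13 builds only
found = scala_binary_version(jar)
if found != required:
    print(f"jar is built for Scala {found}, but Spark 4.0 requires {required}; "
          f"use the _{required} artifact instead")
```

Inside a running PySpark session you can also confirm the JVM-side Scala version with `spark.sparkContext._jvm.scala.util.Properties.versionString()`.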

GitHub link: 
https://github.com/apache/kyuubi/discussions/7240#discussioncomment-14883242

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: 
[email protected]

