GitHub user zhengsg deleted a comment on the discussion: How to configure 
SparkSQL to access the Hive catalog in Gravitino?

I believe my Hive JAR versions are correct. I copied the necessary JARs one by 
one from the Hive metastore service's runtime directory into a separate 
directory so SparkSQL can operate on Hive tables, and then assigned that 
directory path to spark.sql.hive.metastore.jars. So where exactly does the 
problem lie? I suggest that the Gravitino Spark plugin focus on ensuring proper 
compatibility and adaptation rather than relying on users configuring 
spark.sql.hive.metastore.jars, since that approach is prone to compatibility 
conflicts with Spark.
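
For reference, the setup described above usually looks something like the 
following sketch. The Hive version and the JAR directory are placeholders, and 
the split into a `path` mode plus a separate `jars.path` property applies to 
Spark 3.1+; older Spark versions accepted a classpath directly in 
spark.sql.hive.metastore.jars.

```shell
# Sketch only: versions and paths below are hypothetical examples.
# Tell Spark which Hive metastore client version to use and where
# to load its JARs from, instead of Spark's builtin Hive classes.
./bin/spark-sql \
  --conf spark.sql.hive.metastore.version=2.3.9 \
  --conf spark.sql.hive.metastore.jars=path \
  --conf spark.sql.hive.metastore.jars.path=file:///opt/hive-metastore-jars/*.jar
```

If any JAR in that directory was built against a different Hive release than 
the one declared in spark.sql.hive.metastore.version, class-loading conflicts 
of the kind described above are a common result.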

GitHub link: 
https://github.com/apache/gravitino/discussions/9161#discussioncomment-15010333

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]
