medina325 commented on issue #6716:
URL: https://github.com/apache/kyuubi/issues/6716#issuecomment-3035239759

   That's good to know. However, I just can't seem to configure Apache Kyuubi 
properly to work with my standalone Spark cluster.
   
   I can't, for the life of me, figure out how to properly set environment 
variables like:
   
   - JAVA_HOME
   - SPARK_HOME
   - SPARK_ENGINE_HOME
   - HIVE_HOME
   - HADOOP_CONF_DIR
   
   Should these paths refer to Kyuubi's container filesystem or the Spark 
worker’s?
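   
   For reference, this is roughly what I have in `conf/kyuubi-env.sh` right 
now. My working assumption is that these are read by the Kyuubi server when it 
runs `spark-submit`, so they should point to paths inside the Kyuubi container 
(the paths below are guesses based on the image layouts, not something I've 
verified):
   
   ```shell
   # conf/kyuubi-env.sh inside the apache/kyuubi container.
   # Assumption: these must exist on the Kyuubi container's own
   # filesystem, since the server uses them to launch the engine.
   export JAVA_HOME=/opt/java/openjdk          # JDK bundled in the image (assumed path)
   export SPARK_HOME=/opt/spark                # Spark client used for spark-submit (assumed path)
   export HADOOP_CONF_DIR=/etc/hadoop/conf     # only if Hadoop configs are mounted here
   ```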
   
   I’m using:
   - bitnami/spark:3.5 for Spark
   - apache/kyuubi:1.9.4-all for Kyuubi
   
   If I use the built-in Spark binaries in the Kyuubi image, I can establish a 
connection, but as soon as I set `spark.master` to point to my Spark cluster's 
master node, I start running into all kinds of problems.
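   
   Concretely, my `conf/kyuubi-defaults.conf` looks roughly like this (the 
hostname and port are placeholders for my setup):
   
   ```
   # conf/kyuubi-defaults.conf (host/port are placeholders)
   spark.master          spark://spark-master:7077
   kyuubi.engine.type    SPARK_SQL
   ```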
   
   Right now I'm stuck at `...org.apache.kyuubi.KyuubiSQLException: 
org.apache.kyuubi.KyuubiSQLException: Exception in thread "main" 
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf`.
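   
   From what I can tell, that error usually means the Spark distribution that 
`SPARK_HOME` points at wasn't built with Hive support. I tried checking it 
with something like this (just a diagnostic, assuming a standard Spark 
directory layout where the Hive jars live under `$SPARK_HOME/jars`):
   
   ```shell
   # List any Hive jars shipped with the Spark client the Kyuubi
   # server uses; a Spark "-bin-hadoop3" distribution should have them.
   ls "$SPARK_HOME"/jars | grep -i hive || echo "no Hive jars found"
   ```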
   
   Any help would be really appreciated!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: notifications-unsubscr...@kyuubi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

