LuciferYang commented on code in PR #40389: URL: https://github.com/apache/spark/pull/40389#discussion_r1135297547
##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/IntegrationTestUtils.scala:
##########

```diff
@@ -49,6 +51,12 @@ object IntegrationTestUtils {
   // scalastyle:on println
   private[connect] def debug(error: Throwable): Unit = if (isDebug) error.printStackTrace()

+  private[sql] lazy val sparkHiveJarAvailable: Boolean = {
+    val filePath = s"$sparkHome/assembly/target/$scalaDir/jars/" +
+      s"spark-hive_$scalaVersion-${org.apache.spark.SPARK_VERSION}.jar"
```

Review Comment:

@HyukjinKwon It is not OK to probe for Hive-related classes on the client side (for example with `Utils.classIsLoadable("org.apache.hadoop.hive.conf.HiveConf")`); the result will always be false. `SimpleSparkConnectService` can find `spark-hive_**.jar` on its classpath because of the following code:

https://github.com/apache/spark/blob/93334e295483a0ba66e22d8398512ad970a3ea80/bin/spark-class#L39-L51

But the connect client itself does not go through `spark-class`, so the client JVM never sees the Hive-related classes.

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at: us...@infra.apache.org
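The check in the diff above can be sketched as a standalone helper. This is a hypothetical illustration, not the PR's actual code: it tests whether the `spark-hive` jar exists on disk instead of attempting to load Hive classes in the client JVM (which, per the comment, would always fail). The parameter names mirror the values referenced in `IntegrationTestUtils` but are assumptions here.

```scala
import java.nio.file.{Files, Paths}

// Hypothetical sketch: decide Hive availability from the jar's presence on
// disk rather than from class loading in the client JVM.
def sparkHiveJarAvailable(
    sparkHome: String,
    scalaDir: String,
    scalaVersion: String,
    sparkVersion: String): Boolean = {
  // e.g. $sparkHome/assembly/target/$scalaDir/jars/spark-hive_2.13-3.4.0.jar
  val jar = Paths.get(
    sparkHome, "assembly", "target", scalaDir, "jars",
    s"spark-hive_$scalaVersion-$sparkVersion.jar")
  Files.exists(jar)
}
```

A file-existence check like this works on both sides because it depends only on the build layout under `$SPARK_HOME`, not on what `spark-class` happened to put on the classpath.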