chrisfw commented on issue #3163:
URL: https://github.com/apache/kyuubi/issues/3163#issuecomment-1561252663

   @pan3793 , thanks for your quick response.  The error occurs when using the combination of Iceberg + Spark 3.3.2 and is not a Kyuubi error.  
   
   I added these iceberg+s3 dependencies to the driver and executor 
extraClassPath in spark-defaults.conf:
   
   ```
   annotations-2.20.7.jar
   aws-java-sdk-bundle-1.11.563.jar
   bundle-2.20.7.jar
   eventstream-1.0.1.jar
   hadoop-aws-3.3.0.jar
   hamcrest-core-1.3.jar
   http-client-spi-2.20.7.jar
   iceberg-spark-runtime-3.3_2.13-1.2.1.jar
   junit-4.11.jar
   metrics-spi-2.20.7.jar
   reactive-streams-1.0.3.jar
   slf4j-api-1.7.30.jar
   url-connection-client-2.20.7.jar
   utils-2.20.7.jar
   wildfly-openssl-1.0.7.Final.jar
   ```
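   For context, the jars above were wired in roughly like this (the `/opt/spark/iceberg-jars` directory is just an assumed location for illustration; the JVM expands the trailing `/*` to all jars in that directory):
   
   ```
   spark.driver.extraClassPath=/opt/spark/iceberg-jars/*
   spark.executor.extraClassPath=/opt/spark/iceberg-jars/*
   ```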
   I also added the iceberg catalog properties:
   
   ```
   spark.sql.catalog.my_catalog=org.apache.iceberg.spark.SparkCatalog
   spark.sql.catalog.my_catalog.warehouse=s3a://warehouse
   spark.sql.catalog.my_catalog.io-impl=org.apache.iceberg.aws.s3.S3FileIO
   spark.sql.catalog.my_catalog.s3.endpoint=http://10.18.102.29:9000
   spark.sql.catalog.my_catalog.catalog-impl=org.apache.iceberg.jdbc.JdbcCatalog
   spark.sql.catalog.my_catalog.uri=jdbc:postgresql://10.18.102.29:5432/iceberg
   spark.sql.catalog.my_catalog.jdbc.verifyServerCertificate=false
   spark.sql.catalog.my_catalog.jdbc.user=spark
   spark.sql.catalog.my_catalog.jdbc.useSSL=false
   spark.sql.catalog.my_catalog.jdbc.password=spark
   
spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
   ```
   The addition of the Iceberg SQL extension appears to trigger a class-not-found error:
   
   ```
   [spark@demo-sds-server1 ~]$ spark-sql
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
   23/05/24 14:05:06 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
   23/05/24 14:05:09 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout 
does not exist
   23/05/24 14:05:09 WARN HiveConf: HiveConf of name hive.stats.retries.wait 
does not exist
   Spark master: local[*], Application Id: local-1684937107783
   spark-sql> use my_catalog.iceberg;
   23/05/24 14:05:37 ERROR SparkSQLDriver: Failed in [use my_catalog.iceberg]
   java.lang.NoClassDefFoundError: scala/$less$colon$less
   ```
   Googling the error, I found a 
[post](https://community.snowflake.com/s/question/0D53r0000BwiXlhCQE/getting-error-when-trying-to-use-snowflake-spark-connector-py4jprotocolpy4jjavaerror-an-error-occurred-while-calling-o43csv-javalangnoclassdeffounderror-scalalesscolonless)
 indicating a Scala version mismatch.  
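   As an aside (this little helper is my own illustration, not anything from Spark or Iceberg), cross-built Scala artifacts encode their Scala binary version in the jar name, which is how the mismatch shows up on the classpath above:
   
   ```python
   import re
   
   def scala_target(jar_name):
       """Extract the Scala binary version (e.g. '2.12' or '2.13') that a
       cross-built jar such as 'iceberg-spark-runtime-3.3_2.13-1.2.1.jar'
       was compiled against; returns None if the name carries no suffix."""
       m = re.search(r"_(2\.1[0-9])-", jar_name)
       return m.group(1) if m else None
   
   # The runtime jar on the classpath above targets Scala 2.13...
   print(scala_target("iceberg-spark-runtime-3.3_2.13-1.2.1.jar"))  # 2.13
   # ...while a stock Spark 3.3.2 distribution is built for Scala 2.12.
   # scala/$less$colon$less is the JVM-encoded name of 2.13's top-level
   # `<:<` class, which does not exist in the 2.12 standard library,
   # hence the NoClassDefFoundError.
   print(scala_target("hadoop-aws-3.3.0.jar"))  # None (not cross-built)
   ```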
   
   When I updated my setup to Spark 3.3.2 built with Scala 2.13, the class-not-found error disappeared and the Iceberg catalog, table, and Spark procedure functionality all worked properly.
   ```
   spark-sql> CALL my_catalog.system.create_changelog_view(table => 
'iceberg.mysample');
   SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
   SLF4J: Defaulting to no-operation (NOP) logger implementation
   SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.
   `mysample_changes`
   Time taken: 4.803 seconds, Fetched 1 row(s)
   ```
   If there is a workaround/solution that enables me to utilize the latest 
Iceberg runtime in Spark 3.3.2 with Scala 2.12, that would be great.  Please 
let me know if you require any additional information.
   
   Thanks,
   Chris Whelan
   