alberttwong opened a new issue, #11850:
URL: https://github.com/apache/hudi/issues/11850
**Describe the problem you faced**
The library dependencies are too old. Spark 3.5 and other current engines rely on Hadoop 3.3.4 or 3.4.x, and on hadoop-aws 3.3.4 or newer.
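For example, when launching the Spark SQL CLI against S3, the Hadoop, hadoop-aws, and AWS SDK artifacts all need to come from one release line. A minimal sketch follows (the Hudi bundle, hadoop-aws, and SDK versions are illustrative assumptions, not taken from this report; substitute the versions that match your distribution):
```
# Sketch: launch spark-sql with Hadoop artifacts from a single release line.
# Versions are assumptions; align them with your Spark/Hadoop build.
spark-sql \
  --packages org.apache.hudi:hudi-spark3.5-bundle_2.12:0.15.0,org.apache.hadoop:hadoop-aws:3.3.4,com.amazonaws:aws-java-sdk-bundle:1.12.262 \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer
```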
**To Reproduce**
Steps to reproduce the behavior:
You get errors like
```
Exception in thread "main" org.apache.hudi.hive.HoodieHiveSyncException: Got runtime exception when hive syncing
    at org.apache.hudi.hive.HiveSyncTool.initSyncClient(HiveSyncTool.java:130)
    at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:119)
    at org.apache.hudi.hive.HiveSyncTool.main(HiveSyncTool.java:480)
Caused by: org.apache.hudi.exception.HoodieException: Unable to create org.apache.hudi.storage.hadoop.HoodieHadoopStorage
    at org.apache.hudi.storage.HoodieStorageUtils.getStorage(HoodieStorageUtils.java:44)
    at org.apache.hudi.common.table.HoodieTableMetaClient.getStorage(HoodieTableMetaClient.java:309)
    at org.apache.hudi.common.table.HoodieTableMetaClient.access$000(HoodieTableMetaClient.java:81)
    at org.apache.hudi.common.table.HoodieTableMetaClient$Builder.build(HoodieTableMetaClient.java:770)
    at org.apache.hudi.sync.common.HoodieSyncClient.<init>(HoodieSyncClient.java:68)
    at org.apache.hudi.hive.HoodieHiveSyncClient.<init>(HoodieHiveSyncClient.java:84)
    at org.apache.hudi.hive.HiveSyncTool.initSyncClient(HiveSyncTool.java:125)
    ... 2 more
Caused by: org.apache.hudi.exception.HoodieException: Unable to instantiate class org.apache.hudi.storage.hadoop.HoodieHadoopStorage
    at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:75)
    at org.apache.hudi.storage.HoodieStorageUtils.getStorage(HoodieStorageUtils.java:41)
    ... 8 more
Caused by: java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:73)
    ... 9 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/statistics/IOStatisticsSource
    at java.base/java.lang.ClassLoader.defineClass1(Native Method)
    at java.base/java.lang.ClassLoader.defineClass(Unknown Source)
    at java.base/java.security.SecureClassLoader.defineClass(Unknown Source)
    at java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(Unknown Source)
    at java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(Unknown Source)
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(Unknown Source)
```
or
```
Exception in thread "main" java.io.IOException: From option fs.s3a.aws.credentials.provider java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider not found
    at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:376)
    at org.apache.hadoop.fs.s3a.DefaultS3ClientFactory.createS3Client(DefaultS3ClientFactory.java:51)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:229)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3469)
    at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3574)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3521)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:540)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:288)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:524)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
    at org.apache.spark.sql.internal.SharedState$.qualifyWarehousePath(SharedState.scala:288)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:149)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.base/java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2592)
    at org.apache.hadoop.conf.Configuration.getClasses(Configuration.java:2663)
    at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:373)
```
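Both stack traces point at the same kind of mismatch: `org.apache.hadoop.fs.statistics.IOStatisticsSource` and `org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider` were added in the Hadoop 3.3 line, so pairing a 3.3.x `hadoop-aws` with an older `hadoop-common` (or the reverse) fails at class-load time. A quick sanity check, as a sketch (`$SPARK_HOME/jars` is an assumption for where a typical Spark distribution keeps its jars):
```
# Sketch: confirm hadoop-common and hadoop-aws come from the same release line
# and actually contain the classes the stack traces report as missing.
ls "$SPARK_HOME/jars" | grep -E 'hadoop-(common|aws|client)'
unzip -l "$SPARK_HOME"/jars/hadoop-common-*.jar | grep IOStatisticsSource
unzip -l "$SPARK_HOME"/jars/hadoop-aws-*.jar | grep IAMInstanceCredentialsProvider
```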
**Expected behavior**
It should be possible to use a single, consistent set of Hadoop libraries across Hadoop, Spark, Hudi, and the Hudi utilities.
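One way to get there (a sketch, assuming the `hadoop.version` Maven property and the Spark 3.5 build flags in Hudi's root POM; verify both against the Hudi version you actually build) is to rebuild the Hudi bundles against the same Hadoop release the cluster ships:
```
# Sketch: rebuild Hudi for Spark 3.5 against Hadoop 3.3.4 (flags and property assumed).
mvn clean package -DskipTests -Dspark3.5 -Dscala-2.12 -Dhadoop.version=3.3.4
```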
**Environment Description**
* Hudi version :
* Spark version :
* Hive version :
* Hadoop version :
* Storage (HDFS/S3/GCS..) :
* Running on Docker? (yes/no) :
**Additional context**
Add any other context about the problem here.
**Stacktrace**
```Add the stacktrace of the error.```