Riddle4045 opened a new issue, #9505:
URL: https://github.com/apache/hudi/issues/9505

   Forked the Hudi repo and built branch **release-0.13.0** (tag **release-0.13.0**), with **SPARK_HOME** set to **/home/c/git/hudi/spark/spark-3.3.1-bin-hadoop2** (the Spark installation directory).

   After a successful build, I cd into the hudi-cli/ project and run **hudi-cli.sh**; I can connect to the table just fine.
   
   
![Untitled](https://github.com/apache/hudi/assets/3648351/8d700061-f9d9-4314-970d-0620b7f0f15a)
   
   
   
   After running **compaction schedule**, I see the following error:
   
   ```
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 ERROR SparkMain: Fail to execute commandString
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : java.lang.NoSuchMethodError: org.apache.hadoop.security.ProviderUtils.excludeIncompatibleCredentialProviders(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/Class;)Lorg/apache/hadoop/conf/Configuration;
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.azurebfs.AbfsConfiguration.<init>(AbfsConfiguration.java:189)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.<init>(AzureBlobFileSystemStore.java:139)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.initialize(AzureBlobFileSystem.java:105)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.common.fs.FSUtils.getFs(FSUtils.java:110)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.common.table.HoodieTableMetaClient.getFs(HoodieTableMetaClient.java:305)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:136)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.common.table.HoodieTableMetaClient.newMetaClient(HoodieTableMetaClient.java:689)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.common.table.HoodieTableMetaClient.access$000(HoodieTableMetaClient.java:81)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.common.table.HoodieTableMetaClient$Builder.build(HoodieTableMetaClient.java:770)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.utilities.UtilHelpers.createMetaClient(UtilHelpers.java:539)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.utilities.HoodieCompactor.<init>(HoodieCompactor.java:71)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.cli.commands.SparkMain.compact(SparkMain.java:420)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.hudi.cli.commands.SparkMain.main(SparkMain.java:183)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at java.lang.reflect.Method.invoke(Method.java:498)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
   2023-08-22 16:41:03.175  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            :   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   2023-08-22 16:41:03.185  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 INFO SparkUI: Stopped Spark web UI at http://172.28.169.142:4040
   2023-08-22 16:41:03.194  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
   2023-08-22 16:41:03.202  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 INFO MemoryStore: MemoryStore cleared
   2023-08-22 16:41:03.202  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 INFO BlockManager: BlockManager stopped
   2023-08-22 16:41:03.207  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 INFO BlockManagerMaster: BlockManagerMaster stopped
   2023-08-22 16:41:03.209  INFO 412571 --- [      Thread-17] o.a.h.c.u.InputStreamConsumer            : 23/08/22 16:41:03 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
   ```
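
   This `NoSuchMethodError` typically indicates a Hadoop version mismatch: `ProviderUtils.excludeIncompatibleCredentialProviders(Configuration, Class)` only exists in Hadoop 2.8+ and 3.x, while the `spark-3.3.1-bin-hadoop2` download bundles Hadoop 2.7.x, so ABFS code compiled against a newer Hadoop cannot find it. As a quick sanity check (the default path below is the one from this report and is an assumption for other setups), the Hadoop jars bundled with the Spark installation can be listed like this:

   ```shell
   # List the Hadoop jars shipped with the Spark installation; the version
   # suffix on hadoop-common shows which Hadoop the CLI's Spark job runs against.
   # The SPARK_HOME fallback is the path from this report (an assumption).
   SPARK_HOME="${SPARK_HOME:-/home/c/git/hudi/spark/spark-3.3.1-bin-hadoop2}"
   ls "$SPARK_HOME/jars" 2>/dev/null | grep '^hadoop-' \
     || echo "no hadoop jars found under $SPARK_HOME/jars"
   ```

   If this shows `hadoop-common-2.7.x`, the usual fix would be to switch to a Spark distribution built for Hadoop 3 (e.g. `spark-3.3.1-bin-hadoop3`) or otherwise put Hadoop 3 jars matching the ABFS connector on the classpath.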


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
