xuzifu666 commented on PR #9001:
URL: https://github.com/apache/hudi/pull/9001#issuecomment-1656495304

   > @xuzifu666 I am seeing `NoClassDefFoundError: org/apache/hudi/com/fasterxml/jackson/datatype/jsr310/JavaTimeModule` while running the command shown below. Any suggestions on how to fix this?
   > 
   > ```
   > spark-3.3.2-bin-hadoop3 % ./bin/spark-submit \
   > --packages org.apache.spark:spark-avro_2.12:3.3.2 \
   > --conf spark.task.cpus=1 \
   > --conf spark.executor.cores=1 \
   > --conf spark.task.maxFailures=100 \
   > --conf spark.memory.fraction=0.4 \
   > --conf spark.rdd.compress=true \
   > --conf spark.kryoserializer.buffer.max=2000m \
   > --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
   > --conf spark.kryo.registrator=org.apache.spark.HoodieSparkKryoRegistrar \
   > --conf spark.memory.storageFraction=0.1 \
   > --conf spark.shuffle.service.enabled=true \
   > --conf spark.sql.hive.convertMetastoreParquet=false \
   > --conf spark.driver.maxResultSize=12g \
   > --conf spark.executor.heartbeatInterval=120s \
   > --conf spark.network.timeout=600s \
   > --conf spark.yarn.max.executor.failures=10 \
   > --conf spark.sql.catalogImplementation=hive \
   > --class org.apache.hudi.integ.testsuite.HoodieTestSuiteJob /Users/amrish/code/amrish-hudi-3/packaging/hudi-integ-test-bundle/target/hudi-integ-test-bundle-0.14.0-SNAPSHOT.jar \
   > --source-ordering-field test_suite_source_ordering_field \
   > --use-deltastreamer \
   > --target-base-path /tmp/hudi/output \
   > --input-base-path /tmp/hudi/input \
   > --target-table table1 \
   > --props file:///Users/amrish/junk/test-aggressive-clean-archival.properties \
   > --workload-yaml-path file:///Users/amrish/code/amrish-hudi-3/docker/demo/config/test-suite/simple-deltastreamer.yaml \
   > --schemaprovider-class org.apache.hudi.integ.testsuite.schema.TestSuiteFileBasedSchemaProvider \
   > --source-class org.apache.hudi.utilities.sources.AvroDFSSource \
   > --input-file-size 125829120 \
   > --workload-generator-classname org.apache.hudi.integ.testsuite.dag.WorkflowDagGenerator \
   > --table-type COPY_ON_WRITE \
   > --compact-scheduling-minshare 1 \
   > --clean-input \
   > --clean-output
   > ```
   > 
   > ```
   > Exception in thread "main" org.apache.hudi.exception.HoodieException: Failed to run Test Suite
   >    at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.runTestSuite(HoodieTestSuiteJob.java:221)
   >    at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.main(HoodieTestSuiteJob.java:182)
   >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   >    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   >    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   >    at java.lang.reflect.Method.invoke(Method.java:498)
   >    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
   >    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
   >    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
   >    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
   >    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
   >    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
   >    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
   >    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   > Caused by: java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: org/apache/hudi/com/fasterxml/jackson/datatype/jsr310/JavaTimeModule
   >    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
   >    at java.util.concurrent.FutureTask.get(FutureTask.java:206)
   >    at org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler.execute(DagScheduler.java:112)
   >    at org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler.schedule(DagScheduler.java:67)
   >    at org.apache.hudi.integ.testsuite.HoodieTestSuiteJob.runTestSuite(HoodieTestSuiteJob.java:216)
   >    ... 13 more
   > Caused by: java.lang.NoClassDefFoundError: org/apache/hudi/com/fasterxml/jackson/datatype/jsr310/JavaTimeModule
   >    at org.apache.hudi.common.util.JsonUtils.registerModules(JsonUtils.java:63)
   >    at org.apache.spark.sql.adapter.BaseSpark3Adapter.<init>(BaseSpark3Adapter.scala:51)
   >    at org.apache.spark.sql.adapter.Spark3_3Adapter.<init>(Spark3_3Adapter.scala:46)
   >    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   >    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   >    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   >    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   >    at java.lang.Class.newInstance(Class.java:442)
   >    at org.apache.hudi.SparkAdapterSupport$.sparkAdapter$lzycompute(SparkAdapterSupport.scala:49)
   >    at org.apache.hudi.SparkAdapterSupport$.sparkAdapter(SparkAdapterSupport.scala:35)
   >    at org.apache.hudi.SparkAdapterSupport.sparkAdapter(SparkAdapterSupport.scala:29)
   >    at org.apache.hudi.SparkAdapterSupport.sparkAdapter$(SparkAdapterSupport.scala:29)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$.sparkAdapter$lzycompute(HoodieAnalysis.scala:39)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$.sparkAdapter(HoodieAnalysis.scala:39)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$MatchMergeIntoTable$.unapply(HoodieAnalysis.scala:334)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$AdaptIngestionTargetLogicalRelations$$anonfun$$nestedInanonfun$apply$1$1.applyOrElse(HoodieAnalysis.scala:208)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$AdaptIngestionTargetLogicalRelations$$anonfun$$nestedInanonfun$apply$1$1.applyOrElse(HoodieAnalysis.scala:205)
   >    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:584)
   >    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
   >    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:584)
   >    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
   >    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
   >    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
   >    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
   >    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
   >    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:560)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$AdaptIngestionTargetLogicalRelations.$anonfun$apply$1(HoodieAnalysis.scala:205)
   >    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$AdaptIngestionTargetLogicalRelations.apply(HoodieAnalysis.scala:205)
   >    at org.apache.spark.sql.hudi.analysis.HoodieAnalysis$AdaptIngestionTargetLogicalRelations.apply(HoodieAnalysis.scala:201)
   > ```
   
   What is your Hudi version?
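   Also worth noting: `org/apache/hudi/com/fasterxml/jackson/datatype/jsr310/JavaTimeModule` is the shade-relocated copy of the Jackson JSR-310 class, so this error usually means `jackson-datatype-jsr310` did not get shaded into the bundle jar you built. A quick way to check (a sketch, using the jar path from your command; `jar` ships with the JDK) is to list the bundle contents:

   ```shell
   # List the bundle's entries and look for the relocated JSR-310 module.
   # If this prints nothing, the class was not shaded into the jar and the
   # bundle needs to be rebuilt from a branch that includes it.
   jar tf /Users/amrish/code/amrish-hudi-3/packaging/hudi-integ-test-bundle/target/hudi-integ-test-bundle-0.14.0-SNAPSHOT.jar \
     | grep 'jackson/datatype/jsr310/JavaTimeModule'
   ```

   If the class is missing, rebuilding the integ-test bundle from the branch that contains this fix should resolve it.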


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
