[ https://issues.apache.org/jira/browse/HUDI-3262?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17480540#comment-17480540 ]

sivabalan narayanan commented on HUDI-3262:
-------------------------------------------

Steps to reproduce on latest master:
 # build the bundle: mvn package -DskipTests
 # set up the docker demo
 # copy the required files to adhoc-2
 ## docker cp packaging/hudi-integ-test-bundle/target/hudi-integ-test-bundle-0.11.0-SNAPSHOT.jar adhoc-2:/opt/
 ## docker cp demo/config/test-suite/complex-dag-cow.yaml adhoc-2:/opt/
 ## docker cp demo/config/test-suite/test.properties adhoc-2:/opt/
 # execute the integ test yaml from within adhoc-2
 ## docker exec -it adhoc-2 /bin/bash
 ## run the test suite job:
{code:java}
spark-submit \
  --packages org.apache.spark:spark-avro_2.11:2.4.0 \
  --conf spark.task.cpus=1 \
  --conf spark.executor.cores=1 \
  --conf spark.task.maxFailures=100 \
  --conf spark.memory.fraction=0.4 \
  --conf spark.rdd.compress=true \
  --conf spark.kryoserializer.buffer.max=2000m \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.memory.storageFraction=0.1 \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.sql.hive.convertMetastoreParquet=false \
  --conf spark.driver.maxResultSize=12g \
  --conf spark.executor.heartbeatInterval=120s \
  --conf spark.network.timeout=600s \
  --conf spark.yarn.max.executor.failures=10 \
  --conf spark.sql.catalogImplementation=hive \
  --conf spark.driver.extraClassPath=/var/demo/jars/* \
  --conf spark.executor.extraClassPath=/var/demo/jars/* \
  --class org.apache.hudi.integ.testsuite.HoodieTestSuiteJob \
  /opt/hudi-integ-test-bundle-0.11.0-SNAPSHOT.jar \
  --source-ordering-field test_suite_source_ordering_field \
  --use-deltastreamer \
  --target-base-path /user/hive/warehouse/hudi-integ-test-suite/output \
  --input-base-path /user/hive/warehouse/hudi-integ-test-suite/input \
  --target-table table1 \
  --props test.properties \
  --schemaprovider-class org.apache.hudi.integ.testsuite.schema.TestSuiteFileBasedSchemaProvider \
  --source-class org.apache.hudi.utilities.sources.AvroDFSSource \
  --input-file-size 125829120 \
  --workload-yaml-path file:/opt/complex-dag-cow.yaml \
  --workload-generator-classname org.apache.hudi.integ.testsuite.dag.WorkflowDagGenerator \
  --table-type COPY_ON_WRITE \
  --compact-scheduling-minshare 1 \
  --clean-input \
  --clean-output
{code}

stacktrace:
{code:java}
22/01/23 02:41:41 WARN DagNode: Validation using data from input path /user/hive/warehouse/hudi-integ-test-suite/input/*/*
22/01/23 02:41:46 INFO ValidateDatasetNode: Validate data in target hudi path /user/hive/warehouse/hudi-integ-test-suite/output/*/*/*
22/01/23 02:41:46 ERROR DagScheduler: Exception executing node
java.lang.ClassNotFoundException: Failed to find data source: hudi. Please find packages at http://spark.apache.org/third-party-projects.html
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:657)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:194)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
        at org.apache.hudi.integ.testsuite.dag.nodes.ValidateDatasetNode.getDatasetToValidate(ValidateDatasetNode.java:52)
        at org.apache.hudi.integ.testsuite.dag.nodes.BaseValidateDatasetNode.execute(BaseValidateDatasetNode.java:99)
        at org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler.executeNode(DagScheduler.java:139)
        at org.apache.hudi.integ.testsuite.dag.scheduler.DagScheduler.lambda$execute$0(DagScheduler.java:105)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: hudi.DefaultSource
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$20$$anonfun$apply$12.apply(DataSource.scala:634)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$20$$anonfun$apply$12.apply(DataSource.scala:634)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$20.apply(DataSource.scala:634)
        at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$20.apply(DataSource.scala:634)
        at scala.util.Try.orElse(Try.scala:84)
        at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:634)
        ... 11 more
{code}
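For context, the trace shows the failure originates in ValidateDatasetNode.getDatasetToValidate, which reads the target table back through the Spark datasource API. As of Spark 2.4, lookupDataSource resolves the short name "hudi" via the DataSourceRegister service loader and, failing that, falls back to loading a class literally named hudi.DefaultSource, which is the class in the caused-by. A rough sketch of the equivalent read (the class name and the count() call are just for illustration; the path is the one from the log above):
{code:java}
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class Hudi3262Repro {
  public static void main(String[] args) {
    // Same style of read that ValidateDatasetNode performs against the target path;
    // it fails in DataSource.lookupDataSource when the "hudi" format cannot be
    // resolved on the integ-test bundle's classpath.
    SparkSession spark = SparkSession.builder().appName("hudi-3262-repro").getOrCreate();
    Dataset<Row> target = spark.read()
        .format("hudi") // resolved via DataSourceRegister, else falls back to class "hudi.DefaultSource"
        .load("/user/hive/warehouse/hudi-integ-test-suite/output/*/*/*");
    System.out.println("row count: " + target.count());
  }
}
{code}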
This is not an issue with the 0.10.1 RC2 branch; I suspect we started hitting it after [https://github.com/apache/hudi/pull/4514] landed.
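
If that bundle change is indeed the cause, one way to narrow it down could be to check whether the Spark DataSourceRegister service entry (which Spark uses to resolve the "hudi" short name) still ships inside the integ-test bundle. A small sketch, assuming the jar path copied above; the JarCheck class name is just for illustration:
{code:java}
import java.io.IOException;
import java.util.jar.JarFile;

public class JarCheck {
  public static void main(String[] args) throws IOException {
    // Look for the service registration Spark consults when resolving the "hudi" short name.
    String entry = "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister";
    try (JarFile jar = new JarFile("/opt/hudi-integ-test-bundle-0.11.0-SNAPSHOT.jar")) {
      System.out.println(jar.getEntry(entry) != null
          ? "DataSourceRegister service file is present in the bundle"
          : "DataSourceRegister service file is missing from the bundle");
    }
  }
}
{code}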

> Integration test suite failure
> ------------------------------
>
>                 Key: HUDI-3262
>                 URL: https://issues.apache.org/jira/browse/HUDI-3262
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: tests-ci
>            Reporter: Raymond Xu
>            Assignee: sivabalan narayanan
>            Priority: Critical
>              Labels: sev:normal
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> detailed in https://github.com/apache/hudi/issues/4621



