georgepap9808 opened a new issue, #10262:
URL: https://github.com/apache/hudi/issues/10262

   **Problem**
   I get an error when I try to build Hudi as described in the docker demo.
   Both
   ```mvn clean package -Pintegration-tests -DskipTests``` and
   ```mvn clean install package -Pintegration-tests -DskipTests```
   fail to build at hudi-spark_2.12.
   I should also mention that without -Pintegration-tests in the commands above, the build is successful.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Clone hudi repo at default master branch
   2. Run ```mvn clean package -Pintegration-tests -DskipTests```
   
   
   **Expected behavior**
   
   The build completes all 66 modules.
   
   **Environment Description**
   
   * Hudi version : default 
   
   * Spark version :
   
   * Hive version :
   
   * Hadoop version :
   
   * Storage (HDFS/S3/GCS..) :
   
   * Running on Docker? (yes/no) : yes
   
   
   **Additional context**
   
   As mentioned above, the build is successful without -Pintegration-tests, but then I hit a problem at "Step 2: Incrementally ingest data from Kafka topic", where I get the error "java.lang.ClassNotFoundException: org.apache.hudi.utilities.streamer.HoodieStreamer" after running the spark-submit command with the arguments from the docker demo documentation, so I assume these two issues are connected.
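   A ClassNotFoundException at spark-submit time usually means the class is absent from the bundle jar on the classpath. As a quick check (a sketch — the bundle path below assumes the default build output location, and the exact jar name may differ per build), one could inspect whatever utilities bundle got built:

   ```shell
   # Assumption: default Hudi build layout; adjust the glob to your checkout.
   BUNDLE=$(ls packaging/hudi-utilities-bundle/target/hudi-utilities-bundle_2.12-*.jar 2>/dev/null | head -n1)
   if [ -n "$BUNDLE" ]; then
     # List the jar's entries and look for the streamer class.
     if jar tf "$BUNDLE" | grep -q 'org/apache/hudi/utilities/streamer/HoodieStreamer'; then
       echo "HoodieStreamer present in bundle"
     else
       echo "HoodieStreamer missing from bundle"
     fi
   else
     echo "bundle jar not built"
   fi
   ```

   Note that in the failed build below, hudi-utilities-bundle_2.12 is SKIPPED, so no bundle containing the class would have been produced.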
   
   **Stacktrace**
   
   ```
   [INFO] ------------------< org.apache.hudi:hudi-spark_2.12 >-------------------
   [INFO] Building hudi-spark_2.12 1.0.0-SNAPSHOT                          [15/66]
   [INFO] --------------------------------[ jar ]---------------------------------
   [WARNING] The POM for org.apache.hudi:hudi-spark3.2.x_2.12:jar:1.0.0-SNAPSHOT is missing, no dependency information available
   [INFO] ------------------------------------------------------------------------
   [INFO] Reactor Summary for Hudi 1.0.0-SNAPSHOT:
   [INFO] 
   [INFO] Hudi ............................................... SUCCESS [  3.791 s]
   [INFO] hudi-tests-common .................................. SUCCESS [  4.193 s]
   [INFO] hudi-common ........................................ SUCCESS [ 30.975 s]
   [INFO] hudi-hadoop-mr ..................................... SUCCESS [  8.366 s]
   [INFO] hudi-sync-common ................................... SUCCESS [  2.949 s]
   [INFO] hudi-hive-sync ..................................... SUCCESS [  8.736 s]
   [INFO] hudi-aws ........................................... SUCCESS [  6.843 s]
   [INFO] hudi-timeline-service .............................. SUCCESS [  3.427 s]
   [INFO] hudi-client ........................................ SUCCESS [  0.171 s]
   [INFO] hudi-client-common ................................. SUCCESS [ 12.853 s]
   [INFO] hudi-spark-client .................................. SUCCESS [ 57.661 s]
   [INFO] hudi-spark-datasource .............................. SUCCESS [  0.145 s]
   [INFO] hudi-spark-common_2.12 ............................. SUCCESS [01:27 min]
   [INFO] hudi-java-client ................................... SUCCESS [  7.141 s]
   [INFO] hudi-spark_2.12 .................................... FAILURE [  0.189 s]
   [INFO] hudi-gcp ........................................... SKIPPED
   [INFO] hudi-utilities_2.12 ................................ SKIPPED
   [INFO] hudi-utilities-bundle_2.12 ......................... SKIPPED
   [INFO] hudi-cli ........................................... SKIPPED
   [INFO] hudi-flink-client .................................. SKIPPED
   [INFO] hudi-datahub-sync .................................. SKIPPED
   [INFO] hudi-adb-sync ...................................... SKIPPED
   [INFO] hudi-sync .......................................... SKIPPED
   [INFO] hudi-hadoop-mr-bundle .............................. SKIPPED
   [INFO] hudi-datahub-sync-bundle ........................... SKIPPED
   [INFO] hudi-hive-sync-bundle .............................. SKIPPED
   [INFO] hudi-aws-bundle .................................... SKIPPED
   [INFO] hudi-gcp-bundle .................................... SKIPPED
   [INFO] hudi-spark-bundle_2.12 ............................. SKIPPED
   [INFO] hudi-presto-bundle ................................. SKIPPED
   [INFO] hudi-utilities-slim-bundle_2.12 .................... SKIPPED
   [INFO] hudi-timeline-server-bundle ........................ SKIPPED
   [INFO] hudi-trino-bundle .................................. SKIPPED
   [INFO] hudi-examples ...................................... SKIPPED
   [INFO] hudi-examples-common ............................... SKIPPED
   [INFO] hudi-examples-spark ................................ SKIPPED
   [INFO] hudi-flink-datasource .............................. SKIPPED
   [INFO] hudi-flink1.18.x ................................... SKIPPED
   [INFO] hudi-flink ......................................... SKIPPED
   [INFO] hudi-examples-flink ................................ SKIPPED
   [INFO] hudi-examples-java ................................. SKIPPED
   [INFO] hudi-flink1.14.x ................................... SKIPPED
   [INFO] hudi-flink1.15.x ................................... SKIPPED
   [INFO] hudi-flink1.16.x ................................... SKIPPED
   [INFO] hudi-flink1.17.x ................................... SKIPPED
   [INFO] hudi-kafka-connect ................................. SKIPPED
   [INFO] hudi-flink1.18-bundle .............................. SKIPPED
   [INFO] hudi-kafka-connect-bundle .......................... SKIPPED
   [INFO] hudi-cli-bundle_2.12 ............................... SKIPPED
   [INFO] hudi-hadoop-docker ................................. SKIPPED
   [INFO] hudi-hadoop-base-docker ............................ SKIPPED
   [INFO] hudi-hadoop-base-java11-docker ..................... SKIPPED
   [INFO] hudi-hadoop-namenode-docker ........................ SKIPPED
   [INFO] hudi-hadoop-datanode-docker ........................ SKIPPED
   [INFO] hudi-hadoop-history-docker ......................... SKIPPED
   [INFO] hudi-hadoop-hive-docker ............................ SKIPPED
   [INFO] hudi-hadoop-sparkbase-docker ....................... SKIPPED
   [INFO] hudi-hadoop-sparkmaster-docker ..................... SKIPPED
   [INFO] hudi-hadoop-sparkworker-docker ..................... SKIPPED
   [INFO] hudi-hadoop-sparkadhoc-docker ...................... SKIPPED
   [INFO] hudi-hadoop-presto-docker .......................... SKIPPED
   [INFO] hudi-hadoop-trinobase-docker ....................... SKIPPED
   [INFO] hudi-hadoop-trinocoordinator-docker ................ SKIPPED
   [INFO] hudi-hadoop-trinoworker-docker ..................... SKIPPED
   [INFO] hudi-integ-test .................................... SKIPPED
   [INFO] hudi-integ-test-bundle ............................. SKIPPED
   [INFO] ------------------------------------------------------------------------
   [INFO] BUILD FAILURE
   [INFO] ------------------------------------------------------------------------
   [INFO] Total time:  03:56 min
   [INFO] Finished at: 2023-12-06T21:25:57+02:00
   [INFO] ------------------------------------------------------------------------
   [ERROR] Failed to execute goal on project hudi-spark_2.12: Could not resolve dependencies for project org.apache.hudi:hudi-spark_2.12:jar:1.0.0-SNAPSHOT: Failure to find org.apache.hudi:hudi-spark3.2.x_2.12:jar:1.0.0-SNAPSHOT in https://packages.confluent.io/maven/ was cached in the local repository, resolution will not be reattempted until the update interval of confluent has elapsed or updates are forced -> [Help 1]
   [ERROR] 
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR] 
   [ERROR] For more information about the errors and possible solutions, please read the following articles:
   [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
   [ERROR] 
   [ERROR] After correcting the problems, you can resume the build with the command
   [ERROR]   mvn <args> -rf :hudi-spark_2.12
   ```
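   For reference, the "was cached in the local repository, resolution will not be reattempted" part of the message is standard Maven behavior: a failed lookup leaves `*.lastUpdated` marker files in the local repository, and Maven refuses to retry until the repository's update interval passes or updates are forced. A minimal sketch of how one might inspect and clear that cache (assuming the default `~/.m2/repository` layout — this addresses the caching symptom only, not the underlying missing hudi-spark3.2.x_2.12 module):

   ```shell
   # Assumption: default local-repository path; override with M2_REPO if needed.
   REPO="${M2_REPO:-$HOME/.m2/repository}"

   # Failed lookups leave *.lastUpdated marker files; list any for Hudi artifacts.
   find "$REPO/org/apache/hudi" -name '*.lastUpdated' 2>/dev/null || true

   # Either delete the cached Hudi entries, or force re-resolution with -U:
   # rm -rf "$REPO/org/apache/hudi"
   # mvn -U clean package -Pintegration-tests -DskipTests
   ```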
   
   

