pratyakshsharma opened a new issue, #6422:
URL: https://github.com/apache/hudi/issues/6422

   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)? yes
   
   **Describe the problem you faced**
   
   When running the Hudi build on the master branch without any Maven build 
option, as described here - 
https://github.com/apache/hudi#build-with-different-flink-versions, the build 
does not pick up the default-activated `flink1.14` profile. Instead, it tries 
to resolve the 1.13 versions of the Flink artifacts.
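   For context, Maven's `activeByDefault` has a well-known gotcha: a profile 
marked active by default is silently deactivated as soon as *any* other 
profile in the same POM becomes active (via `-P`, a property, a file check, 
etc.). A hypothetical sketch of how such profiles might be wired up — the 
profile ids, versions, and activation triggers below are illustrative, not 
copied from the actual Hudi pom.xml:

   ```xml
   <!-- Hypothetical sketch, NOT the actual Hudi pom.xml. If flink1.14 relies
        on activeByDefault, it is dropped whenever any sibling profile in the
        same POM activates for any reason. -->
   <profiles>
     <profile>
       <id>flink1.14</id>
       <activation>
         <activeByDefault>true</activeByDefault>
       </activation>
       <properties>
         <flink.version>1.14.5</flink.version>
       </properties>
     </profile>
     <profile>
       <id>flink1.13</id>
       <activation>
         <property><name>flink1.13</name></property>
       </activation>
       <properties>
         <flink.version>1.13.6</flink.version>
       </properties>
     </profile>
   </profiles>
   ```

   If some other profile in the root POM activates implicitly during a plain 
build, that alone would explain the default Flink profile not taking effect.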
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   Run the build using the command: `mvn clean install -DskipTests -DskipITs`
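   To confirm which profiles Maven actually selects (and whether `flink1.14` 
is among them), the maven-help-plugin can be used — a quick diagnostic, 
assuming a local checkout of apache/hudi with Maven installed:

   ```shell
   # List the profiles Maven considers active for the root pom only (-N skips
   # child modules); flink1.14 should appear here if the default works.
   mvn help:active-profiles -N
   ```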
   
   **Expected behavior**
   
   The build should pass, and it should not try to resolve 1.13 versions of 
the Flink artifacts.
   
   
   **Additional context**
   
   The build works fine when a Flink version option is supplied explicitly, 
e.g. `mvn clean install -DskipTests -DskipITs -Dflink1.13`
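   Presumably the same applies for the intended default version — an untested 
workaround sketch, assuming a `-Dflink1.14` flag is wired up the same way as 
`-Dflink1.13`:

   ```shell
   # Workaround: activate the intended Flink profile explicitly instead of
   # relying on activeByDefault (flag assumed to mirror -Dflink1.13).
   mvn clean install -DskipTests -DskipITs -Dflink1.14
   ```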
   
   **Stacktrace**
   
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-streaming-java/1.13.6/flink-streaming-java-1.13.6.jar
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-clients/1.13.6/flink-clients-1.13.6.jar
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-table-runtime_2.11/1.13.6/flink-table-runtime_2.11-1.13.6.jar
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-parquet/1.13.6/flink-parquet-1.13.6.jar
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-test-utils/1.13.6/flink-test-utils-1.13.6.jar
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-runtime/1.13.6/flink-runtime-1.13.6-tests.jar
   Downloading from cloudera-repo-releases: 
https://repository.cloudera.com/artifactory/public/org/apache/flink/flink-streaming-java/1.13.6/flink-streaming-java-1.13.6-tests.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-streaming-java/1.13.6/flink-streaming-java-1.13.6.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-clients/1.13.6/flink-clients-1.13.6.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-table-runtime_2.11/1.13.6/flink-table-runtime_2.11-1.13.6.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-parquet/1.13.6/flink-parquet-1.13.6.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-test-utils/1.13.6/flink-test-utils-1.13.6.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-runtime/1.13.6/flink-runtime-1.13.6-tests.jar
   Downloading from confluent: 
https://packages.confluent.io/maven/org/apache/flink/flink-streaming-java/1.13.6/flink-streaming-java-1.13.6-tests.jar
   [INFO] 
------------------------------------------------------------------------
   [INFO] Reactor Summary for Hudi 0.13.0-SNAPSHOT:
   [INFO] 
   [INFO] Hudi ............................................... SUCCESS [  4.578 
s]
   [INFO] hudi-common ........................................ SUCCESS [ 41.056 
s]
   [INFO] hudi-hadoop-mr ..................................... SUCCESS [  6.917 
s]
   [INFO] hudi-sync-common ................................... SUCCESS [  1.766 
s]
   [INFO] hudi-hive-sync ..................................... SUCCESS [  5.112 
s]
   [INFO] hudi-aws ........................................... SUCCESS [  3.034 
s]
   [INFO] hudi-timeline-service .............................. SUCCESS [  1.704 
s]
   [INFO] hudi-client ........................................ SUCCESS [  0.133 
s]
   [INFO] hudi-client-common ................................. SUCCESS [ 12.658 
s]
   [INFO] hudi-spark-client .................................. SUCCESS [ 39.069 
s]
   [INFO] hudi-spark-datasource .............................. SUCCESS [  0.119 
s]
   [INFO] hudi-spark-common_2.11 ............................. SUCCESS [ 56.651 
s]
   [INFO] hudi-spark2_2.11 ................................... SUCCESS [ 34.604 
s]
   [INFO] hudi-spark2-common ................................. SUCCESS [  0.197 
s]
   [INFO] hudi-java-client ................................... SUCCESS [  3.356 
s]
   [INFO] hudi-spark_2.11 .................................... SUCCESS [01:29 
min]
   [INFO] hudi-utilities_2.11 ................................ SUCCESS [  6.604 
s]
   [INFO] hudi-utilities-bundle_2.11 ......................... SUCCESS [ 22.392 
s]
   [INFO] hudi-cli ........................................... SUCCESS [ 21.213 
s]
   [INFO] hudi-flink-client .................................. FAILURE [ 29.139 
s]
   [INFO] hudi-gcp ........................................... SKIPPED
   [INFO] hudi-datahub-sync .................................. SKIPPED
   [INFO] hudi-adb-sync ...................................... SKIPPED
   [INFO] hudi-sync .......................................... SKIPPED
   [INFO] hudi-hadoop-mr-bundle .............................. SKIPPED
   [INFO] hudi-datahub-sync-bundle ........................... SKIPPED
   [INFO] hudi-hive-sync-bundle .............................. SKIPPED
   [INFO] hudi-aws-bundle .................................... SKIPPED
   [INFO] hudi-gcp-bundle .................................... SKIPPED
   [INFO] hudi-spark-bundle_2.11 ............................. SKIPPED
   [INFO] hudi-presto-bundle ................................. SKIPPED
   [INFO] hudi-utilities-slim-bundle_2.11 .................... SKIPPED
   [INFO] hudi-timeline-server-bundle ........................ SKIPPED
   [INFO] hudi-trino-bundle .................................. SKIPPED
   [INFO] hudi-examples ...................................... SKIPPED
   [INFO] hudi-examples-common ............................... SKIPPED
   [INFO] hudi-examples-spark ................................ SKIPPED
   [INFO] hudi-flink-datasource .............................. SKIPPED
   [INFO] hudi-flink1.13.x ................................... SKIPPED
   [INFO] hudi-flink ......................................... SKIPPED
   [INFO] hudi-examples-flink ................................ SKIPPED
   [INFO] hudi-examples-java ................................. SKIPPED
   [INFO] hudi-flink1.14.x ................................... SKIPPED
   [INFO] hudi-flink1.15.x ................................... SKIPPED
   [INFO] hudi-kafka-connect ................................. SKIPPED
   [INFO] hudi-flink1.13-bundle .............................. SKIPPED
   [INFO] hudi-kafka-connect-bundle .......................... SKIPPED
   [INFO] 
------------------------------------------------------------------------
   [INFO] BUILD FAILURE
   [INFO] 
------------------------------------------------------------------------
   [INFO] Total time:  06:21 min
   [INFO] Finished at: 2022-08-17T17:31:58+05:30
   [INFO] 
------------------------------------------------------------------------
   [ERROR] Failed to execute goal on project hudi-flink-client: Could not 
resolve dependencies for project 
org.apache.hudi:hudi-flink-client:jar:0.13.0-SNAPSHOT: The following artifacts 
could not be resolved: org.apache.flink:flink-streaming-java:jar:1.13.6, 
org.apache.flink:flink-clients:jar:1.13.6, 
org.apache.flink:flink-table-runtime_2.11:jar:1.13.6, 
org.apache.flink:flink-parquet:jar:1.13.6, 
org.apache.flink:flink-test-utils:jar:1.13.6, 
org.apache.flink:flink-runtime:jar:tests:1.13.6, 
org.apache.flink:flink-streaming-java:jar:tests:1.13.6: Could not find artifact 
org.apache.flink:flink-streaming-java:jar:1.13.6 in Maven Central 
(https://repo.maven.apache.org/maven2) -> [Help 1]
   
   

