See <https://ci-beam.apache.org/job/beam_PostCommit_Java/6904/display/redirect?page=changes>
Changes:
[frank] Warn if temp dataset cleanup permission is denied
[piotr.szuberski] [BEAM-7003 BEAM-8639 BEAM-8774] Update Kafka dependencies and add tests
[noreply] Merge pull request #13170 from [BEAM-9650] Adding support for ReadAll
[noreply] Merge pull request #13137 from [BEAM-11073] Dicom IO Connector for Java
------------------------------------------
[...truncated 71.62 KB...]
> Task :sdks:java:extensions:google-cloud-platform-core:integrationTestKms
> Task :sdks:java:extensions:google-cloud-platform-core:postCommit
> Task :sdks:java:io:google-cloud-platform:compileTestJava
> Task :sdks:java:io:kafka:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:kafka:testClasses
> Task :sdks:java:extensions:sorter:hadoopVersion285Test
> Task :sdks:java:io:kafka:kafkaVersion01103BatchIT
> Task :sdks:java:io:parquet:hadoopVersion285Test
> Task :examples:java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :examples:java:classes
> Task :examples:java:jar
> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :examples:java:compileTestJava
> Task :runners:core-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:core-java:testClasses
> Task :runners:core-java:testJar
> Task :sdks:java:extensions:sorter:hadoopVersion292Test
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
> Task :sdks:java:io:parquet:hadoopVersion292Test
> Task :runners:spark:compileTestJava
> Task :examples:java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:io:hadoop-file-system:hadoopVersion285Test
> Task :sdks:java:io:kafka:kafkaVersion01103Test
> Task :sdks:java:io:hadoop-format:compileTestJava
> Task :sdks:java:extensions:sorter:hadoopVersion321Test
> Task :sdks:java:io:parquet:hadoopVersion321Test
> Task :sdks:java:io:hcatalog:hadoopVersion285Test
> Task :runners:spark:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:hadoop-format:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :sdks:java:io:hadoop-format:testClasses
> Task :sdks:java:io:parquet:hadoopVersionsTest
> Task :sdks:java:extensions:sorter:hadoopVersionsTest
> Task :runners:spark:testClasses
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion100BatchIT
> Task :runners:spark:hadoopVersion2101Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion100Test
> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:google-cloud-platform:testClasses
> Task :sdks:java:io:google-cloud-platform:testJar
> Task :sdks:java:io:google-cloud-platform:integrationTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101IT
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.healthcare.DicomIOReadIT > testDicomMetadataRead FAILED
    org.apache.beam.sdk.Pipeline$PipelineExecutionException at DicomIOReadIT.java:83
        Caused by: java.lang.ArrayIndexOutOfBoundsException at WebPathParser.java:58
> Task :sdks:java:io:hcatalog:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion111BatchIT
> Task :sdks:java:io:hadoop-file-system:hadoopVersion321Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101Test
> Task :sdks:java:io:kafka:kafkaVersion111Test
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
> Task :runners:google-cloud-dataflow-java:coreSDKJavaLegacyWorkerIntegrationTest NO-SOURCE
> Task :sdks:java:io:hadoop-file-system:hadoopVersionsTest
> Task :sdks:java:extensions:zetasketch:integrationTest
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
> Task :sdks:java:io:hcatalog:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion201BatchIT
> Task :runners:spark:hadoopVersion285Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion201Test
> Task :sdks:java:io:kafka:kafkaVersion211BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285IT
> Task :sdks:java:io:kafka:kafkaVersion211Test
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1130181552-1253349c
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1130181552-1253349c
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_routes_1606760145697.traffic_routes_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1130181552-1253349c
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1130181552-1253349c
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:traffic_routes_1606760145697.traffic_routes_table
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :sdks:java:io:google-cloud-platform:integrationTest
41 tests completed, 1 failed
> Task :sdks:java:io:google-cloud-platform:integrationTest FAILED
> Task :runners:spark:hadoopVersion292Test
> Task :sdks:java:io:google-cloud-platform:integrationTestKms
> Task :sdks:java:io:kafka:kafkaVersion222BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285Test
> Task :sdks:java:io:kafka:kafkaVersion222Test
> Task :sdks:java:io:kafka:kafkaVersion231BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292ElasticIT
> Task :runners:spark:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion231Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292IT
> Task :sdks:java:io:kafka:kafkaVersion241BatchIT
> Task :runners:spark:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion241Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292Test
> Task :sdks:java:io:kafka:kafkaVersion251BatchIT
> Task :sdks:java:io:kafka:kafkaVersion251Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321ElasticIT
> Task :sdks:java:io:kafka:kafkaVersionsCompatibilityTest
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1130182343-73431716
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1130182343-73431716
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_max_lane_flow_1606760622562.traffic_max_lane_flow_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1130182343-73431716
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1130182343-73431716
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:traffic_max_lane_flow_1606760622562.traffic_max_lane_flow_table
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321IT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321Test
> Task :sdks:java:io:hadoop-format:hadoopVersionsTest
> Task :javaHadoopVersionsTest
> Task :sdks:java:extensions:zetasketch:postCommit
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1130183413-531414fc
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1130183413-531414fc
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:beam_examples.testpipeline_jenkins_1130183413_531414fc
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
org.apache.beam.sdk.io.gcp.healthcare.DicomIOReadIT > testDicomMetadataRead FAILED
    java.nio.file.NoSuchFileException at DicomIOReadIT.java:55
42 tests completed, 1 failed
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest FAILED
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerKmsIntegrationTest
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:integrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/io/google-cloud-platform/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformLegacyWorkerIntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See <https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings>
BUILD FAILED in 1h 36m 20s
217 actionable tasks: 188 executed, 29 from cache
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
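(A note on the inotify warning above: Gradle's file-system watching needs one inotify watch per tracked directory, and the default Linux limit is often too low for a workspace this size. A minimal sketch of how a build-agent admin might inspect and raise the limit is below; the value 524288 is a common suggestion, not a Beam- or Jenkins-specific requirement, and the file name under /etc/sysctl.d/ is a hypothetical choice.)

```shell
# Inspect the current per-user inotify watch limit on the agent.
cat /proc/sys/fs/inotify/max_user_watches

# Raise it persistently (requires root on the build agent; the value
# and the drop-in file name are illustrative assumptions):
#   echo fs.inotify.max_user_watches=524288 | sudo tee /etc/sysctl.d/60-inotify.conf
#   sudo sysctl --system
```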
Publishing build scan...
https://gradle.com/s/szmymafs75ryo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]