See <https://ci-beam.apache.org/job/beam_PostCommit_Java/6909/display/redirect?page=changes>
Changes:
[Robert Bradshaw] [BEAM-11354] Allow DoFn itself to be used as the restriction provider.
[Robert Bradshaw] [BEAM-11354] Also allow DoFn for WatermarkEstimator.
[Robert Bradshaw] [BEAM-11354] Update docs.
[Robert Bradshaw] Test for watermark tracker.
[Boyuan Zhang] Add splittable dofn as the recommended way of building connectors.
[Robert Bradshaw] Add a note to the programming guide.
[Robert Bradshaw] Skip tests unimplemented for multiple workers.
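Context for the splittable-DoFn changes listed above: the sketch below is a minimal, illustrative splittable DoFn in the Beam Java SDK (the SDK this PostCommit job exercises). It is not taken from the commits above; the class name CountFn is hypothetical, and it assumes the standard OffsetRange / OffsetRangeTracker restriction API. The BEAM-11354 commits additionally allow the DoFn itself to serve as the restriction provider and watermark estimator, rather than requiring separate provider objects.

  import org.apache.beam.sdk.io.range.OffsetRange;
  import org.apache.beam.sdk.transforms.DoFn;
  import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;
  import org.apache.beam.sdk.transforms.splittabledofn.RestrictionTracker;

  // Hypothetical example: emits the numbers 0..element-1, splitting the
  // per-element work over an OffsetRange restriction.
  @DoFn.BoundedPerElement
  class CountFn extends DoFn<Long, Long> {

    // The initial restriction covers the whole per-element workload.
    @GetInitialRestriction
    public OffsetRange getInitialRestriction(@Element Long element) {
      return new OffsetRange(0, element);
    }

    // The tracker lets the runner split and checkpoint the restriction.
    @NewTracker
    public OffsetRangeTracker newTracker(@Restriction OffsetRange restriction) {
      return new OffsetRangeTracker(restriction);
    }

    @ProcessElement
    public void process(RestrictionTracker<OffsetRange, Long> tracker,
                        OutputReceiver<Long> out) {
      // Claim each position before producing output so splits stay consistent.
      for (long i = tracker.currentRestriction().getFrom(); tracker.tryClaim(i); ++i) {
        out.output(i);
      }
    }
  }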
------------------------------------------
[...truncated 70.96 KB...]
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
> Task :runners:spark:hadoopVersion2101Test
> Task :sdks:java:extensions:sorter:hadoopVersion2101Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion2101Test
> Task :sdks:java:io:hcatalog:hadoopVersion2101Test
> Task :sdks:java:io:parquet:hadoopVersion2101Test
> Task :sdks:java:io:kafka:kafkaVersion01103BatchIT
> Task :sdks:java:io:kinesis:integrationTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101ElasticIT
> Task :sdks:java:io:google-cloud-platform:integrationTest
> Task :sdks:java:io:parquet:hadoopVersion285Test
> Task :sdks:java:extensions:sorter:hadoopVersion285Test
> Task :runners:google-cloud-dataflow-java:coreSDKJavaLegacyWorkerIntegrationTest NO-SOURCE
> Task :sdks:java:io:hadoop-file-system:hadoopVersion285Test
> Task :sdks:java:io:parquet:hadoopVersion292Test
> Task :sdks:java:extensions:sorter:hadoopVersion292Test
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.healthcare.DicomIOReadIT > testDicomMetadataRead FAILED
org.apache.beam.sdk.Pipeline$PipelineExecutionException at DicomIOReadIT.java:83
Caused by: java.lang.ArrayIndexOutOfBoundsException at WebPathParser.java:58
> Task :sdks:java:io:hcatalog:hadoopVersion285Test
> Task :sdks:java:io:parquet:hadoopVersion321Test
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testSequentialWrite FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerExceptionFactory.java:210
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testFailFast FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
> Task :sdks:java:extensions:sorter:hadoopVersion321Test
> Task :sdks:java:extensions:zetasketch:integrationTest
> Task :sdks:java:io:kafka:kafkaVersion01103Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion292Test
> Task :sdks:java:extensions:sorter:hadoopVersionsTest
> Task :sdks:java:io:parquet:hadoopVersionsTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101IT
> Task :sdks:java:io:kafka:kafkaVersion100BatchIT
> Task :sdks:java:io:hadoop-file-system:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion100Test
> Task :sdks:java:io:hcatalog:hadoopVersion292Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101Test
> Task :runners:spark:hadoopVersion285Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion111BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion111Test
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
java.util.concurrent.ExecutionException at SpannerReadIT.java:129
Caused by: com.google.cloud.spanner.SpannerException at SpannerExceptionFactory.java:210
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
java.util.concurrent.ExecutionException at SpannerReadIT.java:129
Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
Caused by: io.grpc.StatusRuntimeException at Status.java:533
> Task :sdks:java:io:hcatalog:hadoopVersionsTest
> Task :sdks:java:io:google-cloud-platform:integrationTest FAILED
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb FAILED
java.util.concurrent.ExecutionException at SpannerReadIT.java:129
Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
Caused by: io.grpc.StatusRuntimeException at Status.java:533
41 tests completed, 8 failed
> Task :sdks:java:io:google-cloud-platform:integrationTestKms
> Task :sdks:java:io:kafka:kafkaVersion201BatchIT
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1202000819-eeff8e8c
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1202000819-eeff8e8c
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_routes_1606867693894.traffic_routes_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1202000819-eeff8e8c
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1202000819-eeff8e8c
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:traffic_routes_1606867693894.traffic_routes_table
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285IT
> Task :sdks:java:io:kafka:kafkaVersion201Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285Test
> Task :runners:spark:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion211BatchIT
> Task :sdks:java:io:kafka:kafkaVersion211Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion222BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292IT
> Task :runners:spark:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion222Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292Test
> Task :sdks:java:io:kafka:kafkaVersion231BatchIT
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1202001457-127e074f
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1202001457-127e074f
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_max_lane_flow_1606868096650.traffic_max_lane_flow_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1202001457-127e074f
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1202001457-127e074f
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:traffic_max_lane_flow_1606868096650.traffic_max_lane_flow_table
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :sdks:java:io:kafka:kafkaVersion231Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321ElasticIT
> Task :runners:spark:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion241BatchIT
> Task :sdks:java:io:kafka:kafkaVersion241Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321IT
> Task :sdks:java:io:kafka:kafkaVersion251BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321Test
> Task :sdks:java:io:kafka:kafkaVersion251Test
> Task :sdks:java:io:kafka:kafkaVersionsCompatibilityTest
> Task :sdks:java:io:hadoop-format:hadoopVersionsTest
> Task :javaHadoopVersionsTest
> Task :sdks:java:extensions:zetasketch:postCommit
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1202002503-fd97b505
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1202002503-fd97b505
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:beam_examples.testpipeline_jenkins_1202002503_fd97b505
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
org.apache.beam.sdk.io.gcp.healthcare.DicomIOReadIT > testDicomMetadataRead FAILED
java.nio.file.NoSuchFileException at DicomIOReadIT.java:55
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
java.util.concurrent.ExecutionException at SpannerReadIT.java:129
Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
java.util.concurrent.ExecutionException at SpannerReadIT.java:129
Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb FAILED
java.util.concurrent.ExecutionException at SpannerReadIT.java:129
Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testSequentialWrite FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testFailFast FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures FAILED
java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
Caused by: io.grpc.StatusRuntimeException at Status.java:533
42 tests completed, 8 failed
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest FAILED
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerKmsIntegrationTest
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/io/google-cloud-platform/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformLegacyWorkerIntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 28m 37s
217 actionable tasks: 160 executed, 57 from cache
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
Publishing build scan...
https://gradle.com/s/qfwba4vdg6hze
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure