See <https://ci-beam.apache.org/job/beam_PostCommit_Java/6907/display/redirect?page=changes>
Changes:
[piotr.szuberski] [BEAM-11173] Add Bigtable table with read operation
[noreply] Update python versions in pre-requisites (#13451)
------------------------------------------
[...truncated 76.96 KB...]
> Task :sdks:java:io:kafka:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :examples:java:compileTestJava
> Task :sdks:java:extensions:sorter:hadoopVersion292Test
> Task :examples:java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:parquet:hadoopVersion285Test
> Task :runners:spark:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:spark:classes
> Task :sdks:java:extensions:sorter:hadoopVersion321Test
> Task :runners:spark:compileTestJava
> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:extensions:sorter:hadoopVersionsTest
> Task :sdks:java:io:hadoop-file-system:hadoopVersion292Test
> Task :runners:spark:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:hadoop-format:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :sdks:java:io:hadoop-format:testClasses
> Task :sdks:java:io:hcatalog:hadoopVersion285Test
> Task :runners:spark:testClasses
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101ElasticIT
> Task :runners:spark:hadoopVersion2101Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion321Test
> Task :sdks:java:io:parquet:hadoopVersion292Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersionsTest
> Task :sdks:java:io:kafka:testClasses
> Task :sdks:java:io:parquet:hadoopVersion321Test
> Task :sdks:java:io:parquet:hadoopVersionsTest
> Task :sdks:java:io:kinesis:testClasses
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101IT
> Task :sdks:java:io:kafka:kafkaVersion01103BatchIT
> Task :sdks:java:io:kinesis:integrationTest
> Task :sdks:java:io:hcatalog:hadoopVersion292Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101Test
> Task :sdks:java:io:kafka:kafkaVersion01103Test
> Task :sdks:java:io:hcatalog:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion100BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285ElasticIT
> Task :runners:spark:hadoopVersion285Test
> Task :sdks:java:io:kafka:kafkaVersion100Test
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
    java.util.concurrent.ExecutionException at SpannerReadIT.java:129
        Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
    java.util.concurrent.ExecutionException at SpannerReadIT.java:129
        Caused by: com.google.cloud.spanner.SpannerException at SpannerExceptionFactory.java:210
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb FAILED
    java.util.concurrent.ExecutionException at SpannerReadIT.java:129
        Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
41 tests completed, 4 failed
> Task :sdks:java:io:google-cloud-platform:integrationTest FAILED
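A minimal sketch for reproducing just the failing Spanner suite locally, assuming the Beam repository root, the Gradle wrapper, and the same GCP test-project credentials and pipeline options that this Jenkins job supplies (not shown in this log):
  ./gradlew :sdks:java:io:google-cloud-platform:integrationTest \
      --tests "org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT"
The --tests filter is a standard Gradle option for Test-type tasks; without the job's pipeline options the suite will not reach the apache-beam-testing Spanner instance.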
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285IT
> Task :sdks:java:io:kafka:kafkaVersion111BatchIT
> Task :sdks:java:io:google-cloud-platform:integrationTestKms
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285Test
> Task :sdks:java:io:kafka:kafkaVersion111Test
> Task :runners:spark:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion201BatchIT
> Task :sdks:java:io:kafka:kafkaVersion201Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292ElasticIT
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
org.apache.beam.sdk.io.gcp.healthcare.DicomIOReadIT > testDicomMetadataRead FAILED
    java.nio.file.NoSuchFileException at DicomIOReadIT.java:55
> Task :sdks:java:io:kafka:kafkaVersion211BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292IT
> Task :sdks:java:io:kafka:kafkaVersion211Test
> Task :runners:spark:hadoopVersion321Test
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
    java.util.concurrent.ExecutionException at SpannerReadIT.java:129
        Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
    java.util.concurrent.ExecutionException at SpannerReadIT.java:129
        Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292Test
> Task :sdks:java:io:kafka:kafkaVersion222BatchIT
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb FAILED
    java.util.concurrent.ExecutionException at SpannerReadIT.java:129
        Caused by: com.google.cloud.spanner.SpannerException at SpannerReadIT.java:119
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
> Task :sdks:java:io:kafka:kafkaVersion222Test
> Task :sdks:java:extensions:zetasketch:postCommit
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion231BatchIT
> Task :runners:spark:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion231Test
> Task :sdks:java:io:kafka:kafkaVersion241BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321IT
> Task :sdks:java:io:kafka:kafkaVersion241Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321Test
> Task :sdks:java:io:kafka:kafkaVersion251BatchIT
> Task :sdks:java:io:hadoop-format:hadoopVersionsTest
> Task :javaHadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion251Test
> Task :sdks:java:io:kafka:kafkaVersionsCompatibilityTest
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testSequentialWrite FAILED
    java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
        Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite FAILED
    java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
        Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testFailFast FAILED
    java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
        Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures FAILED
    java.util.concurrent.ExecutionException at SpannerWriteIT.java:135
        Caused by: com.google.cloud.spanner.SpannerException at SpannerWriteIT.java:125
            Caused by: io.grpc.StatusRuntimeException at Status.java:533
42 tests completed, 8 failed
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest FAILED
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1201130005-6b35e60e
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1201130005-6b35e60e
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_routes_1606827601144.traffic_routes_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1201130005-6b35e60e
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1201130005-6b35e60e
The BigQuery table might contain the example's output, and it is not deleted automatically:
apache-beam-testing:traffic_routes_1606827601144.traffic_routes_table
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1201130532-8cf30445
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1201130532-8cf30445
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_max_lane_flow_1606827932462.traffic_max_lane_flow_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1201130532-8cf30445
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1201130532-8cf30445
The BigQuery table might contain the example's output, and it is not deleted automatically:
apache-beam-testing:traffic_max_lane_flow_1606827932462.traffic_max_lane_flow_table
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************
***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1201131846-8785b62c
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1201131846-8785b62c
The BigQuery table might contain the example's output, and it is not deleted automatically:
apache-beam-testing:beam_examples.testpipeline_jenkins_1201131846_8785b62c
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************
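Where the Developers Console is inconvenient, a sketch of the same manual cleanup with the bq CLI, assuming the Cloud SDK is installed and authenticated with access to apache-beam-testing (table IDs copied from the teardown messages above):
  bq rm -t apache-beam-testing:traffic_routes_1606827601144.traffic_routes_table
  bq rm -t apache-beam-testing:traffic_max_lane_flow_1606827932462.traffic_max_lane_flow_table
  bq rm -t apache-beam-testing:beam_examples.testpipeline_jenkins_1201131846_8785b62c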
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerKmsIntegrationTest
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/io/google-cloud-platform/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformLegacyWorkerIntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings
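A sketch of surfacing the individual deprecation warnings on a subsequent run, per the note above (any task works; the one below is just an example):
  ./gradlew :sdks:java:io:google-cloud-platform:integrationTest --warning-mode all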
BUILD FAILED in 1h 24m 53s
217 actionable tasks: 186 executed, 31 from cache
Publishing build scan...
https://gradle.com/s/jcwgrxzb6x7dy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure