See <https://ci-beam.apache.org/job/beam_PostCommit_Java/7016/display/redirect?page=changes>
Changes:
[Udi Meiri] [BEAM-11517] Enable test_file_loads on Dataflow
[artur.khanin] Fix for the case when truststoreLocation and keystoreLocation are not
[artur.khanin] Fix nullable coders casting
------------------------------------------
[...truncated 58.21 KB...]
> Task :sdks:java:extensions:google-cloud-platform-core:compileTestJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:testClasses UP-TO-DATE
> Task :runners:core-java:compileTestJava FROM-CACHE
> Task :runners:core-java:testClasses UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:testJar
> Task :runners:core-java:testJar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:extensions:ml:compileJava FROM-CACHE
> Task :sdks:java:extensions:ml:classes UP-TO-DATE
> Task :sdks:java:extensions:ml:jar
> Task :sdks:java:io:kafka:jar
> Task :runners:direct-java:shadowJar FROM-CACHE
> Task :sdks:java:extensions:ml:compileTestJava FROM-CACHE
> Task :sdks:java:extensions:ml:testClasses
> Task :sdks:java:io:kafka:compileTestJava FROM-CACHE
> Task :sdks:java:io:kafka:testClasses UP-TO-DATE
> Task :sdks:java:extensions:sorter:hadoopVersion2101Test FROM-CACHE
> Task :sdks:java:io:parquet:hadoopVersion2101Test FROM-CACHE
> Task :sdks:java:extensions:sorter:hadoopVersion285Test FROM-CACHE
> Task :sdks:java:io:hcatalog:hadoopVersion2101Test FROM-CACHE
> Task :sdks:java:extensions:sorter:hadoopVersion292Test FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:parquet:hadoopVersion285Test FROM-CACHE
> Task :sdks:java:io:hcatalog:hadoopVersion285Test FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:spark:compileJava FROM-CACHE
> Task :runners:spark:classes
> Task :sdks:java:io:hcatalog:hadoopVersion292Test FROM-CACHE
> Task :sdks:java:io:hcatalog:hadoopVersionsTest UP-TO-DATE
> Task :sdks:java:io:parquet:hadoopVersion292Test FROM-CACHE
> Task :sdks:java:extensions:sorter:hadoopVersion321Test FROM-CACHE
> Task :sdks:java:extensions:sorter:hadoopVersionsTest UP-TO-DATE
> Task :sdks:java:io:parquet:hadoopVersion321Test FROM-CACHE
> Task :sdks:java:io:parquet:hadoopVersionsTest UP-TO-DATE
> Task :runners:spark:compileTestJava FROM-CACHE
> Task :runners:spark:testClasses
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:extensions:zetasketch:compileTestJava FROM-CACHE
> Task :sdks:java:extensions:zetasketch:testClasses UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:integrationTestKms
> Task :sdks:java:extensions:google-cloud-platform-core:postCommit
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:testClasses
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:testJar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE
> Task :runners:google-cloud-dataflow-java:coreSDKJavaLegacyWorkerIntegrationTest NO-SOURCE
> Task :runners:spark:hadoopVersion2101Test FROM-CACHE
> Task :runners:spark:hadoopVersion285Test FROM-CACHE
> Task :runners:spark:hadoopVersion292Test FROM-CACHE
Unable to watch the file system for changes. The inotify watches limit is too low.
> Task :runners:spark:hadoopVersion321Test FROM-CACHE
> Task :sdks:java:io:hadoop-file-system:hadoopVersion2101Test
> Task :sdks:java:io:kinesis:integrationTest
> Task :sdks:java:io:kafka:kafkaVersion01103BatchIT
> Task :examples:java:compileJava
> Task :sdks:java:extensions:ml:integrationTest
> Task :sdks:java:io:google-cloud-platform:integrationTest
> Task :sdks:java:extensions:zetasketch:integrationTest
> Task :runners:spark:hadoopVersionsTest UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest FROM-CACHE
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerKmsIntegrationTest FROM-CACHE
> Task :sdks:java:extensions:ml:postCommit
> Task :sdks:java:io:kafka:kafkaVersion01103Test
> Task :examples:java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
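The two javac notes above only name the warning categories; rerunning the compile with `-Xlint:deprecation,unchecked` shows the individual offending lines. A minimal standalone illustration (this `Example.java` is hypothetical, not a file from this build) that triggers both categories:

```java
// Example.java -- compile with `javac -Xlint:deprecation,unchecked Example.java`
// to see one deprecation warning and one unchecked warning.
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class Example {
    static String demo() {
        Date d = new Date(121, 0, 1);   // deprecated constructor -> deprecation warning
        List raw = new ArrayList();     // raw type
        raw.add("unchecked");           // unchecked call -> unchecked warning
        return raw.get(0) + " " + (1900 + d.getYear()); // getYear() is also deprecated
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

The code still compiles and runs; the flags only make the warnings visible.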
> Task :examples:java:classes
> Task :examples:java:jar
> Task :examples:java:compileTestJava FROM-CACHE
> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:io:hadoop-format:compileTestJava FROM-CACHE
> Task :sdks:java:io:hadoop-format:testClasses
> Task :sdks:java:io:hadoop-file-system:hadoopVersion285Test
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion100BatchIT
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.healthcare.FhirIOReadIT > testFhirIORead[DSTU2] FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.healthcare.FhirIOReadIT > testFhirIORead[STU3] FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.healthcare.FhirIOReadIT > testFhirIORead[R4] FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533
> Task :sdks:java:io:kafka:kafkaVersion100Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion292Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101IT
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.pubsub.PubsubReadIT > testReadPubsubMessageId FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533
org.apache.beam.sdk.io.gcp.pubsub.PubsubReadIT > testReadPublicData FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533
> Task :sdks:java:io:kafka:kafkaVersion111BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion111Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersionsTest
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1228180528-6a81b8d4
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1228180528-6a81b8d4
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_routes_1609178713513.traffic_routes_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1228180528-6a81b8d4
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1228180528-6a81b8d4
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:traffic_routes_1609178713513.traffic_routes_table
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :sdks:java:io:kafka:kafkaVersion201BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion201Test
> Task :sdks:java:io:google-cloud-platform:integrationTest
org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT > testFhirIOSearch[R4] FAILED
    org.apache.beam.sdk.Pipeline$PipelineExecutionException at FhirIOSearchIT.java:154
        Caused by: java.lang.OutOfMemoryError at Arrays.java:3332
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285IT
> Task :sdks:java:io:kafka:kafkaVersion211BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285Test
> Task :sdks:java:io:kafka:kafkaVersion211Test
> Task :sdks:java:io:kafka:kafkaVersion222BatchIT
> Task :sdks:java:io:google-cloud-platform:integrationTest
42 tests completed, 6 failed, 1 skipped
> Task :sdks:java:io:google-cloud-platform:integrationTest FAILED
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292ElasticIT
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1228181039-a6faa18b
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1228181039-a6faa18b
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_max_lane_flow_1609179038390.traffic_max_lane_flow_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1228181039-a6faa18b
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1228181039-a6faa18b
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:traffic_max_lane_flow_1609179038390.traffic_max_lane_flow_table
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :sdks:java:io:kafka:kafkaVersion222Test
> Task :sdks:java:io:google-cloud-platform:integrationTestKms
> Task :sdks:java:io:kafka:kafkaVersion231BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292IT
> Task :sdks:java:io:kafka:kafkaVersion231Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292Test
> Task :sdks:java:io:kafka:kafkaVersion241BatchIT
> Task :sdks:java:io:kafka:kafkaVersion241Test
> Task :sdks:java:io:kafka:kafkaVersion251BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion251Test
> Task :sdks:java:io:kafka:kafkaVersionsCompatibilityTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321IT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321Test
> Task :sdks:java:io:hadoop-format:hadoopVersionsTest
> Task :javaHadoopVersionsTest
> Task :sdks:java:extensions:zetasketch:postCommit
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1228182324-93767f
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1228182324-93767f
The BigQuery table might contain the example's output, and it is not deleted
automatically:
apache-beam-testing:beam_examples.testpipeline_jenkins_1228182324_93767f
Please go to the Developers Console to delete it manually. Otherwise, you may
be charged for its usage.
***********************************************************
***********************************************************
> Task :runners:google-cloud-dataflow-java:postCommit
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:integrationTest'.
> There were failing tests. See the report at:
> file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/io/google-cloud-platform/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 27m 9s
220 actionable tasks: 132 executed, 88 from cache
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
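The inotify warning above is a host-configuration issue rather than a build error: Gradle's file-system watching needs more inotify watches than the Linux default allows. A common remedy on the build agent is to raise `fs.inotify.max_user_watches` (the sysctl name is the standard Linux knob; the value 524288 is a commonly used figure, not something this log specifies):

```shell
# Raise the inotify watch limit for the current boot (requires root).
sudo sysctl -w fs.inotify.max_user_watches=524288

# Persist the setting across reboots.
echo "fs.inotify.max_user_watches=524288" | sudo tee /etc/sysctl.d/99-inotify.conf
sudo sysctl --system
```

Until the limit is raised, Gradle falls back to rescanning the project between builds, which is slower but otherwise harmless.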
Publishing build scan...
https://gradle.com/s/3ll4vgtwcblmc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]