See <https://ci-beam.apache.org/job/beam_PostCommit_Java/7005/display/redirect?page=changes>
Changes:

[Boyuan Zhang] Cache UnboundedReader per CheckpointMark in SDF Wrapper DoFn.


------------------------------------------
[...truncated 78.64 KB...]
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:io:hadoop-file-system:hadoopVersion285Test
> Task :sdks:java:extensions:sorter:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion01103Test

> Task :runners:spark:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:parquet:hadoopVersion292Test
> Task :runners:spark:testClasses
> Task :examples:java:compileJava
> Task :runners:google-cloud-dataflow-java:compileJava
> Task :sdks:java:io:google-cloud-platform:compileTestJava

> Task :sdks:java:extensions:zetasketch:compileTestJava
Note: <https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/extensions/zetasketch/src/test/java/org/apache/beam/sdk/extensions/zetasketch/HllCountTest.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :runners:spark:hadoopVersion2101Test
> Task :sdks:java:extensions:zetasketch:testClasses
> Task :sdks:java:io:hcatalog:hadoopVersion285Test
> Task :sdks:java:extensions:sorter:hadoopVersion321Test
> Task :sdks:java:io:parquet:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion100BatchIT
> Task :sdks:java:io:hadoop-file-system:hadoopVersion292Test

> Task :examples:java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :examples:java:classes
> Task :examples:java:jar
> Task :sdks:java:io:parquet:hadoopVersionsTest
> Task :sdks:java:extensions:sorter:hadoopVersionsTest
> Task :examples:java:compileTestJava
> Task :sdks:java:io:kafka:kafkaVersion100Test

> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
> Task :sdks:java:extensions:ml:integrationTest

> Task :examples:java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:io:hadoop-file-system:hadoopVersion321Test
> Task :sdks:java:extensions:ml:postCommit
> Task :sdks:java:io:hadoop-format:compileTestJava
> Task :sdks:java:io:kafka:kafkaVersion111BatchIT
> Task :sdks:java:io:hcatalog:hadoopVersion292Test

> Task :sdks:java:io:hadoop-format:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
> Task :sdks:java:io:hadoop-format:testClasses
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101ElasticIT
> Task :sdks:java:io:hadoop-file-system:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion111Test

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:testClasses
> Task :sdks:java:io:google-cloud-platform:testJar
> Task :sdks:java:io:google-cloud-platform:integrationTest
> Task :runners:spark:hadoopVersion285Test
> Task :sdks:java:io:kafka:kafkaVersion201BatchIT
> Task :sdks:java:io:hcatalog:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersion201Test

> Task :sdks:java:io:google-cloud-platform:integrationTest

org.apache.beam.sdk.io.gcp.healthcare.FhirIOReadIT > testFhirIORead[DSTU2] FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533

org.apache.beam.sdk.io.gcp.healthcare.FhirIOReadIT > testFhirIORead[STU3] FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533

org.apache.beam.sdk.io.gcp.healthcare.FhirIOReadIT > testFhirIORead[R4] FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533

> Task :sdks:java:io:kafka:kafkaVersion211BatchIT

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :sdks:java:io:google-cloud-platform:integrationTest

org.apache.beam.sdk.io.gcp.pubsub.PubsubReadIT > testReadPubsubMessageId FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533

org.apache.beam.sdk.io.gcp.pubsub.PubsubReadIT > testReadPublicData FAILED
    com.google.api.gax.rpc.InvalidArgumentException at ApiExceptionFactory.java:49
        Caused by: io.grpc.StatusRuntimeException at Status.java:533

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
> Task :runners:google-cloud-dataflow-java:coreSDKJavaLegacyWorkerIntegrationTest NO-SOURCE
> Task :sdks:java:io:kafka:kafkaVersion211Test
> Task :sdks:java:extensions:zetasketch:integrationTest
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
> Task :runners:spark:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion222BatchIT
> Task :sdks:java:io:kafka:kafkaVersion222Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101IT

> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1226003524-ee061df6
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1226003524-ee061df6
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_routes_1608942911774.traffic_routes_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1226003524-ee061df6
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1226003524-ee061df6
The BigQuery table might contain the example's output, and it is not deleted automatically:
apache-beam-testing:traffic_routes_1608942911774.traffic_routes_table
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************

> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101Test
> Task :sdks:java:io:kafka:kafkaVersion231BatchIT

> Task :sdks:java:io:google-cloud-platform:integrationTest

org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT > testFhirIOSearch[R4] FAILED
    org.apache.beam.sdk.Pipeline$PipelineExecutionException at FhirIOSearchIT.java:154
        Caused by: com.google.gson.JsonParseException at JsonParser.java:89
        Caused by: java.lang.OutOfMemoryError at JsonObject.java:33

> Task :sdks:java:io:kafka:kafkaVersion231Test
> Task :runners:spark:hadoopVersion321Test

> Task :sdks:java:io:google-cloud-platform:integrationTest

42 tests completed, 6 failed, 1 skipped

> Task :sdks:java:io:google-cloud-platform:integrationTest FAILED
> Task :sdks:java:io:kafka:kafkaVersion241BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285ElasticIT
> Task :sdks:java:io:google-cloud-platform:integrationTestKms
> Task :sdks:java:io:kafka:kafkaVersion241Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285IT
> Task :sdks:java:io:kafka:kafkaVersion251BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285Test
> Task :sdks:java:io:kafka:kafkaVersion251Test
> Task :runners:spark:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersionsCompatibilityTest

> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example:
projects/apache-beam-testing/topics/testpipeline-jenkins-1226004218-8e4b63f
The Pub/Sub subscription has been set up for this example:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1226004218-8e4b63f
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example:
apache-beam-testing:traffic_max_lane_flow_1608943337444.traffic_max_lane_flow_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1226004218-8e4b63f
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1226004218-8e4b63f
The BigQuery table might contain the example's output, and it is not deleted automatically:
apache-beam-testing:traffic_max_lane_flow_1608943337444.traffic_max_lane_flow_table
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************

> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292ElasticIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292IT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321ElasticIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321IT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321Test
> Task :sdks:java:io:hadoop-format:hadoopVersionsTest
> Task :javaHadoopVersionsTest
> Task :sdks:java:extensions:zetasketch:postCommit

> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted:
projects/apache-beam-testing/topics/testpipeline-jenkins-1226005417-cfec4bfd
The Pub/Sub subscription has been deleted:
projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1226005417-cfec4bfd
The BigQuery table might contain the example's output, and it is not deleted automatically:
apache-beam-testing:beam_examples.testpipeline_jenkins_1226005417_cfec4bfd
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************

> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerKmsIntegrationTest
> Task :runners:google-cloud-dataflow-java:postCommit

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/io/google-cloud-platform/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 51m 0s

220 actionable tasks: 201 executed, 19 from cache

Publishing build scan...
https://gradle.com/s/kbypvkoohwise

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
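
For reference, a minimal sketch of how the failing task could be re-run locally with the diagnostic flags Gradle suggests above. This assumes a checkout of the Beam repository with its Gradle wrapper and credentials for a GCP test project (e.g. apache-beam-testing); the task path and flags are taken from the log above:

    # re-run the failing integration tests with a stack trace and verbose logging (sketch, not part of the original log)
    ./gradlew :sdks:java:io:google-cloud-platform:integrationTest --stacktrace --info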
