See <https://ci-beam.apache.org/job/beam_PostCommit_Java/7022/display/redirect?page=changes>
Changes:

[Rui Wang] BEAM-11536. Test "beam:window_fn:serialized_java:v1" in

[noreply] [BEAM-11482] Thrift support for KafkaTableProvider (#13572)

[noreply] [BEAM-10986] Fix for update shadow jar plugin. (#13586)


------------------------------------------
[...truncated 75.25 KB...]
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:kafka:testClasses
> Task :sdks:java:extensions:sorter:hadoopVersion2101Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion2101Test
> Task :sdks:java:io:hcatalog:hadoopVersion2101Test
> Task :sdks:java:io:parquet:hadoopVersion2101Test
> Task :sdks:java:io:kinesis:integrationTest
> Task :sdks:java:io:kafka:kafkaVersion01103BatchIT

> Task :runners:spark:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:classes
> Task :sdks:java:extensions:sorter:hadoopVersion285Test
> Task :sdks:java:io:parquet:hadoopVersion285Test
> Task :runners:spark:compileTestJava
> Task :sdks:java:io:kafka:kafkaVersion01103Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion285Test

> Task :runners:spark:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:extensions:sorter:hadoopVersion292Test
> Task :sdks:java:io:parquet:hadoopVersion292Test
> Task :runners:spark:testClasses
> Task :sdks:java:io:google-cloud-platform:classes
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:io:kafka:kafkaVersion100BatchIT
> Task :runners:spark:hadoopVersion2101Test
> Task :examples:java:compileJava
> Task :runners:google-cloud-dataflow-java:compileJava
> Task :sdks:java:io:google-cloud-platform:compileTestJava
> Task :sdks:java:extensions:zetasketch:compileTestJava
> Task :sdks:java:io:hcatalog:hadoopVersion285Test

> Task :sdks:java:extensions:zetasketch:compileTestJava
Note: <https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/extensions/zetasketch/src/test/java/org/apache/beam/sdk/extensions/zetasketch/HllCountTest.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:extensions:zetasketch:testClasses
> Task :sdks:java:extensions:sorter:hadoopVersion321Test
> Task :sdks:java:io:parquet:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion100Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersion292Test
> Task :sdks:java:io:parquet:hadoopVersionsTest
> Task :sdks:java:extensions:sorter:hadoopVersionsTest

> Task :examples:java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :examples:java:classes
> Task :examples:java:jar
> Task :examples:java:compileTestJava

> Task :runners:google-cloud-dataflow-java:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:io:hadoop-file-system:hadoopVersion321Test
> Task :sdks:java:io:kafka:kafkaVersion111BatchIT

> Task :examples:java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:extensions:ml:integrationTest
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:extensions:ml:postCommit
> Task :sdks:java:io:hadoop-format:compileTestJava
> Task :sdks:java:io:hcatalog:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion111Test
> Task :sdks:java:io:hadoop-file-system:hadoopVersionsTest

> Task :sdks:java:io:hadoop-format:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:hadoop-format:testClasses
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion201BatchIT
> Task :runners:spark:hadoopVersion285Test

> Task :sdks:java:io:google-cloud-platform:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:io:google-cloud-platform:testClasses
> Task :sdks:java:io:google-cloud-platform:testJar
> Task :sdks:java:io:kafka:kafkaVersion201Test
> Task :sdks:java:io:google-cloud-platform:integrationTest
> Task :sdks:java:io:hcatalog:hadoopVersionsTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101IT
> Task :sdks:java:io:kafka:kafkaVersion211BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO2101Test
> Task :sdks:java:io:kafka:kafkaVersion211Test

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE
> Task :runners:google-cloud-dataflow-java:coreSDKJavaLegacyWorkerIntegrationTest NO-SOURCE
> Task :sdks:java:extensions:zetasketch:integrationTest
> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
> Task :sdks:java:io:kafka:kafkaVersion222BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285ElasticIT
> Task :runners:spark:hadoopVersion292Test
> Task :sdks:java:io:kafka:kafkaVersion222Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285IT
> Task :sdks:java:io:kafka:kafkaVersion231BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO285Test

> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example: projects/apache-beam-testing/topics/testpipeline-jenkins-1230063747-bb5b3760
The Pub/Sub subscription has been set up for this example: projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1230063747-bb5b3760
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example: apache-beam-testing:traffic_routes_1609310249804.traffic_routes_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted: projects/apache-beam-testing/topics/testpipeline-jenkins-1230063747-bb5b3760
The Pub/Sub subscription has been deleted: projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1230063747-bb5b3760
The BigQuery table might contain the example's output, and it is not deleted automatically: apache-beam-testing:traffic_routes_1609310249804.traffic_routes_table
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************

> Task :sdks:java:io:kafka:kafkaVersion231Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292ElasticIT
> Task :sdks:java:io:kafka:kafkaVersion241BatchIT
> Task :runners:spark:hadoopVersion321Test

> Task :sdks:java:io:google-cloud-platform:integrationTest

org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT > testFhirIOSearch[R4] FAILED
    org.apache.beam.sdk.Pipeline$PipelineExecutionException at FhirIOSearchIT.java:154
        Caused by: java.lang.OutOfMemoryError at StringCoding.java:350

> Task :sdks:java:io:kafka:kafkaVersion241Test
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292IT

> Task :sdks:java:io:google-cloud-platform:integrationTest

42 tests completed, 1 failed, 1 skipped

> Task :sdks:java:io:google-cloud-platform:integrationTest FAILED
> Task :sdks:java:io:kafka:kafkaVersion251BatchIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO292Test
> Task :sdks:java:io:google-cloud-platform:integrationTestKms
> Task :sdks:java:io:kafka:kafkaVersion251Test
> Task :runners:spark:hadoopVersionsTest
> Task :sdks:java:io:kafka:kafkaVersionsCompatibilityTest
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321ElasticIT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321IT
> Task :sdks:java:io:hadoop-format:runHadoopFormatIO321Test

> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
**********************Set Up Pubsub************************
The Pub/Sub topic has been set up for this example: projects/apache-beam-testing/topics/testpipeline-jenkins-1230064648-2e18f307
The Pub/Sub subscription has been set up for this example: projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1230064648-2e18f307
******************Set Up Big Query Table*******************
The BigQuery table has been set up for this example: apache-beam-testing:traffic_max_lane_flow_1609310807877.traffic_max_lane_flow_table
*************************Tear Down*************************
The Pub/Sub topic has been deleted: projects/apache-beam-testing/topics/testpipeline-jenkins-1230064648-2e18f307
The Pub/Sub subscription has been deleted: projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1230064648-2e18f307
The BigQuery table might contain the example's output, and it is not deleted automatically: apache-beam-testing:traffic_max_lane_flow_1609310807877.traffic_max_lane_flow_table
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************

> Task :sdks:java:io:hadoop-format:hadoopVersionsTest
> Task :javaHadoopVersionsTest
> Task :sdks:java:extensions:zetasketch:postCommit

> Task :runners:google-cloud-dataflow-java:examplesJavaLegacyWorkerIntegrationTest
***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted: projects/apache-beam-testing/topics/testpipeline-jenkins-1230065632-ea84a0e0
The Pub/Sub subscription has been deleted: projects/apache-beam-testing/subscriptions/testpipeline-jenkins-1230065632-ea84a0e0
The BigQuery table might contain the example's output, and it is not deleted automatically: apache-beam-testing:beam_examples.testpipeline_jenkins_1230065632_ea84a0e0
Please go to the Developers Console to delete it manually. Otherwise, you may be charged for its usage.
***********************************************************
***********************************************************

> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerKmsIntegrationTest
> Task :runners:google-cloud-dataflow-java:postCommit

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PostCommit_Java/ws/src/sdks/java/io/google-cloud-platform/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 52m 11s

220 actionable tasks: 206 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/yphjla5tnmory

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
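
For triaging the FhirIOSearchIT OutOfMemoryError reported above, a minimal sketch of how the failing suite might be rerun with the extra logging Gradle suggests; the --tests filter and the assumption that GCP credentials and the usual integration-test pipeline options are already configured are not confirmed by this log:

    # Hypothetical local rerun of only the failing suite, with stack traces and verbose output
    ./gradlew :sdks:java:io:google-cloud-platform:integrationTest \
        --tests org.apache.beam.sdk.io.gcp.healthcare.FhirIOSearchIT \
        --stacktrace --info

Since the failure surfaces as java.lang.OutOfMemoryError inside the test JVM, the rerun may also need a larger test heap than whatever this CI job was using.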
