See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1878/display/redirect?page=changes>
Changes:

[huangry] Update worker container version to most recent release.

------------------------------------------
[...truncated 54.30 MB...]
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.18/e4a441249ade301985cb8d009d4e4a72b85bf68e/snakeyaml-1.18.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/snakeyaml-1.18-R-oOCdgXK4J6PV-saHyToQ.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.1.2/c4ca32c2be1b22a5553dd3171f51f9b2b04030b/antlr-runtime-3.1.2.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/antlr-runtime-3.1.2-dpdEvr2kPK4dWHj9fUOv1A.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/ognl/ognl/3.1.12/a7fa0db32f882cd3bb41ec6c489853b3bfb6aebc/ognl-3.1.12.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/ognl-3.1.12-aq6HFAi1aHiNX6qcImOsmw.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-tcnative-boringssl-static/2.0.8.Final/5c3483dfa33cd04f5469c95abf67e1b69a8f1221/netty-tcnative-boringssl-static-2.0.8.Final.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/netty-tcnative-boringssl-static-2.0.8.Final-RCm0wU8kBdzNqqi47Z837A.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.beanshell/bsh/2.0b4/a05f0a0feefa8d8467ac80e16e7de071489f0d9c/bsh-2.0b4.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/bsh-2.0b4-ocYKqDycmmyyORwcG4XrAA.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-common/4.1.25.Final/e17d5c05c101fe14536ce3fb34b36c54e04791f6/netty-common-4.1.25.Final.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/netty-common-4.1.25.Final-cYwMY1_F2pzboFUnBL3CxQ.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT.jar> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-lrT1R8SMeyI_cgvx4NWfww.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testng/testng/6.8.21/15e02d8d7be3c3640b585b97eda56026fdb5bf4d/testng-6.8.21.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/testng-6.8.21-4nE6Pd5l58HFjEGfLzL88Q.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/vendor/java-grpc-v1/build/libs/beam-vendor-java-grpc-v1-2.9.0-SNAPSHOT.jar> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/beam-vendor-java-grpc-v1-2.9.0-SNAPSHOT-HjrNvMjG9CsQJ1maWDon0w.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-netty-shaded/1.13.1/ccdc4f2c2791d93164c574fbfb90d614aa0849ae/grpc-netty-shaded-1.13.1.jar to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/grpc-netty-shaded-1.13.1-YS-LK7_gIZl0B4Kw_7Rw7A.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.9.0-SNAPSHOT.jar> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/beam-runners-direct-java-2.9.0-SNAPSHOT-vg9cZryWT8X9iF3ZS4VmSQ.jar
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 0 files cached, 161 files newly uploaded
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding GenerateSequence/Read(BoundedCountingSource) as step s1
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(CreateEntity) as step s2
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DatastoreV1.Write/Convert to Mutation/Map as step s3
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding DatastoreV1.Write/Write Mutation to Datastore as step s4
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <8720 bytes, hash gSstDAFZbLY-R3VlUOr73Q> to gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/pipeline-gSstDAFZbLY-R3VlUOr73Q.pb

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_OUT
Dataflow SDK version: 2.9.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Nov 13, 2018 2:01:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_18_01_16-16145927481795306072?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_OUT
Submitted job: 2018-11-12_18_01_16-16145927481795306072

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Nov 13, 2018 2:01:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-11-12_18_01_16-16145927481795306072
Nov 13, 2018 2:01:18 AM org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2018-11-12_18_01_16-16145927481795306072 with 0 expected assertions.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:17.029Z: Autoscaling is enabled for job 2018-11-12_18_01_16-16145927481795306072. The number of workers will be between 1 and 1000.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:17.225Z: Autoscaling was automatically enabled for job 2018-11-12_18_01_16-16145927481795306072.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:20.390Z: Checking permissions granted to controller Service Account.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:26.771Z: Worker configuration: n1-standard-1 in us-central1-b.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.420Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.479Z: Expanding GroupByKey operations into optimizable parts.
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.538Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.680Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.742Z: Fusing consumer ParDo(CreateEntity) into GenerateSequence/Read(BoundedCountingSource)
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.794Z: Fusing consumer DatastoreV1.Write/Write Mutation to Datastore into DatastoreV1.Write/Convert to Mutation/Map
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:27.858Z: Fusing consumer DatastoreV1.Write/Convert to Mutation/Map into ParDo(CreateEntity)
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:28.643Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(CreateEntity)+DatastoreV1.Write/Convert to Mutation/Map+DatastoreV1.Write/Write Mutation to Datastore
Nov 13, 2018 2:01:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:28.765Z: Starting 1 workers in us-central1-b...

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb STANDARD_ERROR
Nov 13, 2018 2:01:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:40.104Z: Autoscaling: Resized worker pool from 1 to 0.
Nov 13, 2018 2:01:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:40.151Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Nov 13, 2018 2:01:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:40.224Z: Worker pool stopped.

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Nov 13, 2018 2:01:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:01:40.590Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).

org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb STANDARD_ERROR
Nov 13, 2018 2:01:47 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-11-12_17_56_50-3750356465867903583 finished with status DONE.
Nov 13, 2018 2:01:47 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
INFO: Success result for Dataflow job 2018-11-12_17_56_50-3750356465867903583. Found 1 success, 0 failures out of 1 expected assertions.

Gradle Test Executor 162 finished executing tests.
> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
Nov 13, 2018 2:02:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:02:14.194Z: Workers have started successfully.
Nov 13, 2018 2:02:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:02:14.728Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
Nov 13, 2018 2:02:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:02:34.392Z: Cleaning up.
Nov 13, 2018 2:02:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:02:34.488Z: Stopping worker pool...
Nov 13, 2018 2:04:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:04:34.793Z: Autoscaling: Resized worker pool from 1 to 0.
Nov 13, 2018 2:04:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:04:34.826Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Nov 13, 2018 2:04:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:04:34.870Z: Worker pool stopped.
Nov 13, 2018 2:04:41 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-11-12_18_01_16-16145927481795306072 finished with status DONE.
Nov 13, 2018 2:04:41 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
INFO: Success result for Dataflow job 2018-11-12_18_01_16-16145927481795306072. Found 0 success, 0 failures out of 0 expected assertions.
Nov 13, 2018 2:04:43 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Nov 13, 2018 2:04:43 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Nov 13, 2018 2:04:44 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Writing batch of 500 entities
Nov 13, 2018 2:04:44 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
INFO: Successfully wrote 500 entities
Nov 13, 2018 2:04:44 AM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil deleteAllEntities
INFO: Successfully deleted 1000 entities

Gradle Test Executor 160 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest FAILED

org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testNewTypesQueryWithoutReshuffleWithCustom SKIPPED

:beam-runners-google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 21 mins 6.746 secs.

FAILURE: Build completed with 8 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-examples-java:directRunnerPreCommit'.
> Process 'Gradle Test Executor 78' finished with non-zero exit value 1
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-hbase:test'.
> Process 'Gradle Test Executor 107' finished with non-zero exit value 1
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-6:test'.
> Process 'Gradle Test Executor 103' finished with non-zero exit value 1
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-2:test'.
> Process 'Gradle Test Executor 98' finished with non-zero exit value 1
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-flink_2.11:test'.
> Process 'Gradle Test Executor 99' finished with non-zero exit value 134
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

6: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-hadoop-file-system:test'.
> Process 'Gradle Test Executor 102' finished with non-zero exit value 1
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

7: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-5:test'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/elasticsearch-tests/elasticsearch-tests-5/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

8: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformLegacyWorkerIntegrationTest'.
> Process 'Gradle Test Executor 159' finished with non-zero exit value 1
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the user guide at https://docs.gradle.org/4.10.2/userguide/java_plugin.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 18s

905 actionable tasks: 899 executed, 6 from cache

Publishing build scan...
https://gradle.com/s/6opqg7hza4cli

Closing Git repo: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/.git>
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org