See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/513/display/redirect?page=changes>
Changes:

[noreply] Bump loader-utils
[chamikaramj] Updates Multi-lang Java quickstart
[bulat.safiullin] [Website] change headers size from h4,h3 to h2 #24082
[Kenneth Knowles] Fix checkArgument format string in AvroIO
[Kenneth Knowles] Fix checkArgument format in GcsPath
[Kenneth Knowles] Remove extraneous jetbrains annotation
[bulat.safiullin] [Website] update pre tag copy link styles #23064
[noreply] Bump golang.org/x/net from 0.1.0 to 0.2.0 in /sdks (#24153)
[noreply] Make MonotonicWatermarkEstimator work like its Java SDK equivalent
[noreply] Test Dataproc 2.1 with Flink load tests (#24129)
[noreply] Change DataflowBatchWorkerHarness doWork error level to INFO (#24135)
[noreply] Bump github.com/aws/aws-sdk-go-v2/config from 1.17.10 to 1.18.0 in /sdks
[noreply] [Tour Of Beam] verify that unit exists when saving progress (#24118)
[noreply] Cleanup stale BQ datasets (#24158)
[noreply] Support SqlTypes Date and Timestamp (MicrosInstant) in AvroUtils
[noreply] Add more tests for S3 filesystem (#24138)
[noreply] Merge pull request #23333: Track time on Cloud Dataflow streaming data
[Robert Bradshaw] Rename the test_splits flag to direct_test_splits.
[noreply] Adding a quickstart to README for the TS SDK (#23509)
[noreply] More dataset templates to clean up (#24162)
[noreply] Implement embedded WebAssembly example (#24081)
[noreply] [Dockerized Jenkins] Update README how to use local repo (#24055)
[noreply] [Dockerized Jenkins] Fix build of dockerized jenkins (fixes #24053)
[noreply] Bump github.com/aws/aws-sdk-go-v2/feature/s3/manager in /sdks (#24131)

------------------------------------------
[...truncated 57.02 KB...]
> Task :sdks:java:container:goPrepare UP-TO-DATE
> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses
> Task :sdks:java:container:goBuild
/home/jenkins/go/bin/go1.18.1 build -o ./build/target/linux_amd64/boot boot.go boot_test.go
> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency for more details about this problem.
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
1e79c241b64e: Preparing
2e6b8d02cb07: Preparing
4e1c8d34ab49: Preparing
4d106264395d: Preparing
1d05b53780cf: Preparing
4ab11f5be493: Preparing
79cef213581d: Preparing
60682de2ff6e: Preparing
c9ebc4a93797: Preparing
7948f6f2c971: Preparing
0c3e9d1230bf: Preparing
1a637508ea3c: Preparing
ca01f72cda15: Preparing
871276377415: Preparing
4146a0af6d6f: Preparing
4ab11f5be493: Waiting
60682de2ff6e: Waiting
7b7f3078e1db: Preparing
826c3ddbb29c: Preparing
b626401ef603: Preparing
9b55156abf26: Preparing
826c3ddbb29c: Waiting
7948f6f2c971: Waiting
293d5db30c9f: Preparing
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
9b55156abf26: Waiting
03127cdb479b: Waiting
293d5db30c9f: Waiting
7b7f3078e1db: Waiting
ca01f72cda15: Waiting
871276377415: Waiting
0c3e9d1230bf: Waiting
79cef213581d: Waiting
1a637508ea3c: Waiting
2e6b8d02cb07: Pushed
4e1c8d34ab49: Pushed
4d106264395d: Pushed
1d05b53780cf: Pushed
1e79c241b64e: Pushed
79cef213581d: Pushed
60682de2ff6e: Pushed
7948f6f2c971: Pushed
1a637508ea3c: Pushed
0c3e9d1230bf: Pushed
4ab11f5be493: Pushed
7b7f3078e1db: Layer already exists
826c3ddbb29c: Layer already exists
b626401ef603: Layer already exists
9b55156abf26: Layer already exists
293d5db30c9f: Layer already exists
03127cdb479b: Layer already exists
9c742cd6c7a5: Layer already exists
871276377415: Pushed
4146a0af6d6f: Pushed
c9ebc4a93797: Pushed
ca01f72cda15: Pushed
20221115123735: digest: sha256:f6f12fd7e106c99536dd2c32d7a475121c05685c0ce69e373916fbe5473eedd8 size: 4935
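The deprecation notice above comes with its own migration path. A minimal sketch using the commands from the notice itself (`gcr.io/project-id/my-image` is the notice's placeholder, not an image from this build); the `gcloud` call is guarded and made non-interactive so the script is safe to run where `gcloud` is absent:

```shell
#!/bin/sh
# Configure docker to use gcloud as a credential helper, as the notice
# recommends; afterwards plain `docker pull`/`docker push` work against GCR.
if command -v gcloud >/dev/null 2>&1; then
  gcloud auth configure-docker --quiet
  # docker pull gcr.io/project-id/my-image   # placeholder image from the notice
fi
STATUS="docker configured for gcr.io registries"
echo "$STATUS"
```

This replaces the deprecated `gcloud docker -- pull ...` wrapper entirely, rather than merely silencing its warning with `--verbosity=error`.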
> Task :sdks:java:testing:load-tests:run
Nov 15, 2022 12:40:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create INFO: No stagingLocation provided, falling back to gcpTempLocation
Nov 15, 2022 12:40:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 229 files. Enable logging at DEBUG level to see which files will be staged.
Nov 15, 2022 12:40:59 PM org.apache.beam.sdk.Pipeline validate WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Nov 15, 2022 12:40:59 PM org.apache.beam.runners.dataflow.DataflowRunner run INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Nov 15, 2022 12:41:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements INFO: Uploading 229 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Nov 15, 2022 12:41:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements INFO: Staging files complete: 229 files cached, 0 files newly uploaded in 0 seconds
Nov 15, 2022 12:41:02 PM org.apache.beam.runners.dataflow.DataflowRunner run INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Nov 15, 2022 12:41:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage INFO: Uploading <117744 bytes, hash 3e5a7af6954f0a54ccd686dbd3d67bba94499ce8588196333113bec5debe3813> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Plp69pVPClTM1obb09Z7upRJnOhYgZYzMRO-xd6-OBM.pb
Nov 15, 2022 12:41:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Nov 15, 2022 12:41:04 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Read input/StripIds as step s2
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding ParDo(TimeMonitor) as step s3
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding ParDo(ByteMonitor) as step s4
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 0 as step s5
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 1 as step s6
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 2 as step s7
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 3 as step s8
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 4 as step s9
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 5 as step s10
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 6 as step s11
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 7 as step s12
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 8 as step s13
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding Step: 9 as step s14
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep INFO: Adding ParDo(TimeMonitor)2 as step s15
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowRunner run INFO: Dataflow SDK version: 2.44.0-SNAPSHOT
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowRunner run INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-15_04_41_05-6487637338997428606?project=apache-beam-testing
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowRunner run INFO: Submitted job: 2022-11-15_04_41_05-6487637338997428606
Nov 15, 2022 12:41:05 PM org.apache.beam.runners.dataflow.DataflowRunner run INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-11-15_04_41_05-6487637338997428606
Nov 15, 2022 12:41:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process WARNING: 2022-11-15T12:41:11.954Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo01-jenkins-11-iglh. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Nov 15, 2022 12:41:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:23.743Z: Worker configuration: e2-standard-2 in us-central1-b.
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.152Z: Expanding SplittableParDo operations into optimizable parts.
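The submission log above prints everything needed to manage the job from a terminal. A small sketch that reconstructs the suggested cancel invocation from the job coordinates in the log (the variable names are ours, not Beam's or gcloud's):

```shell
#!/bin/sh
# Job coordinates copied from the submission log above.
PROJECT="apache-beam-testing"
REGION="us-central1"
JOB_ID="2022-11-15_04_41_05-6487637338997428606"

# The exact cancel invocation the runner suggests; echoed here rather than
# executed, since running it requires credentials for the project.
CANCEL_CMD="gcloud dataflow jobs --project=${PROJECT} cancel --region=${REGION} ${JOB_ID}"
echo "$CANCEL_CMD"
```

With the same project, region, and job ID, `gcloud dataflow jobs describe` should report the job's state without cancelling it.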
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.681Z: Expanding CollectionToSingleton operations into optimizable parts.
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.752Z: Expanding CoGroupByKey operations into optimizable parts.
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.791Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.823Z: Expanding GroupByKey operations into streaming Read/Write steps
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.847Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.907Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.946Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.971Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:30.993Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.015Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.041Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.067Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.093Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.118Z: Fusing consumer Step: 1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.143Z: Fusing consumer Step: 2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.175Z: Fusing consumer Step: 3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.208Z: Fusing consumer Step: 4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.238Z: Fusing consumer Step: 5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.265Z: Fusing consumer Step: 6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.303Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.329Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.354Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.376Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Nov 15, 2022 12:41:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:31.497Z: Running job using Streaming Engine
Nov 15, 2022 12:41:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:32.719Z: Starting 5 workers in us-central1-b...
Nov 15, 2022 12:41:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:33.971Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 15, 2022 12:42:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:42:17.257Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.

> Task :sdks:java:testing:load-tests:run FAILED

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=9b1980e6-a98a-459e-a39d-2117d36bcc4b, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 383907
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-383907.out.log
----- Last 20 lines from daemon log file - daemon-383907.out.log -----
Nov 15, 2022 12:41:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:41:33.971Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Nov 15, 2022 12:42:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process INFO: 2022-11-15T12:42:17.257Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
Remove shutdown hook failed
java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
	at java.lang.Runtime.removeShutdownHook(Runtime.java:231)
	at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:38)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
	at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:366)
	at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:108)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:84)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
----- End of the daemon log -----

FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
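For anyone triaging this failure, the summary above already names the two starting points: the daemon log and a re-run with more diagnostics. A minimal sketch combining them (the daemon log path is the one printed above; the fallback message is ours, so the sketch degrades gracefully when run outside this Jenkins worker):

```shell
#!/bin/sh
# Inspect the Gradle daemon log named in the failure output; DAEMON_LOG
# defaults to the path from this build but can be overridden.
DAEMON_LOG="${DAEMON_LOG:-/home/jenkins/.gradle/daemon/7.5.1/daemon-383907.out.log}"
if [ -f "$DAEMON_LOG" ]; then
  RESULT=$(tail -n 20 "$DAEMON_LOG")
else
  RESULT="daemon log not found: $DAEMON_LOG"
fi
echo "$RESULT"

# Re-run the failed task with more output, per Gradle's own suggestions:
#   ./gradlew :sdks:java:testing:load-tests:run --stacktrace
#   ./gradlew :sdks:java:testing:load-tests:run --info
```

A daemon that "disappeared unexpectedly" on a CI worker is often the OS OOM killer; `dmesg` or the worker's system log on the Jenkins host would confirm or rule that out.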
