See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/428/display/redirect?page=changes>
Changes:

[noreply] Bump cloud.google.com/go/bigquery from 1.37.0 to 1.38.0 in /sdks


------------------------------------------
[...truncated 49.33 KB...]

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
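The two warnings above mean that ':sdks:java:container:java11:copySdkHarnessLauncher' reads files that ':sdks:java:container:downloadCloudProfilerAgent' and ':sdks:java:container:pullLicenses' produce, without any declared ordering between the tasks, so Gradle falls back to unoptimized execution to stay correct. A minimal sketch of the explicit declaration the linked Gradle docs describe, written in Kotlin DSL syntax with the task paths taken from the warning; Beam's actual build scripts use the Groovy DSL and may resolve this differently, for example by declaring the consumed directory as an input of the copy task:

    // Sketch for the :sdks:java:container:java11 build script (Gradle Kotlin DSL),
    // not Beam's actual build code. dependsOn gives Gradle the missing ordering,
    // which clears the implicit_dependency warning and keeps optimizations enabled.
    tasks.named("copySdkHarnessLauncher") {
        dependsOn(":sdks:java:container:downloadCloudProfilerAgent")
        dependsOn(":sdks:java:container:pullLicenses")
    }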
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
ea723c0c98c2: Preparing
bb3641e438ed: Preparing
fba71bccc884: Preparing
886e88580a99: Preparing
8fb17467a6da: Preparing
02b1ffe41213: Preparing
9eee5c54334d: Preparing
2176f7007cd6: Preparing
78d7062e71d0: Preparing
a6537206f695: Preparing
d43b8b3f63a4: Preparing
604cde58fc0b: Preparing
3a3dc3ac86de: Preparing
cf5737ae2990: Preparing
02b1ffe41213: Waiting
97916671a892: Preparing
7b7f3078e1db: Preparing
d43b8b3f63a4: Waiting
826c3ddbb29c: Preparing
b626401ef603: Preparing
78d7062e71d0: Waiting
9b55156abf26: Preparing
293d5db30c9f: Preparing
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
a6537206f695: Waiting
b626401ef603: Waiting
7b7f3078e1db: Waiting
3a3dc3ac86de: Waiting
826c3ddbb29c: Waiting
9b55156abf26: Waiting
cf5737ae2990: Waiting
03127cdb479b: Waiting
9c742cd6c7a5: Waiting
2176f7007cd6: Waiting
293d5db30c9f: Waiting
604cde58fc0b: Waiting
9eee5c54334d: Waiting
fba71bccc884: Pushed
bb3641e438ed: Pushed
8fb17467a6da: Pushed
886e88580a99: Pushed
2176f7007cd6: Pushed
ea723c0c98c2: Pushed
9eee5c54334d: Pushed
a6537206f695: Pushed
604cde58fc0b: Pushed
d43b8b3f63a4: Pushed
78d7062e71d0: Pushed
7b7f3078e1db: Layer already exists
826c3ddbb29c: Layer already exists
b626401ef603: Layer already exists
9b55156abf26: Layer already exists
02b1ffe41213: Pushed
293d5db30c9f: Layer already exists
9c742cd6c7a5: Layer already exists
03127cdb479b: Layer already exists
cf5737ae2990: Pushed
97916671a892: Pushed
3a3dc3ac86de: Pushed
20220822123736: digest: sha256:9be97a8624d6e39d6ca2ba233d6e51db2cb698ded6c955927b1347e7ce1580eb size: 4935

> Task :sdks:java:testing:load-tests:run
Aug 22, 2022 12:39:13 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Aug 22, 2022 12:39:14 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
Aug 22, 2022 12:39:15 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Aug 22, 2022 12:39:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Aug 22, 2022 12:39:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Aug 22, 2022 12:39:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
Aug 22, 2022 12:39:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 22, 2022 12:39:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115399 bytes, hash c21f4924d86116c7af9263f9e340a6914642742ddede1530b3050608cdfe3d70> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wh9JJNhhFsevkmP540CmkUZCdC3e3hUwswUGCM3-PXA.pb
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Aug 22, 2022 12:39:21 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
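Steps s1 through s15 above describe a linear pipeline: a synthetic unbounded read, a TimeMonitor and a ByteMonitor ParDo, ten chained "Step: N" ParDos (the CounterOperation transforms), and a closing TimeMonitor. A rough Kotlin sketch of that shape, using a placeholder pass-through DoFn and Create.of in place of the real SyntheticUnboundedSource and monitor/counter DoFns from the Beam load-test suite; it is illustrative, not the actual load-test code:

    import org.apache.beam.sdk.Pipeline
    import org.apache.beam.sdk.coders.ByteArrayCoder
    import org.apache.beam.sdk.options.PipelineOptionsFactory
    import org.apache.beam.sdk.transforms.Create
    import org.apache.beam.sdk.transforms.DoFn
    import org.apache.beam.sdk.transforms.DoFn.ProcessElement
    import org.apache.beam.sdk.transforms.ParDo
    import org.apache.beam.sdk.values.PCollection

    // Placeholder pass-through DoFn standing in for the TimeMonitor, ByteMonitor and
    // CounterOperation DoFns named in the step log; the real ones record metrics.
    class PassThroughFn : DoFn<ByteArray, ByteArray>() {
        @ProcessElement
        fun process(ctx: ProcessContext) {
            ctx.output(ctx.element())
        }
    }

    fun main(args: Array<String>) {
        val options = PipelineOptionsFactory.fromArgs(*args).withValidation().create()
        val pipeline = Pipeline.create(options)

        // "Read input" is a SyntheticUnboundedSource in the real test; Create.of is a stand-in.
        var records: PCollection<ByteArray> = pipeline
            .apply("Read input", Create.of(listOf(ByteArray(16))).withCoder(ByteArrayCoder.of()))
            .apply("ParDo(TimeMonitor)", ParDo.of(PassThroughFn()))
            .apply("ParDo(ByteMonitor)", ParDo.of(PassThroughFn()))

        // Ten chained ParDos, matching steps s5..s14 ("Step: 0" .. "Step: 9") in the log.
        for (i in 0 until 10) {
            records = records.apply("Step: $i", ParDo.of(PassThroughFn()))
        }
        records.apply("ParDo(TimeMonitor)2", ParDo.of(PassThroughFn()))

        pipeline.run().waitUntilFinish()
    }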
Aug 22, 2022 12:39:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.42.0-SNAPSHOT
Aug 22, 2022 12:39:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-08-22_05_39_21-7209788374762571218?project=apache-beam-testing
Aug 22, 2022 12:39:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-08-22_05_39_21-7209788374762571218
Aug 22, 2022 12:39:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-08-22_05_39_21-7209788374762571218
Aug 22, 2022 12:39:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-08-22T12:39:25.331Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo01-jenkins-08-okh0. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Aug 22, 2022 12:39:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:30.653Z: Worker configuration: e2-standard-2 in us-central1-a.
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.485Z: Expanding SplittableParDo operations into optimizable parts.
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.545Z: Expanding CollectionToSingleton operations into optimizable parts.
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.601Z: Expanding CoGroupByKey operations into optimizable parts.
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.644Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.674Z: Expanding GroupByKey operations into streaming Read/Write steps
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.700Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.888Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.931Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.964Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:31.987Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.018Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.051Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.075Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.100Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.123Z: Fusing consumer Step: 1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.149Z: Fusing consumer Step: 2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.170Z: Fusing consumer Step: 3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.203Z: Fusing consumer Step: 4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.227Z: Fusing consumer Step: 5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.262Z: Fusing consumer Step: 6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.337Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.361Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.390Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.416Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.514Z: Running job using Streaming Engine
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.770Z: Starting 5 ****s in us-central1-a...
Aug 22, 2022 12:39:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:43.050Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 22, 2022 12:40:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:40:10.440Z: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Aug 22, 2022 12:40:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:40:10.471Z: Resized **** pool to 3, though goal was 5. This could be a quota issue.
Aug 22, 2022 12:40:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:40:20.695Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.

> Task :sdks:java:testing:load-tests:run FAILED

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=b7133293-829c-4f7f-9bfc-e75989b33727, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 29574
  log file: /home/jenkins/.gradle/daemon/7.4/daemon-29574.out.log
----- Last 20 lines from daemon log file - daemon-29574.out.log -----
INFO: 2022-08-22T12:39:32.337Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.361Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.390Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.416Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.514Z: Running job using Streaming Engine
Aug 22, 2022 12:39:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:32.770Z: Starting 5 ****s in us-central1-a...
Aug 22, 2022 12:39:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:39:43.050Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 22, 2022 12:40:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:40:10.440Z: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
Aug 22, 2022 12:40:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:40:10.471Z: Resized **** pool to 3, though goal was 5. This could be a quota issue.
Aug 22, 2022 12:40:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-22T12:40:20.695Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
