See <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17/181/display/redirect?page=changes>
Changes:

[Pablo Estrada] Removing playground from main page to remove scrolling issue

[noreply] Merge pull request #21940 from [21941] Fix no output timestamp case


------------------------------------------
[...truncated 224.21 KB...]
As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`.
Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
f93b2e1107d5: Preparing
0a08e168ff0c: Preparing
e8ae8b809c28: Preparing
8e66af0f96dc: Preparing
8b0d6e08109a: Preparing
6a75abdd5e94: Preparing
ed6cd242917c: Preparing
6f2ee9dd87dc: Preparing
7c129f0bb327: Preparing
622a3e12697e: Preparing
c439eefeeea7: Preparing
92b253562868: Preparing
644feabcd9e5: Preparing
809db638d899: Preparing
e0ea623edd2a: Preparing
3bc383470c05: Preparing
e93827457889: Preparing
08fa02ce37eb: Preparing
6a75abdd5e94: Waiting
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
ed6cd242917c: Waiting
6f2ee9dd87dc: Waiting
7c129f0bb327: Waiting
622a3e12697e: Waiting
c439eefeeea7: Waiting
92b253562868: Waiting
08fa02ce37eb: Waiting
a037458de4e0: Waiting
644feabcd9e5: Waiting
bafdbe68e4ae: Waiting
e93827457889: Waiting
3bc383470c05: Waiting
809db638d899: Waiting
8b0d6e08109a: Pushed
8e66af0f96dc: Pushed
e8ae8b809c28: Pushed
0a08e168ff0c: Pushed
f93b2e1107d5: Pushed
ed6cd242917c: Pushed
6f2ee9dd87dc: Pushed
622a3e12697e: Pushed
7c129f0bb327: Pushed
6a75abdd5e94: Pushed
92b253562868: Pushed
3bc383470c05: Layer already exists
e93827457889: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
c439eefeeea7: Pushed
e0ea623edd2a: Pushed
809db638d899: Pushed
644feabcd9e5: Pushed
20220619143306: digest: sha256:1dfb3dcfa354c7f971c4dd7d1c8225975103b2d39bedffefb160fa6a289d290d size: 4729

> Task :sdks:java:testing:load-tests:run
Jun 19, 2022 2:33:36 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jun 19, 2022 2:33:37 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
Jun 19, 2022 2:33:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jun 19, 2022 2:33:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
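For reference, the credential-helper flow recommended by the `gcloud docker` deprecation notice near the top of this log amounts to the two commands sketched below. `gcr.io/project-id/my-image` is the notice's generic placeholder, not the image this job actually pushes (us.gcr.io/apache-beam-testing/java-postcommit-it/java):

    # One-time setup: register gcloud as a Docker credential helper for GCR.
    gcloud auth configure-docker
    # After that, plain docker commands work against GCR registries.
    docker pull gcr.io/project-id/my-image
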
Jun 19, 2022 2:33:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
Jun 19, 2022 2:33:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jun 19, 2022 2:33:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <112190 bytes, hash c8ab176a26fb00362045c616da36c7c644a57d096b9130777ac2f529c859fb80> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yKsXaib7ADYgRcYW2jbHxkSlfQlrkTB3esL1KchZ-4A.pb
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jun 19, 2022 2:33:42 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 64 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=78125}, SyntheticUnboundedSource{startOffset=78125, endOffset=156250}, SyntheticUnboundedSource{startOffset=156250, endOffset=234375}, SyntheticUnboundedSource{startOffset=234375, endOffset=312500}, SyntheticUnboundedSource{startOffset=312500, endOffset=390625}, SyntheticUnboundedSource{startOffset=390625, endOffset=468750}, SyntheticUnboundedSource{startOffset=468750, endOffset=546875}, SyntheticUnboundedSource{startOffset=546875, endOffset=625000}, SyntheticUnboundedSource{startOffset=625000, endOffset=703125}, SyntheticUnboundedSource{startOffset=703125, endOffset=781250}, SyntheticUnboundedSource{startOffset=781250, endOffset=859375}, SyntheticUnboundedSource{startOffset=859375, endOffset=937500}, SyntheticUnboundedSource{startOffset=937500, endOffset=1015625}, SyntheticUnboundedSource{startOffset=1015625, endOffset=1093750}, SyntheticUnboundedSource{startOffset=1093750, endOffset=1171875}, SyntheticUnboundedSource{startOffset=1171875, endOffset=1250000}, SyntheticUnboundedSource{startOffset=1250000, endOffset=1328125}, SyntheticUnboundedSource{startOffset=1328125, endOffset=1406250}, SyntheticUnboundedSource{startOffset=1406250, endOffset=1484375}, SyntheticUnboundedSource{startOffset=1484375, endOffset=1562500}, SyntheticUnboundedSource{startOffset=1562500, endOffset=1640625}, SyntheticUnboundedSource{startOffset=1640625, endOffset=1718750}, SyntheticUnboundedSource{startOffset=1718750, endOffset=1796875}, SyntheticUnboundedSource{startOffset=1796875, endOffset=1875000}, SyntheticUnboundedSource{startOffset=1875000, endOffset=1953125}, SyntheticUnboundedSource{startOffset=1953125, endOffset=2031250}, SyntheticUnboundedSource{startOffset=2031250, endOffset=2109375}, SyntheticUnboundedSource{startOffset=2109375, endOffset=2187500}, SyntheticUnboundedSource{startOffset=2187500, endOffset=2265625}, SyntheticUnboundedSource{startOffset=2265625, endOffset=2343750}, SyntheticUnboundedSource{startOffset=2343750, endOffset=2421875}, SyntheticUnboundedSource{startOffset=2421875, endOffset=2500000}, SyntheticUnboundedSource{startOffset=2500000, endOffset=2578125}, SyntheticUnboundedSource{startOffset=2578125, endOffset=2656250}, SyntheticUnboundedSource{startOffset=2656250, endOffset=2734375}, SyntheticUnboundedSource{startOffset=2734375, endOffset=2812500}, SyntheticUnboundedSource{startOffset=2812500, endOffset=2890625}, SyntheticUnboundedSource{startOffset=2890625, endOffset=2968750}, SyntheticUnboundedSource{startOffset=2968750, endOffset=3046875}, SyntheticUnboundedSource{startOffset=3046875, endOffset=3125000}, SyntheticUnboundedSource{startOffset=3125000, endOffset=3203125}, SyntheticUnboundedSource{startOffset=3203125, endOffset=3281250}, SyntheticUnboundedSource{startOffset=3281250, endOffset=3359375}, SyntheticUnboundedSource{startOffset=3359375, endOffset=3437500}, SyntheticUnboundedSource{startOffset=3437500, endOffset=3515625}, SyntheticUnboundedSource{startOffset=3515625, endOffset=3593750}, SyntheticUnboundedSource{startOffset=3593750, endOffset=3671875}, SyntheticUnboundedSource{startOffset=3671875, endOffset=3750000}, SyntheticUnboundedSource{startOffset=3750000, endOffset=3828125}, SyntheticUnboundedSource{startOffset=3828125, endOffset=3906250}, SyntheticUnboundedSource{startOffset=3906250, endOffset=3984375}, SyntheticUnboundedSource{startOffset=3984375, endOffset=4062500}, SyntheticUnboundedSource{startOffset=4062500, endOffset=4140625}, SyntheticUnboundedSource{startOffset=4140625, endOffset=4218750}, SyntheticUnboundedSource{startOffset=4218750, endOffset=4296875}, SyntheticUnboundedSource{startOffset=4296875, endOffset=4375000}, SyntheticUnboundedSource{startOffset=4375000, endOffset=4453125}, SyntheticUnboundedSource{startOffset=4453125, endOffset=4531250}, SyntheticUnboundedSource{startOffset=4531250, endOffset=4609375}, SyntheticUnboundedSource{startOffset=4609375, endOffset=4687500}, SyntheticUnboundedSource{startOffset=4687500, endOffset=4765625}, SyntheticUnboundedSource{startOffset=4765625, endOffset=4843750}, SyntheticUnboundedSource{startOffset=4843750, endOffset=4921875}, SyntheticUnboundedSource{startOffset=4921875, endOffset=5000000}]
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (1) as step s9
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (1) as step s10
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (1) as step s11
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (2) as step s12
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (2) as step s13
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (2) as step s14
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (3) as step s15
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (3) as step s16
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (3) as step s17
Jun 19, 2022 2:33:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jun 19, 2022 2:33:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-06-19_07_33_42-10882084210239372597?project=apache-beam-testing
Jun 19, 2022 2:33:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-06-19_07_33_42-10882084210239372597
Jun 19, 2022 2:33:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-06-19_07_33_42-10882084210239372597
Jun 19, 2022 2:33:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-06-19T14:33:49.589Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java170dataflow0v20streaming0gbk04-jenkins-0619-vb71. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jun 19, 2022 2:33:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:56.986Z: Worker configuration: e2-standard-2 in us-central1-b.
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:57.969Z: Expanding SplittableParDo operations into optimizable parts.
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.003Z: Expanding CollectionToSingleton operations into optimizable parts.
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.070Z: Expanding CoGroupByKey operations into optimizable parts.
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.232Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.294Z: Expanding GroupByKey operations into streaming Read/Write steps
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.431Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.669Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.706Z: Fusing consumer Group by key (0)/WriteStream into Window.Into()/Window.Assign
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.725Z: Fusing consumer Group by key (1)/WriteStream into Window.Into()/Window.Assign
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.747Z: Fusing consumer Group by key (2)/WriteStream into Window.Into()/Window.Assign
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.772Z: Fusing consumer Group by key (3)/WriteStream into Window.Into()/Window.Assign
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.809Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.843Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.878Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.910Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.936Z: Fusing consumer Collect start time metrics/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.972Z: Fusing consumer Total bytes monitor/ParMultiDo(ByteMonitor) into Collect start time metrics/ParMultiDo(TimeMonitor)
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:58.997Z: Fusing consumer Window.Into()/Window.Assign into Total bytes monitor/ParMultiDo(ByteMonitor)
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.031Z: Fusing consumer Group by key (0)/MergeBuckets into Group by key (0)/ReadStream
Jun 19, 2022 2:33:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.064Z: Fusing consumer Ungroup and reiterate (0)/ParMultiDo(UngroupAndReiterate) into Group by key (0)/MergeBuckets
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.131Z: Fusing consumer Collect end time metrics (0)/ParMultiDo(TimeMonitor) into Ungroup and reiterate (0)/ParMultiDo(UngroupAndReiterate)
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.158Z: Fusing consumer Group by key (1)/MergeBuckets into Group by key (1)/ReadStream
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.194Z: Fusing consumer Ungroup and reiterate (1)/ParMultiDo(UngroupAndReiterate) into Group by key (1)/MergeBuckets
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.227Z: Fusing consumer Collect end time metrics (1)/ParMultiDo(TimeMonitor) into Ungroup and reiterate (1)/ParMultiDo(UngroupAndReiterate)
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.290Z: Fusing consumer Group by key (2)/MergeBuckets into Group by key (2)/ReadStream
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.312Z: Fusing consumer Ungroup and reiterate (2)/ParMultiDo(UngroupAndReiterate) into Group by key (2)/MergeBuckets
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.340Z: Fusing consumer Collect end time metrics (2)/ParMultiDo(TimeMonitor) into Ungroup and reiterate (2)/ParMultiDo(UngroupAndReiterate)
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.369Z: Fusing consumer Group by key (3)/MergeBuckets into Group by key (3)/ReadStream
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.392Z: Fusing consumer Ungroup and reiterate (3)/ParMultiDo(UngroupAndReiterate) into Group by key (3)/MergeBuckets
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.422Z: Fusing consumer Collect end time metrics (3)/ParMultiDo(TimeMonitor) into Ungroup and reiterate (3)/ParMultiDo(UngroupAndReiterate)
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.516Z: Running job using Streaming Engine
Jun 19, 2022 2:34:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:33:59.802Z: Starting 16 workers in us-central1-b...
Jun 19, 2022 2:34:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:34:14.407Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jun 19, 2022 2:34:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:34:36.467Z: Autoscaling: Raised the number of workers to 16 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jun 19, 2022 2:35:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-19T14:35:34.177Z: Workers have started successfully.
Jun 19, 2022 2:36:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not cancel it. To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-06-19_07_33_42-10882084210239372597
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=570d7058-ce82-498c-a630-0da2ea6c55d8, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 503657
log file: /home/jenkins/.gradle/daemon/7.4/daemon-503657.out.log
----- Last 20 lines from daemon log file - daemon-503657.out.log -----
Jun 19, 2022 2:36:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not cancel it. To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-06-19_07_33_42-10882084210239372597
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
Remove shutdown hook failed
java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
	at java.lang.Runtime.removeShutdownHook(Runtime.java:239)
	at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:38)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
	at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:365)
	at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:108)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:84)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
----- End of the daemon log -----

FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
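To reproduce the failure with the extra diagnostics suggested above, the task named earlier in this log (:sdks:java:testing:load-tests:run) can be re-invoked with those flags; the sketch below omits the job-specific project properties (pipeline options, runner settings, etc.) that the Jenkins configuration passes and which are not shown in this log:

    # Hypothetical re-run of the failing load-test task with more Gradle logging.
    # Add the job's usual -P properties before running; they are not reproduced here.
    ./gradlew :sdks:java:testing:load-tests:run --info --stacktrace
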
* Get more help at https://help.gradle.org

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
