See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java17/476/display/redirect>
Changes:
------------------------------------------
[...truncated 126.75 KB...]
#19 [15/16] RUN if [ "true" = "false" ] ; then rm -rf /opt/apache/beam/third_party_licenses ; fi
#19 DONE 0.4s

#20 [16/16] COPY target/profiler/* /opt/google_cloud_profiler/
#20 DONE 0.1s

#21 exporting to image
#21 exporting layers
#21 exporting layers 0.2s done
#21 writing image sha256:667c8638632f576bfba1a8070fbbaf529c945f679c13edf8cbca7893721cfebe done
#21 naming to docker.io/apache/beam_java17_sdk:2.48.0.dev done
#21 DONE 0.2s

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
29f948bbcaa4: Preparing
5f70bf18a086: Preparing
685926e51f07: Preparing
19545a83c610: Preparing
f1a7f57bc51b: Preparing
a04e95b7f001: Preparing
3845995e0a02: Preparing
f0ea907fa32b: Preparing
e8c07f5ccbe2: Preparing
1ff2491178fb: Preparing
a72e01b65c43: Preparing
9c3c85a3ae01: Preparing
94aadcdf3a57: Preparing
1ca5efe25a90: Preparing
9f165f2a94d2: Preparing
47a0b225116f: Preparing
c6813289044b: Preparing
f3a12c51479f: Preparing
b93c1bd012ab: Preparing
f0ea907fa32b: Waiting
1ca5efe25a90: Waiting
9f165f2a94d2: Waiting
9c3c85a3ae01: Waiting
a72e01b65c43: Waiting
47a0b225116f: Waiting
94aadcdf3a57: Waiting
c6813289044b: Waiting
e8c07f5ccbe2: Waiting
1ff2491178fb: Waiting
f3a12c51479f: Waiting
b93c1bd012ab: Waiting
3845995e0a02: Waiting
a04e95b7f001: Waiting
5f70bf18a086: Layer already exists
a04e95b7f001: Pushed
19545a83c610: Pushed
685926e51f07: Pushed
f1a7f57bc51b: Pushed
29f948bbcaa4: Pushed
3845995e0a02: Pushed
e8c07f5ccbe2: Pushed
a72e01b65c43: Pushed
1ff2491178fb: Pushed
9c3c85a3ae01: Pushed
47a0b225116f: Layer already exists
c6813289044b: Layer already exists
f3a12c51479f: Layer already exists
94aadcdf3a57: Pushed
b93c1bd012ab: Layer already exists
f0ea907fa32b: Pushed
1ca5efe25a90: Pushed
9f165f2a94d2: Pushed
20230412132247: digest: sha256:f03416b4841eef7087a5a32ad47ad70a82878f26d408e5c515e922cd7a5707c8 size: 4297

> Task :sdks:java:testing:load-tests:run
Apr 12, 2023 1:23:13 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 12, 2023 1:23:14 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 230 files. Enable logging at DEBUG level to see which files will be staged.
Apr 12, 2023 1:23:15 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Apr 12, 2023 1:23:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
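The WARNING above about transforms without stable unique names is emitted by Pipeline.validate when a transform is applied without an explicit name, so Beam falls back to an auto-generated one (here "Window.Into()" and, later in the log, "Window.Into()2"). The usual fix is to pass a name to the two-argument apply. A minimal sketch follows; the KV<byte[], byte[]> element type, the 60-second fixed window, and the step names are illustrative assumptions, not taken from this job's code:

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class NamedWindows {
      // Applying Window with an explicit step name avoids the
      // "transforms do not have stable unique names" warning and keeps
      // step names stable across submissions of the same pipeline.
      static PCollection<KV<byte[], byte[]>> window(
          String name, PCollection<KV<byte[], byte[]>> input) {
        // Window size is an assumed example value (60 seconds).
        return input.apply(
            name, Window.<KV<byte[], byte[]>>into(FixedWindows.of(Duration.standardSeconds(60))));
      }
    }

With distinct names such as "Window input" and "Window co-input", the two windowing steps no longer collide and the warning disappears.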
Apr 12, 2023 1:23:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 230 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 12, 2023 1:23:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 230 files cached, 0 files newly uploaded in 0 seconds
Apr 12, 2023 1:23:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 12, 2023 1:23:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <123104 bytes, hash a1a4024a0b3077fc5fb77cad8a29bfd87b74eca8c4aead6b2866692c3cf0991a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oaQCSgswd_xft3ytiim_2Ht07KjErq1rKGZpLDzwmRo.pb
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Apr 12, 2023 1:23:19 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Apr 12, 2023 1:23:19 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=100000}, SyntheticUnboundedSource{startOffset=100000, endOffset=200000}, SyntheticUnboundedSource{startOffset=200000, endOffset=300000}, SyntheticUnboundedSource{startOffset=300000, endOffset=400000}, SyntheticUnboundedSource{startOffset=400000, endOffset=500000}, SyntheticUnboundedSource{startOffset=500000, endOffset=600000}, SyntheticUnboundedSource{startOffset=600000, endOffset=700000}, SyntheticUnboundedSource{startOffset=700000, endOffset=800000}, SyntheticUnboundedSource{startOffset=800000, endOffset=900000}, SyntheticUnboundedSource{startOffset=900000, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=1100000}, SyntheticUnboundedSource{startOffset=1100000, endOffset=1200000}, SyntheticUnboundedSource{startOffset=1200000, endOffset=1300000}, SyntheticUnboundedSource{startOffset=1300000, endOffset=1400000}, SyntheticUnboundedSource{startOffset=1400000, endOffset=1500000}, SyntheticUnboundedSource{startOffset=1500000, endOffset=1600000}, SyntheticUnboundedSource{startOffset=1600000, endOffset=1700000}, SyntheticUnboundedSource{startOffset=1700000, endOffset=1800000}, SyntheticUnboundedSource{startOffset=1800000, endOffset=1900000}, SyntheticUnboundedSource{startOffset=1900000, endOffset=2000000}]
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Apr 12, 2023 1:23:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.48.0-SNAPSHOT
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-12_06_23_20-11247026710369581374?project=apache-beam-testing
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2023-04-12_06_23_20-11247026710369581374
Apr 12, 2023 1:23:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
> --region=us-central1 2023-04-12_06_23_20-11247026710369581374
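Steps s9 through s13 above are the Dataflow translation of a single CoGroupByKey: each tagged input becomes a union table (MakeUnionTable0/1), the tables are flattened, grouped (GBK), and reassembled into CoGbkResult values (ConstructCoGbkResultFn). A minimal sketch of Beam Java code that produces this shape follows; the byte[] key/value types, tag names, and step name are illustrative assumptions rather than the load test's actual code:

    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TupleTag;

    public class CoGbkShape {
      // Tags identify which original input each value came from in the CoGbkResult.
      static final TupleTag<byte[]> INPUT_TAG = new TupleTag<byte[]>() {};
      static final TupleTag<byte[]> CO_INPUT_TAG = new TupleTag<byte[]>() {};

      static PCollection<KV<byte[], CoGbkResult>> coGroup(
          PCollection<KV<byte[], byte[]>> input, PCollection<KV<byte[], byte[]>> coInput) {
        // On Dataflow this single transform expands into the MakeUnionTable0/1,
        // Flatten, GBK and ConstructCoGbkResultFn steps seen as s9-s13 in the log.
        return KeyedPCollectionTuple.of(INPUT_TAG, input)
            .and(CO_INPUT_TAG, coInput)
            .apply("CoGroupByKey", CoGroupByKey.<byte[]>create());
      }
    }

Downstream code then reads each side out of the CoGbkResult by tag, which is what the "Ungroup and reiterate" step (s14) does in this test.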
Apr 12, 2023 1:23:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2023-04-12T13:23:26.175Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java170dataflow0v20streaming0cogbk02-jenkins-04-le69. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Apr 12, 2023 1:23:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:33.806Z: Worker configuration: e2-standard-2 in us-central1-b.
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:34.938Z: Expanding SplittableParDo operations into optimizable parts.
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:34.957Z: Expanding CollectionToSingleton operations into optimizable parts.
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.018Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.087Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.123Z: Expanding GroupByKey operations into streaming Read/Write steps
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.187Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.293Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.324Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.347Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.372Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.393Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.420Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.455Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.487Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.516Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.542Z: Fusing consumer Window.Into()/Window.Assign into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.576Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into Window.Into()/Window.Assign
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.600Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read co-input/Impulse
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.632Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.657Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Apr 12, 2023 1:23:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.679Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.713Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor) into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.757Z: Fusing consumer Window.Into()2/Window.Assign into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.790Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into Window.Into()2/Window.Assign
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.828Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.850Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into CoGroupByKey/GBK/MergeBuckets
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.880Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate) into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.911Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor) into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:35.934Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:36.088Z: Running job using Streaming Engine
Apr 12, 2023 1:23:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:23:36.429Z: Starting 5 workers in us-central1-b...
Apr 12, 2023 1:24:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:24:02.367Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 12, 2023 1:24:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:24:18.659Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Apr 12, 2023 1:25:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:25:07.865Z: Workers have started successfully.
Apr 12, 2023 1:25:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-04-12T13:25:08.048Z: All workers have finished the startup processes and began to receive work requests.
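The Cloud Label warning earlier in the log means the job name derived from this workflow had to be rewritten (to load0tests0java170dataflow0v20streaming0cogbk02-jenkins-04-le69) so it could be applied as a label on GCE instances; the linked page describes the allowed label characters. One way to avoid the rewrite is to set an explicit, label-safe job name on the pipeline options. A minimal sketch, where the option value shown is an illustrative assumption rather than this job's configuration:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class JobNameExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        // A name built only from lowercase letters, digits and hyphens,
        // so Dataflow can apply it to Cloud resources without modification.
        options.setJobName("load-tests-java17-dataflow-v2-streaming-cogbk-2");
        // ... construct and run the pipeline with these options ...
      }
    }

The same value can equally be passed on the command line as --jobName when launching the load test.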
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-2' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy139.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: hudson.remoting.Channel$OrderlyShutdown: Command Close created at
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1313)
	at hudson.remoting.Channel$1.handle(Channel.java:606)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:81)
Caused by: Command Close created at
	at hudson.remoting.Command.<init>(Command.java:70)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1306)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1304)
	at hudson.remoting.Channel.close(Channel.java:1480)
	at hudson.remoting.Channel.close(Channel.java:1447)
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1312)
	... 2 more
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-2 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
