See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17/498/display/redirect>

Changes:


------------------------------------------
[...truncated 125.37 KB...]
#5 DONE 0.4s

#6 [ 2/16] ADD target/slf4j-api.jar /opt/apache/beam/jars/
#6 DONE 2.4s

#7 [ 3/16] ADD target/slf4j-jdk14.jar /opt/apache/beam/jars/
#7 DONE 0.1s

#8 [ 4/16] ADD target/jcl-over-slf4j.jar /opt/apache/beam/jars/
#8 DONE 0.1s

#9 [ 5/16] ADD target/log4j-over-slf4j.jar /opt/apache/beam/jars/
#9 DONE 0.1s

#10 [ 6/16] ADD target/log4j-to-slf4j.jar /opt/apache/beam/jars/
#10 DONE 0.1s

#11 [ 7/16] ADD target/beam-sdks-java-harness.jar /opt/apache/beam/jars/
#11 DONE 0.1s

#12 [ 8/16] COPY target/jamm.jar target/open-module-agent*.jar /opt/apache/beam/jars/
#12 DONE 0.1s

#13 [ 9/16] COPY target/linux_amd64/boot /opt/apache/beam/
#13 DONE 0.2s

#14 [10/16] COPY target/LICENSE /opt/apache/beam/
#14 DONE 0.1s

#15 [11/16] COPY target/NOTICE /opt/apache/beam/
#15 DONE 0.1s

#16 [12/16] ADD target/third_party_licenses /opt/apache/beam/third_party_licenses/
#16 DONE 0.2s

#17 [13/16] COPY target/LICENSE target/options/* /opt/apache/beam/options/
#17 DONE 0.1s

#18 [14/16] COPY target/go-licenses/* /opt/apache/beam/third_party_licenses/golang/
#18 DONE 0.1s

#19 [15/16] RUN if [ "true" = "false" ] ; then     rm -rf /opt/apache/beam/third_party_licenses ;    fi
#19 DONE 0.4s

#20 [16/16] COPY target/profiler/* /opt/google_cloud_profiler/
#20 DONE 0.1s

#21 exporting to image
#21 exporting layers
#21 exporting layers 0.3s done
#21 writing image sha256:4e859766407ef10a3cd454a0ae868727259ef596d9b1bd398f5df68c7d299c43 done
#21 naming to docker.io/apache/beam_java17_sdk:2.48.0.dev done
#21 DONE 0.3s

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 
18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
6b65e02e2538: Preparing
5f70bf18a086: Preparing
acecc1add054: Preparing
5868975115f4: Preparing
13e1121b84ae: Preparing
2c8710b6a099: Preparing
1f7856f63f51: Preparing
9acbd89aab41: Preparing
7772534cde79: Preparing
8650b156c5bf: Preparing
00b6d505b32f: Preparing
f1bea880c73d: Preparing
de6a6ba2410b: Preparing
ca32904d52f7: Preparing
d523c43fbe18: Preparing
57b27cfbd6e6: Preparing
b421f864ecf0: Preparing
b09de1b6a666: Preparing
b8a36d10656a: Preparing
f1bea880c73d: Waiting
1f7856f63f51: Waiting
de6a6ba2410b: Waiting
9acbd89aab41: Waiting
ca32904d52f7: Waiting
7772534cde79: Waiting
d523c43fbe18: Waiting
8650b156c5bf: Waiting
57b27cfbd6e6: Waiting
b421f864ecf0: Waiting
00b6d505b32f: Waiting
b8a36d10656a: Waiting
b09de1b6a666: Waiting
2c8710b6a099: Waiting
5f70bf18a086: Layer already exists
5868975115f4: Pushed
acecc1add054: Pushed
2c8710b6a099: Pushed
13e1121b84ae: Pushed
1f7856f63f51: Pushed
7772534cde79: Pushed
9acbd89aab41: Pushed
8650b156c5bf: Pushed
00b6d505b32f: Pushed
f1bea880c73d: Pushed
57b27cfbd6e6: Layer already exists
6b65e02e2538: Pushed
b421f864ecf0: Layer already exists
b09de1b6a666: Layer already exists
b8a36d10656a: Layer already exists
de6a6ba2410b: Pushed
ca32904d52f7: Pushed
d523c43fbe18: Pushed
20230504144835: digest: sha256:be1eaf3f8f2599ee6935326caf48b5569da81b640783a501405e1dbe3acd09da size: 4297

> Task :sdks:java:testing:load-tests:run
May 04, 2023 2:48:58 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 04, 2023 2:48:59 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 229 files. Enable logging at DEBUG level to see which 
files will be staged.
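[Editor's note] The two INFO lines above show the runner falling back to gcpTempLocation for staging and defaulting filesToStage to the classpath. A minimal Java sketch of how these options could be set explicitly on DataflowPipelineOptions follows; the bucket paths and jar name are hypothetical, and this is not the load test's actual configuration code.

    import java.util.Arrays;

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StagingOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        options.setRunner(DataflowRunner.class);
        // Explicit locations; when stagingLocation is omitted the runner falls back
        // to gcpTempLocation, as logged above. Bucket name is a placeholder.
        options.setGcpTempLocation("gs://my-temp-bucket/tmp");
        options.setStagingLocation("gs://my-temp-bucket/staging");

        // Optionally pin the files to stage instead of defaulting to the classpath.
        options.setFilesToStage(Arrays.asList("/path/to/my-shaded-pipeline.jar"));

        Pipeline p = Pipeline.create(options);
        // ... build and run the pipeline ...
      }
    }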
May 04, 2023 2:49:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
May 04, 2023 2:49:02 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 229 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
May 04, 2023 2:49:03 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 229 files cached, 0 files newly uploaded in 0 
seconds
May 04, 2023 2:49:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
May 04, 2023 2:49:03 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <100777 bytes, hash 
13223a589b60ab6c1ad7d309b6c1d2c4f5b1388d0edd7ca2039d6e473625d21e> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EyI6WJtgq2wa19MJtsHSxPWxOI0O3XyiA51uRzYl0h4.pb
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
May 04, 2023 2:49:05 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, 
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, 
endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, 
endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, 
endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, 
endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, 
endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, 
endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, 
endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, 
endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, 
endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, 
endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, 
endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, 
endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, 
endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, 
endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, 
endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, 
endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, 
endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, 
endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, 
endOffset=20000000}]
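[Editor's note] For reference, the split above is a plain even partition of the offset range [0, 20000000) into 20 bundles of 1,000,000 records each. A self-contained Java sketch of that arithmetic (hypothetical names, not Beam's SyntheticUnboundedSource API):

    public class OffsetSplitSketch {
      public static void main(String[] args) {
        long start = 0L;
        long end = 20_000_000L;   // total records configured for the load test
        int desiredBundles = 20;  // matches the 20 bundles reported above
        long step = (end - start) / desiredBundles;

        for (long lo = start; lo < end; lo += step) {
          long hi = Math.min(lo + step, end);
          System.out.printf("bundle [%d, %d)%n", lo, hi);
        }
      }
    }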
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
May 04, 2023 2:49:05 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
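[Editor's note] The steps s1-s8 above outline the shape of the streaming GBK load test: read synthetic input, collect metrics, window, GroupByKey, then ungroup and reiterate. The rough Beam sketch below approximates that shape under stated assumptions (GenerateSequence stands in for SyntheticUnboundedSource, the key scheme and window duration are guesses, and the metric-monitor steps are omitted); it is not the actual load-test code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.GroupByKey;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.TypeDescriptors;
    import org.joda.time.Duration;

    public class GbkLoadTestShape {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Stand-in for the synthetic unbounded read (s1/s2); the real test reads
        // SyntheticUnboundedSource bundles like the ones listed earlier.
        PCollection<KV<Long, Long>> input =
            p.apply("Read input", GenerateSequence.from(0))
             .apply("Key by modulo",
                 MapElements.into(
                         TypeDescriptors.kvs(TypeDescriptors.longs(), TypeDescriptors.longs()))
                     .via((Long n) -> KV.of(n % 1000, n)));

        input
            // s5: fixed windowing before the shuffle; the duration is an assumption.
            .apply("Window.Into()",
                Window.<KV<Long, Long>>into(FixedWindows.of(Duration.standardSeconds(10))))
            // s6: the GroupByKey under test.
            .apply("Group by key (0)", GroupByKey.<Long, Long>create())
            // s7: re-iterate the grouped values, roughly what "Ungroup and reiterate" does.
            .apply("Ungroup and reiterate (0)",
                ParDo.of(new DoFn<KV<Long, Iterable<Long>>, Long>() {
                  @ProcessElement
                  public void process(ProcessContext c) {
                    for (Long v : c.element().getValue()) {
                      c.output(v);
                    }
                  }
                }));

        p.run();
      }
    }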
May 04, 2023 2:49:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.48.0-SNAPSHOT
May 04, 2023 2:49:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-04_07_49_05-2111562271709631641?project=apache-beam-testing
May 04, 2023 2:49:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2023-05-04_07_49_05-2111562271709631641
May 04, 2023 2:49:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2023-05-04_07_49_05-2111562271709631641
May 04, 2023 2:49:13 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2023-05-04T14:49:11.820Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java170dataflow0v20streaming0gbk02-jenkins-0504-q07t. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 04, 2023 2:49:19 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:18.764Z: Worker configuration: e2-standard-2 in 
us-central1-b.
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:19.748Z: Expanding SplittableParDo operations into 
optimizable parts.
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:19.780Z: Expanding CollectionToSingleton operations into 
optimizable parts.
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:19.845Z: Expanding CoGroupByKey operations into 
optimizable parts.
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:19.909Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:19.935Z: Expanding GroupByKey operations into streaming 
Read/Write steps
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.004Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.100Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.125Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.156Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.190Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.220Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.254Z: Fusing consumer Collect start time 
metrics/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.287Z: Fusing consumer Total bytes 
monitor/ParMultiDo(ByteMonitor) into Collect start time 
metrics/ParMultiDo(TimeMonitor)
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.320Z: Fusing consumer Window.Into()/Window.Assign 
into Total bytes monitor/ParMultiDo(ByteMonitor)
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.351Z: Fusing consumer Group by key (0)/WriteStream 
into Window.Into()/Window.Assign
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.374Z: Fusing consumer Group by key (0)/MergeBuckets 
into Group by key (0)/ReadStream
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.406Z: Fusing consumer Ungroup and reiterate 
(0)/ParMultiDo(UngroupAndReiterate) into Group by key (0)/MergeBuckets
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.438Z: Fusing consumer Collect end time metrics 
(0)/ParMultiDo(TimeMonitor) into Ungroup and reiterate 
(0)/ParMultiDo(UngroupAndReiterate)
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.532Z: Running job using Streaming Engine
May 04, 2023 2:49:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:20.865Z: Starting 5 workers in us-central1-b...
May 04, 2023 2:49:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:49:40.294Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 04, 2023 2:50:00 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:50:00.175Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
May 04, 2023 2:51:00 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:50:58.183Z: All workers have finished the startup processes 
and began to receive work requests.
May 04, 2023 2:51:00 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-05-04T14:50:59.812Z: Workers have started successfully.
May 04, 2023 3:15:57 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2023-05-04T15:15:56.559Z: The workers of given job are going to be 
updated.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-6' is disconnected.
        at 
hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
        at 
hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
        at com.sun.proxy.$Proxy141.isAlive(Unknown Source)
        at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
        at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
        at hudson.Launcher$ProcStarter.join(Launcher.java:524)
        at hudson.plugins.gradle.Gradle.perform(Gradle.java:321)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
        at hudson.model.Build$BuildExecution.build(Build.java:199)
        at hudson.model.Build$BuildExecution.doRun(Build.java:164)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
        at hudson.model.Run.execute(Run.java:1896)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
        at hudson.model.ResourceController.execute(ResourceController.java:101)
        at hudson.model.Executor.run(Executor.java:442)
Caused by: hudson.remoting.Channel$OrderlyShutdown: Command Close created at
        at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1313)
        at hudson.remoting.Channel$1.handle(Channel.java:606)
        at 
hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:81)
Caused by: Command Close created at
        at hudson.remoting.Command.<init>(Command.java:70)
        at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1306)
        at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1304)
        at hudson.remoting.Channel.close(Channel.java:1480)
        at hudson.remoting.Channel.close(Channel.java:1447)
        at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1312)
        ... 2 more
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-6 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
