See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/336/display/redirect?page=changes>
Changes:
[yathu] Add labels for typescript PRs
[noreply] Bump google.golang.org/grpc from 1.45.0 to 1.46.2 in /sdks (#17677)
[noreply] [BEAM-13015] Only create a TimerBundleTracker if there are timers.
------------------------------------------
[...truncated 114.23 KB...]
07df3576ab77: Preparing
3b2356b88239: Preparing
a7934564e6b9: Preparing
1b7cceb6a07c: Preparing
b274e8788e0c: Preparing
78658088978a: Preparing
cbd16b72e80e: Waiting
ddc3d733c942: Waiting
b274e8788e0c: Waiting
0b3a543b5350: Waiting
1b7cceb6a07c: Waiting
78658088978a: Waiting
bab690d258dd: Waiting
a7934564e6b9: Waiting
07df3576ab77: Waiting
3b2356b88239: Waiting
478dd83ea650: Waiting
8860a34f73a4: Waiting
5b8d25e9ac10: Waiting
23137aa576df: Waiting
9c8ed07b48ce: Waiting
74098439c9eb: Waiting
83b6b55cace3: Pushed
0d76dbe477dc: Pushed
b482df1b7678: Pushed
316b72dfc7d3: Pushed
dfd8c0a6050e: Pushed
74098439c9eb: Pushed
478dd83ea650: Pushed
ddc3d733c942: Pushed
5b8d25e9ac10: Pushed
9c8ed07b48ce: Pushed
23137aa576df: Pushed
cbd16b72e80e: Layer already exists
8860a34f73a4: Pushed
3b2356b88239: Layer already exists
07df3576ab77: Layer already exists
a7934564e6b9: Layer already exists
1b7cceb6a07c: Layer already exists
b274e8788e0c: Layer already exists
78658088978a: Layer already exists
cd92d00b3703: Pushed
bab690d258dd: Pushed
0b3a543b5350: Pushed
20220521133056: digest: sha256:e634939857906a1da240c5d092ef397a5132b7687c789c162e687dcb73af7555 size: 4935
> Task :sdks:java:testing:load-tests:run
May 21, 2022 1:31:35 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 21, 2022 1:31:35 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 222 files. Enable logging at DEBUG level to see which
files will be staged.
May 21, 2022 1:31:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
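This warning is emitted because the same transform name (Window.Into()) is applied more than once without an explicit name. A minimal sketch of the usual fix in the Beam Java SDK, applying each windowing step under its own unique name (the collection names "input" and "coInput" and the 60-second window below are hypothetical, for illustration only):

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    // Naming each application explicitly gives every transform a stable, unique name.
    PCollection<KV<byte[], byte[]>> windowedInput =
        input.apply("Window input",
            Window.<KV<byte[], byte[]>>into(FixedWindows.of(Duration.standardSeconds(60))));
    PCollection<KV<byte[], byte[]>> windowedCoInput =
        coInput.apply("Window co-input",
            Window.<KV<byte[], byte[]>>into(FixedWindows.of(Duration.standardSeconds(60))));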
May 21, 2022 1:31:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
May 21, 2022 1:31:38 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
May 21, 2022 1:31:39 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0
seconds
May 21, 2022 1:31:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
May 21, 2022 1:31:39 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <120280 bytes, hash 87316dc780fced381c74ac8885782f3d0cb9751acf411d08d5856fd1ccef4b92> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hzFtx4D87TgcdKyIhXgvPQy5dRrPQR0I1YVv0czvS5I.pb
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as
step s1
May 21, 2022 1:31:41 PM
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0,
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000,
endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000,
endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000,
endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000,
endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000,
endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000,
endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000,
endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000,
endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000,
endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000,
endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000,
endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000,
endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000,
endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000,
endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000,
endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000,
endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000,
endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000,
endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000,
endOffset=20000000}]
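The split above is an even partition of the 20,000,000-record offset space into 20 contiguous ranges of 1,000,000 records each; a minimal illustration of that arithmetic (not the actual SyntheticUnboundedSource implementation):

    // Illustrative only: evenly partition [0, totalRecords) into `bundles` ranges,
    // mirroring the startOffset/endOffset pairs printed in the log.
    static long[][] evenSplit(long totalRecords, int bundles) {
      long[][] ranges = new long[bundles][2];
      long step = totalRecords / bundles;              // 20_000_000 / 20 = 1_000_000
      for (int i = 0; i < bundles; i++) {
        ranges[i][0] = i * step;                                           // startOffset
        ranges[i][1] = (i == bundles - 1) ? totalRecords : (i + 1) * step; // endOffset
      }
      return ranges;
    }

The co-input source later in the log is split the same way, just over a 2,000,000-record space (20 ranges of 100,000 records each).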
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as
step s5
May 21, 2022 1:31:41 PM
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0,
endOffset=100000}, SyntheticUnboundedSource{startOffset=100000,
endOffset=200000}, SyntheticUnboundedSource{startOffset=200000,
endOffset=300000}, SyntheticUnboundedSource{startOffset=300000,
endOffset=400000}, SyntheticUnboundedSource{startOffset=400000,
endOffset=500000}, SyntheticUnboundedSource{startOffset=500000,
endOffset=600000}, SyntheticUnboundedSource{startOffset=600000,
endOffset=700000}, SyntheticUnboundedSource{startOffset=700000,
endOffset=800000}, SyntheticUnboundedSource{startOffset=800000,
endOffset=900000}, SyntheticUnboundedSource{startOffset=900000,
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000,
endOffset=1100000}, SyntheticUnboundedSource{startOffset=1100000,
endOffset=1200000}, SyntheticUnboundedSource{startOffset=1200000,
endOffset=1300000}, SyntheticUnboundedSource{startOffset=1300000,
endOffset=1400000}, SyntheticUnboundedSource{startOffset=1400000,
endOffset=1500000}, SyntheticUnboundedSource{startOffset=1500000,
endOffset=1600000}, SyntheticUnboundedSource{startOffset=1600000,
endOffset=1700000}, SyntheticUnboundedSource{startOffset=1700000,
endOffset=1800000}, SyntheticUnboundedSource{startOffset=1800000,
endOffset=1900000}, SyntheticUnboundedSource{startOffset=1900000,
endOffset=2000000}]
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
May 21, 2022 1:31:41 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
May 21, 2022 1:31:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
May 21, 2022 1:31:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-05-21_06_31_41-13446834764166980564?project=apache-beam-testing
May 21, 2022 1:31:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-05-21_06_31_41-13446834764166980564
May 21, 2022 1:31:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-05-21_06_31_41-13446834764166980564
May 21, 2022 1:31:46 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-05-21T13:31:45.197Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java110dataflow0v20streaming0cogbk02-jenkins-05-1t10. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
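One way to avoid this renaming is to give the pipeline an explicitly label-compatible job name up front; a small sketch using the standard Beam PipelineOptions API (the job name shown is hypothetical):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    // Lowercase letters, digits and hyphens only, so the name is also a valid Cloud Label.
    options.setJobName("load-tests-java11-dataflow-v2-streaming-cogbk-2");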
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:51.037Z: Worker configuration: e2-standard-2 in
us-central1-b.
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:51.786Z: Expanding SplittableParDo operations into
optimizable parts.
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:51.874Z: Expanding CollectionToSingleton operations into
optimizable parts.
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:51.956Z: Expanding CoGroupByKey operations into
optimizable parts.
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.023Z: Expanding SplittableProcessKeyed operations
into optimizable parts.
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.052Z: Expanding GroupByKey operations into streaming
Read/Write steps
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.123Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.245Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.273Z: Unzipping flatten CoGroupByKey-Flatten for
input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.300Z: Fusing unzipped copy of
CoGroupByKey/GBK/WriteStream, through flatten CoGroupByKey/Flatten, into
producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.326Z: Fusing consumer CoGroupByKey/GBK/WriteStream
into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.358Z: Fusing consumer Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
input/Impulse
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.391Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.425Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.460Z: Fusing consumer Read
input/ParDo(StripIds)/ParMultiDo(StripIds) into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.493Z: Fusing consumer Collect start time metrics
(input)/ParMultiDo(TimeMonitor) into Read
input/ParDo(StripIds)/ParMultiDo(StripIds)
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.515Z: Fusing consumer Window.Into()/Window.Assign
into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.559Z: Fusing consumer
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into
Window.Into()/Window.Assign
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.634Z: Fusing consumer Read
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
co-input/Impulse
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.685Z: Fusing consumer
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.757Z: Fusing consumer
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.807Z: Fusing consumer Read
co-input/ParDo(StripIds)/ParMultiDo(StripIds) into
Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
May 21, 2022 1:31:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.849Z: Fusing consumer Collect start time metrics
(co-input)/ParMultiDo(TimeMonitor) into Read
co-input/ParDo(StripIds)/ParMultiDo(StripIds)
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.882Z: Fusing consumer Window.Into()2/Window.Assign
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.904Z: Fusing consumer
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into
Window.Into()2/Window.Assign
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.929Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets
into CoGroupByKey/GBK/ReadStream
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.950Z: Fusing consumer
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into
CoGroupByKey/GBK/MergeBuckets
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:52.985Z: Fusing consumer Ungroup and
reiterate/ParMultiDo(UngroupAndReiterate) into
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:53.018Z: Fusing consumer Collect total
bytes/ParMultiDo(ByteMonitor) into Ungroup and
reiterate/ParMultiDo(UngroupAndReiterate)
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:53.041Z: Fusing consumer Collect end time
metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:53.204Z: Running job using Streaming Engine
May 21, 2022 1:31:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:53.429Z: Starting 5 workers in us-central1-b...
May 21, 2022 1:32:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:31:57.325Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
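If the old custom metric descriptors are no longer needed, they can be listed and deleted programmatically; a hedged sketch using the google-cloud-monitoring Java client (method names should be verified against the client version in use, and any deletions reviewed first, since removing a descriptor also removes its historical data):

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    try (MetricServiceClient client = MetricServiceClient.create()) {
      ListMetricDescriptorsRequest request =
          ListMetricDescriptorsRequest.newBuilder()
              .setName(ProjectName.of("apache-beam-testing").toString())
              .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
              .build();
      for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
        // Review before deleting: this permanently removes the descriptor.
        client.deleteMetricDescriptor(descriptor.getName());
      }
    }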
May 21, 2022 1:32:16 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:32:16.326Z: Autoscaling: Raised the number of workers to 4 so
that the pipeline can catch up with its backlog and keep up with its input rate.
May 21, 2022 1:32:16 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:32:16.352Z: Resized worker pool to 4, though goal was 5.
This could be a quota issue.
May 21, 2022 1:32:28 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:32:26.542Z: Autoscaling: Raised the number of workers to 5 so
that the pipeline can catch up with its backlog and keep up with its input rate.
May 21, 2022 1:33:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T13:33:30.225Z: Workers have started successfully.
May 21, 2022 2:09:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T14:09:11.599Z: Cleaning up.
May 21, 2022 2:09:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T14:09:11.722Z: Stopping worker pool...
May 21, 2022 2:09:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T14:09:11.770Z: Stopping worker pool...
May 21, 2022 2:09:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T14:09:46.340Z: Autoscaling: Reduced the number of workers to 0
based on low average worker CPU utilization, and the pipeline having sufficiently
low backlog and keeping up with input rate.
May 21, 2022 2:09:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-05-21T14:09:46.382Z: Worker pool stopped.
May 21, 2022 2:09:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
logTerminalState
INFO: Job 2022-05-21_06_31_41-13446834764166980564 finished with status DONE.
Load test results for test (ID): 1f0a647d-85a0-4e6c-bf46-76ad842005fd and timestamp: 2022-05-21T13:31:36.082000000Z:

Metric:                                  Value:
dataflow_v2_java11_runtime_sec           2050.255
dataflow_v2_java11_total_bytes_count     2.199996E9
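For a rough sense of scale, the two metrics above correspond to about 2.2 GB processed in roughly 34 minutes, i.e. on the order of 1.07 MB/s end to end; a trivial check of that arithmetic:

    // Back-of-the-envelope check using the values reported above.
    public class ThroughputCheck {
      public static void main(String[] args) {
        double runtimeSec = 2050.255;    // dataflow_v2_java11_runtime_sec
        double totalBytes = 2.199996e9;  // dataflow_v2_java11_total_bytes_count
        System.out.printf("throughput = %.2f MB/s%n", totalBytes / runtimeSec / 1e6); // ~1.07 MB/s
      }
    }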
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220521133056
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e634939857906a1da240c5d092ef397a5132b7687c789c162e687dcb73af7555
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220521133056]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e634939857906a1da240c5d092ef397a5132b7687c789c162e687dcb73af7555]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220521133056] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e634939857906a1da240c5d092ef397a5132b7687c789c162e687dcb73af7555])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f7bf0b015708f4e4504eb86a8465d5cb9d750bde0e44a97823632a1362040b6e
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f7bf0b015708f4e4504eb86a8465d5cb9d750bde0e44a97823632a1362040b6e
ERROR: (gcloud.container.images.delete) Not found: response:
{'docker-distribution-api-version': 'registry/2.0', 'content-type':
'application/json', 'date': 'Sat, 21 May 2022 14:10:00 GMT', 'server': 'Docker
Registry', 'cache-control': 'private', 'x-xss-protection': '0',
'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status':
'404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:f7bf0b015708f4e4504eb86a8465d5cb9d750bde0e44a97823632a1362040b6e': None
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 39m 5s
105 actionable tasks: 8 executed, 97 up-to-date
Publishing build scan...
https://gradle.com/s/334u4cnwhrtlq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]