See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/180/display/redirect?page=changes>

Changes:

[Daniel Oliveira] [BEAM-13321] Pass TempLocation as pipeline option to Dataflow Go for

[Robert Bradshaw] Better type hints for Count combiners.

[Kyle Weaver] Include name of missing tag in error message.

[stranniknm] [BEAM-13423]: fix frontend failure if no examples

[daria.malkova] change return type of 2 methods

[mmack] [BEAM-13441] Use quiet delete for S3 batch deletes. In quiet mode only

[noreply] Updating Grafana from v8.1.2 to v8.1.6

[daria.malkova] Docs for validators tests

[daria.malkova] change context type

[noreply] Merge pull request #16140 from [BEAM-13377][Playground] Update CI/CD

[noreply] Merge pull request #16120 from [BEAM-13333][Playground] Save Python logs

[noreply] Merge pull request #16185 from [BEAM-13425][Playground][Bugfix] Support

[mmack] [BEAM-13445] Correctly set data limit when flushing S3 upload buffer and

[noreply] Merge pull request #16121 from [BEAM-13334][Playground] Save Go logs to

[noreply] Merge pull request #16179 from [BEAM-13344][Playground] support python

[noreply] Merge pull request #16208 from [BEAM-13442][Playground] Filepath to log

[noreply] [BEAM-13276] bump jackson-core to 2.13.0 for .test-infra (#16062)

[noreply] Change Pub/Sub Lite PollResult to set explicit watermark (#16216)

[noreply] [BEAM-13454] Fix and test dataframe read_fwf. (#16064)

[noreply] [BEAM-12976] Pipeline visitor to discover pushdown opportunities.

[noreply] [BEAM-13015] Allow decoding a set of elements until we hit the block


------------------------------------------
[...truncated 48.57 KB...]
e21982098f4f: Preparing
4f682cc50979: Preparing
5c81f9330d99: Preparing
927f9fcef4cf: Preparing
1d2dae130748: Waiting
a81f1846a0d2: Preparing
254ae6250ad1: Waiting
b75f0bcd0136: Waiting
4f682cc50979: Waiting
e21982098f4f: Waiting
3b441d7cb46b: Preparing
d3710de04cb3: Preparing
91f7336bbfff: Preparing
e2e8c39e0f77: Preparing
3b441d7cb46b: Waiting
e2e8c39e0f77: Waiting
5c81f9330d99: Waiting
d3710de04cb3: Waiting
a81f1846a0d2: Waiting
927f9fcef4cf: Waiting
4a637e29cb47: Waiting
91f7336bbfff: Waiting
2ea951ca7868: Pushed
1fd712c33a12: Pushed
8bc2012c9074: Pushed
254ae6250ad1: Pushed
c4c69ae44232: Pushed
45c694d280ab: Pushed
b75f0bcd0136: Pushed
4f682cc50979: Pushed
e21982098f4f: Pushed
53754d9b52b5: Pushed
5c81f9330d99: Layer already exists
1d2dae130748: Pushed
927f9fcef4cf: Layer already exists
a81f1846a0d2: Layer already exists
3b441d7cb46b: Layer already exists
d3710de04cb3: Layer already exists
91f7336bbfff: Layer already exists
e2e8c39e0f77: Layer already exists
4a637e29cb47: Pushed
20211214124135: digest: sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d size: 4311

> Task :sdks:java:testing:load-tests:run
Dec 14, 2021 12:44:29 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 14, 2021 12:44:29 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 203 files. Enable logging at DEBUG level to see which 
files will be staged.
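
For reference, both of these INFO messages go away when the options are set explicitly. A minimal Java sketch; the bucket paths and jar path below are hypothetical placeholders, not this job's configuration:

    import java.util.Arrays;
    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ExplicitStagingOptions {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        // Explicit staging location, so the runner does not fall back to
        // gcpTempLocation (the first INFO message above):
        options.setStagingLocation("gs://my-bucket/staging"); // hypothetical bucket
        options.setGcpTempLocation("gs://my-bucket/temp");    // hypothetical bucket
        // Explicit file list, instead of defaulting to the 203 classpath
        // entries (the second INFO message above):
        options.setFilesToStage(Arrays.asList("/path/to/load-tests-all.jar")); // hypothetical jar
      }
    }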
Dec 14, 2021 12:44:30 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: 
ParDo(TimeMonitor)
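
This warning comes from Pipeline.validate when two transform applications share the same default name. The standard fix is to pass an explicit, unique name to apply(). A sketch; TimeMonitorFn and the input PCollection are hypothetical stand-ins for the load test's internals:

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    // Hypothetical stand-in for the load test's TimeMonitor DoFn.
    class TimeMonitorFn extends DoFn<byte[], byte[]> {
      @ProcessElement
      public void process(ProcessContext c) {
        c.output(c.element());
      }
    }

    // Given a PCollection<byte[]> named input (hypothetical), naming each
    // application explicitly keeps the names stable and unique:
    PCollection<byte[]> timed =
        input.apply("TimeMonitorStart", ParDo.of(new TimeMonitorFn()));
    PCollection<byte[]> timedAgain =
        timed.apply("TimeMonitorEnd", ParDo.of(new TimeMonitorFn()));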
Dec 14, 2021 12:44:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Dec 14, 2021 12:44:32 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 203 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Dec 14, 2021 12:44:33 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 203 files cached, 0 files newly uploaded in 0 
seconds
Dec 14, 2021 12:44:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 14, 2021 12:44:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <105887 bytes, hash 63edd0fc75d84d3cb9fe41391d97e768a3f536e38ab44406d9bdbb9511e61847> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Y-3Q_HXYTTy5_kE5HZfnaKP1NuOKtEQG2b27lRHmGEc.pb
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input as step s1
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s2
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s3
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s4
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s5
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s6
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s7
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s8
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s9
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s10
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s11
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s12
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s13
Dec 14, 2021 12:44:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s14
Dec 14, 2021 12:44:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
Dec 14, 2021 12:44:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-14_04_44_35-7789330451435596451?project=apache-beam-testing
Dec 14, 2021 12:44:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-12-14_04_44_35-7789330451435596451
Dec 14, 2021 12:44:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-14_04_44_35-7789330451435596451
Dec 14, 2021 12:44:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-12-14T12:44:46.837Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20batch0pardo01-jenkins-121412-v242. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
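
The rewritten name can be avoided by choosing a job name that is already a valid Cloud Label (lowercase letters, digits, and hyphens). A one-line sketch; the name below is a hypothetical example, not what this Jenkins job sets:

    // jobName is a standard Beam PipelineOptions property; this value already
    // satisfies the Cloud Label restrictions, so Dataflow keeps it unchanged.
    options.setJobName("load-tests-java11-dataflow-v2-batch-pardo-1"); // hypothetical name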
Dec 14, 2021 12:45:00 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:44:55.243Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:00.857Z: Expanding SplittableParDo operations into 
optimizable parts.
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.246Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.350Z: Expanding CoGroupByKey operations into 
optimizable parts.
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.442Z: Expanding GroupByKey operations into 
optimizable parts.
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.527Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.562Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.588Z: Fusing consumer 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.624Z: Fusing consumer 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.651Z: Fusing consumer 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.682Z: Fusing consumer 
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.713Z: Fusing consumer Step: 
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.758Z: Fusing consumer Step: 
1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.805Z: Fusing consumer Step: 
2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.844Z: Fusing consumer Step: 
3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.935Z: Fusing consumer Step: 
4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:01.974Z: Fusing consumer Step: 
5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.006Z: Fusing consumer Step: 
6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.042Z: Fusing consumer Step: 
7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.087Z: Fusing consumer Step: 
8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.112Z: Fusing consumer Step: 
9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.153Z: Fusing consumer 
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 
9/ParMultiDo(CounterOperation)
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.581Z: Executing operation Read input/Impulse+Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Dec 14, 2021 12:45:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:02.662Z: Starting 5 workers in us-central1-a...
Dec 14, 2021 12:45:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:45:28.949Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
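
If the custom metrics matter, stale descriptors can be deleted through the Cloud Monitoring API. A sketch using the google-cloud-monitoring Java client; the client library and the descriptor name below are assumptions for illustration, not part of this build:

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.MetricDescriptorName;

    public class DeleteUnusedMetricDescriptor {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Deletes one unused custom metric descriptor, freeing a slot under
          // the limit mentioned above. The metric name is a hypothetical
          // placeholder; the project id is the one used by this job.
          client.deleteMetricDescriptor(
              MetricDescriptorName.of(
                  "apache-beam-testing",
                  "custom.googleapis.com/old_unused_metric"));
        }
      }
    }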
Dec 14, 2021 12:46:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:03.135Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
Dec 14, 2021 12:46:59 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:58.754Z: Workers have started successfully.
Dec 14, 2021 12:46:59 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:46:58.805Z: Workers have started successfully.
Dec 14, 2021 12:47:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:21.175Z: Finished operation Read input/Impulse+Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Dec 14, 2021 12:47:21 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:21.349Z: Executing operation 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)+ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)+Step:
 0/ParMultiDo(CounterOperation)+Step: 1/ParMultiDo(CounterOperation)+Step: 
2/ParMultiDo(CounterOperation)+Step: 3/ParMultiDo(CounterOperation)+Step: 
4/ParMultiDo(CounterOperation)+Step: 5/ParMultiDo(CounterOperation)+Step: 
6/ParMultiDo(CounterOperation)+Step: 7/ParMultiDo(CounterOperation)+Step: 
8/ParMultiDo(CounterOperation)+Step: 
9/ParMultiDo(CounterOperation)+ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:47:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:40.387Z: Finished operation 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)+ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)+Step:
 0/ParMultiDo(CounterOperation)+Step: 1/ParMultiDo(CounterOperation)+Step: 
2/ParMultiDo(CounterOperation)+Step: 3/ParMultiDo(CounterOperation)+Step: 
4/ParMultiDo(CounterOperation)+Step: 5/ParMultiDo(CounterOperation)+Step: 
6/ParMultiDo(CounterOperation)+Step: 7/ParMultiDo(CounterOperation)+Step: 
8/ParMultiDo(CounterOperation)+Step: 
9/ParMultiDo(CounterOperation)+ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor)
Dec 14, 2021 12:47:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:40.586Z: Cleaning up.
Dec 14, 2021 12:47:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:47:40.683Z: Stopping worker pool...
Dec 14, 2021 12:50:06 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:50:04.419Z: Autoscaling: Resized worker pool from 5 to 0.
Dec 14, 2021 12:50:06 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-12-14T12:50:04.469Z: Worker pool stopped.
Dec 14, 2021 12:50:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-12-14_04_44_35-7789330451435596451 finished with status DONE.
Load test results for test (ID): 5778336a-4a26-49a4-974e-b5616d316c3f and 
timestamp: 2021-12-14T12:44:30.284000000Z:
Metric:                                Value:
dataflow_v2_java11_runtime_sec         14.161
dataflow_v2_java11_total_bytes_count   2.0E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124135
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d
Deleted: sha256:fdd36df72bb63fa481b9d291df28b495a8efb48e448d7d23137348e72e021470
Deleted: sha256:9fb7c02f65222047436804dbca71f19fc0f226582b891fa3881570d5aaad5ccb
Deleted: sha256:6fd1a5a5050088c2a0f24145a388af162f99de6b12233d35ed9656226244d6d5
Deleted: sha256:17a973a484e5466ef8e75f7ce2240d01e50fb930e05473517aa16a379d287c86
Deleted: sha256:0af0987974d1709797a0968d38e05b6db698d51408dc0e1a7c99b817dbf49eff
Deleted: sha256:9f27274d02703c1d98222b89bb1cc64c59325d6596c24db5607439eda5e028f3
Deleted: sha256:1a95b1ef389ccf4e6e9b7c28d3137307f8b0494b3dcbe5ef4dec1bb0a078c3d2
Deleted: sha256:056d38decb80c0330e200b2d09ab233bea89bbd35a634d5c0893221b93cbd622
Deleted: sha256:d7d5b2f353a3bc46dd06a122dbd75dbd43760d5a566dfabfd0facbcecabf9175
Deleted: sha256:db2c30d7ef89d72585339a4696a287aca2f8a4c0fbd68d1e4b8831e41961f833
Deleted: sha256:f730de0488f96acd44de772c8252e071a8cb37a29011965b7b5f8c9a67c9a08f
Deleted: sha256:91b567a9726418857fd4101cdfa98f38dada2e542d2c505e2c65805be04038a8
Deleted: sha256:5777173f409593e9f8267b44024a04fb75df5b813494419da598dee9f520e0dd
Deleted: sha256:f4555ae895af9b4008bf6bfa57613f2af70350c5b07d63ea360c3d5288e0c66d
Deleted: sha256:94c6dfe623c806da4712718535f30ccda430f6570a2037f1544d212a537e0677
Deleted: sha256:f818cfec6951a7f286e6bd6b74230c4be42adbef3cfea0ef4ab5b37d4422b0f3
Deleted: sha256:f09e21bd428ca8a9bf8a0e5043cb88ec085b8de971e03490d61c8af90a4395c5
Deleted: sha256:b842a970bef2bc5afaa72e711793e915751be8ee3127f78323a85f10dbf30235
Deleted: sha256:bc2a4b1da58eddc38f718cde7bdda2eb406d6e5f6375d66c103b757c18f5a813
Deleted: sha256:df7d70acac90ac61a9c4bf73166642dd9d5f64584626c17996e1dc18c1fea561
Deleted: sha256:db60bc58e54e8b66ded4ee051fa261bc03bcb1abbd4d7cda5a3f48d265d285ef
Deleted: sha256:6719a6e137601836827a4c8baa3464f452ec9b47a3fa4a2915fc4c9514794993
Deleted: sha256:53e364301adbf7ca68ccf4cdcf95400d9a3a455f2cfa3129725d1b88f32fc985
Deleted: sha256:eddb2b5031bbd7f239eb47838b493f6ead274933a3f74ba2d38434115b886d05
Deleted: sha256:a346be456a4909b5ce813ac4825de18346da513d0cb776b2a02242a23072c63b
Deleted: sha256:6c362a2445c754444e27314ee223aa87339541b9289e5401cf2d564f117869e1
Deleted: sha256:a43ad3f98991fb66820499980140408bfc79d38014583717bea5a3ceee025e9c
Deleted: sha256:826e6a8efe0ae1aada4bcff306711c1901696651ff8d0db0e5204d1e3287f61a
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124135]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211214124135] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0935b191f60f390e8a4d74362d15d92ed938f15aaaf4fb332995d20a5aa7f84d].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:514a58b0a365bc8ae9553b00e7501f9dfda8a741471c089c73c856a4a9358915
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:514a58b0a365bc8ae9553b00e7501f9dfda8a741471c089c73c856a4a9358915
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 
'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 
'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 
'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 
'date': 'Tue, 14 Dec 2021 12:50:22 GMT', 'x-frame-options': 'SAMEORIGIN', 
'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 
'sha256:514a58b0a365bc8ae9553b00e7501f9dfda8a741471c089c73c856a4a9358915': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 281

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 0s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zad5zvt7yifja

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
