See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/395/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Require unique names for stages.

[noreply] Add links to the new RunInference content to Learning Resources 
(#22325)

[noreply] Unskip RunInference IT tests (#22324)

[noreply] cleaned up types in standard_coders.ts (#22316)

[noreply] JMH module for sdks:java:core with benchmarks for

[noreply] Bump cloud.google.com/go/pubsub from 1.23.1 to 1.24.0 in /sdks 
(#22332)

[Luke Cwik] [#22181] Fix java package for SDK java core benchmark

[Luke Cwik] Allow jmhTest to run concurrently with other jmhTest instances

[noreply] [BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)


------------------------------------------
[...truncated 51.49 KB...]
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
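
For reference, the credential-helper setup that the deprecation notice
describes boils down to roughly the following (a sketch, assuming a recent
gcloud SDK; the image path is a placeholder):

    # register gcloud as a Docker credential helper for the gcr.io registries
    gcloud auth configure-docker

    # plain docker commands then authenticate through gcloud
    docker pull gcr.io/project-id/my-image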

The push refers to repository 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
fe1ef372bea5: Preparing
b7ab5ebea819: Preparing
0beca1724c6e: Preparing
974cc0b47567: Preparing
072b798337d8: Preparing
3ee3af1c54ec: Preparing
952ae5a48065: Preparing
8f22b3e0481e: Preparing
9ab20763b387: Preparing
7adb3bf205f3: Preparing
8ebc48b73389: Preparing
08f39b61d7b0: Preparing
b4cc62cf51fb: Preparing
6972abca469a: Preparing
fa542bb184ce: Preparing
9791b94d7b98: Preparing
2f1e2f8ca577: Preparing
4dc3dda529a0: Preparing
7372faf8e603: Preparing
952ae5a48065: Waiting
9be7f4e74e71: Preparing
36cd374265f4: Preparing
5bdeef4a08f3: Preparing
8f22b3e0481e: Waiting
9ab20763b387: Waiting
9791b94d7b98: Waiting
b4cc62cf51fb: Waiting
7adb3bf205f3: Waiting
6972abca469a: Waiting
2f1e2f8ca577: Waiting
fa542bb184ce: Waiting
8ebc48b73389: Waiting
4dc3dda529a0: Waiting
08f39b61d7b0: Waiting
3ee3af1c54ec: Waiting
7372faf8e603: Waiting
5bdeef4a08f3: Waiting
9be7f4e74e71: Waiting
072b798337d8: Pushed
b7ab5ebea819: Pushed
0beca1724c6e: Pushed
974cc0b47567: Pushed
fe1ef372bea5: Pushed
8f22b3e0481e: Pushed
952ae5a48065: Pushed
9ab20763b387: Pushed
7adb3bf205f3: Pushed
3ee3af1c54ec: Pushed
08f39b61d7b0: Pushed
9791b94d7b98: Layer already exists
8ebc48b73389: Pushed
6972abca469a: Pushed
fa542bb184ce: Pushed
2f1e2f8ca577: Layer already exists
4dc3dda529a0: Layer already exists
7372faf8e603: Layer already exists
5bdeef4a08f3: Layer already exists
9be7f4e74e71: Layer already exists
36cd374265f4: Layer already exists
b4cc62cf51fb: Pushed
20220720124203: digest: 
sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de size: 
4935

> Task :sdks:java:testing:load-tests:run
Jul 20, 2022 12:43:47 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 20, 2022 12:43:48 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 222 files. Enable logging at DEBUG level to see which 
files will be staged.
Jul 20, 2022 12:43:49 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: 
ParDo(TimeMonitor)
Jul 20, 2022 12:43:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Jul 20, 2022 12:43:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Jul 20, 2022 12:43:55 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 2 
seconds
Jul 20, 2022 12:43:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Jul 20, 2022 12:43:55 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <112523 bytes, hash 
41f4ed071afdf8e7091218880564be25138fc9112f210b1ce5f6b6a818d809ef> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QfTtBxr9-OcJEhiIBWS-JROPyREvIQsc5fa2qBjYCe8.pb
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input as step s1
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s2
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s3
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s4
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s5
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s6
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s7
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s8
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s9
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s10
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s11
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s12
Jul 20, 2022 12:43:59 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s13
Jul 20, 2022 12:44:00 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s14
Jul 20, 2022 12:44:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 20, 2022 12:44:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-20_05_44_00-15238868920995479263?project=apache-beam-testing
Jul 20, 2022 12:44:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-20_05_44_00-15238868920995479263
Jul 20, 2022 12:44:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-07-20_05_44_00-15238868920995479263
Jul 20, 2022 12:44:11 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-07-20T12:44:08.860Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20batch0pardo01-jenkins-072012-oqc5. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jul 20, 2022 12:44:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:14.220Z: Worker configuration: e2-standard-2 in 
us-central1-b.
Jul 20, 2022 12:44:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.098Z: Expanding SplittableParDo operations into 
optimizable parts.
Jul 20, 2022 12:44:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.119Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Jul 20, 2022 12:44:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.177Z: Expanding CoGroupByKey operations into 
optimizable parts.
Jul 20, 2022 12:44:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.212Z: Expanding GroupByKey operations into 
optimizable parts.
Jul 20, 2022 12:44:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.285Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.319Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.355Z: Fusing consumer 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.389Z: Fusing consumer 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.416Z: Fusing consumer 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.447Z: Fusing consumer 
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.490Z: Fusing consumer Step: 
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.525Z: Fusing consumer Step: 
1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.609Z: Fusing consumer Step: 
2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.638Z: Fusing consumer Step: 
3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.682Z: Fusing consumer Step: 
4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.714Z: Fusing consumer Step: 
5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.752Z: Fusing consumer Step: 
6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.778Z: Fusing consumer Step: 
7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.809Z: Fusing consumer Step: 
8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.841Z: Fusing consumer Step: 
9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:15.875Z: Fusing consumer 
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 
9/ParMultiDo(CounterOperation)
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:16.230Z: Executing operation Read input/Impulse+Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Jul 20, 2022 12:44:17 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:16.303Z: Starting 5 workers in us-central1-b...
Jul 20, 2022 12:44:28 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:28.112Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
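
As a rough illustration of the cleanup this message suggests (PROJECT_ID and
the metric type below are placeholders, not values taken from this job), old
custom metric descriptors can be listed and deleted through the Cloud
Monitoring v3 REST API:

    # list the project's custom metric descriptors
    curl -G -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      --data-urlencode 'filter=metric.type = starts_with("custom.googleapis.com/")' \
      "https://monitoring.googleapis.com/v3/projects/PROJECT_ID/metricDescriptors"

    # delete one unused descriptor (the metric type is URL-encoded into the name)
    curl -X DELETE -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      "https://monitoring.googleapis.com/v3/projects/PROJECT_ID/metricDescriptors/custom.googleapis.com%2Fold_unused_metric"
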
Jul 20, 2022 12:44:54 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:44:53.123Z: Autoscaling: Raised the number of workers to 5 
based on the rate of progress in the currently running stage(s).
Jul 20, 2022 12:45:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:45:47.180Z: Workers have started successfully.
Jul 20, 2022 12:46:03 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:03.512Z: Finished operation Read input/Impulse+Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Jul 20, 2022 12:46:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:03.695Z: Executing operation 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)+ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)+Step:
 0/ParMultiDo(CounterOperation)+Step: 1/ParMultiDo(CounterOperation)+Step: 
2/ParMultiDo(CounterOperation)+Step: 3/ParMultiDo(CounterOperation)+Step: 
4/ParMultiDo(CounterOperation)+Step: 5/ParMultiDo(CounterOperation)+Step: 
6/ParMultiDo(CounterOperation)+Step: 7/ParMultiDo(CounterOperation)+Step: 
8/ParMultiDo(CounterOperation)+Step: 
9/ParMultiDo(CounterOperation)+ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor)
Jul 20, 2022 12:46:23 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:21.411Z: Finished operation 
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)+ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)+Step:
 0/ParMultiDo(CounterOperation)+Step: 1/ParMultiDo(CounterOperation)+Step: 
2/ParMultiDo(CounterOperation)+Step: 3/ParMultiDo(CounterOperation)+Step: 
4/ParMultiDo(CounterOperation)+Step: 5/ParMultiDo(CounterOperation)+Step: 
6/ParMultiDo(CounterOperation)+Step: 7/ParMultiDo(CounterOperation)+Step: 
8/ParMultiDo(CounterOperation)+Step: 
9/ParMultiDo(CounterOperation)+ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor)
Jul 20, 2022 12:46:23 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:21.546Z: Cleaning up.
Jul 20, 2022 12:46:23 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:21.671Z: Stopping worker pool...
Jul 20, 2022 12:46:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:57.830Z: Autoscaling: Resized worker pool from 5 to 0.
Jul 20, 2022 12:46:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-20T12:46:57.874Z: Worker pool stopped.
Jul 20, 2022 12:47:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2022-07-20_05_44_00-15238868920995479263 finished with status DONE.
Load test results for test (ID): 0bba3918-6cda-4aa5-b24a-6e44ff166c0b and 
timestamp: 2022-07-20T12:43:49.097000000Z:
Metric:                                    Value:
dataflow_v2_java11_runtime_sec             12.667
dataflow_v2_java11_total_bytes_count       2.0E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220720124203
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220720124203]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220720124203] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de
ERROR: (gcloud.container.images.delete) Not found: response: 
{'docker-distribution-api-version': 'registry/2.0', 'content-type': 
'application/json', 'date': 'Wed, 20 Jul 2022 12:47:16 GMT', 'server': 'Docker 
Registry', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': 
'404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 
'sha256:0bb29f800cfb584d48433832333b8acdf837994aa836719341adea5449ba89de': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle>'
 line: 298

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1
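
For context, this kind of untagged-image cleanup typically amounts to
something like the sketch below (a hypothetical reconstruction, not the
actual contents of ./scripts/cleanup_untagged_gcr_images.sh); the 404
response above is what such a delete returns when the manifest has already
been removed:

    REPO=us.gcr.io/apache-beam-testing/java-postcommit-it/java
    # list digests with no remaining tags, then delete each one
    gcloud container images list-tags "${REPO}" --filter='-tags:*' --format='get(digest)' |
      while read -r digest; do
        gcloud container images delete "${REPO}@${digest}" --force-delete-tags --quiet
      done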

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.
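
For example (a hypothetical re-run of the task that failed here; adjust the
task path as needed):

    ./gradlew :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages --warning-mode all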

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 5m 38s
110 actionable tasks: 73 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/qywecjey3i5co

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
