See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17/174/display/redirect>
Changes:
------------------------------------------
[...truncated 424.30 KB...]
8d9a2be7195e: Preparing
99a0b61c52ab: Preparing
d286824e12ea: Preparing
cb58087a8c26: Preparing
b7c223c52431: Preparing
2483ffb6197e: Preparing
147c3587f130: Preparing
0f7ebbbcd024: Preparing
3476f89ccbcc: Preparing
9134a07564c1: Waiting
3bc383470c05: Preparing
c1e64df1ca95: Waiting
e93827457889: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
263924071382: Waiting
a13c519c6361: Preparing
cb58087a8c26: Waiting
fce280a823ee: Waiting
a13c519c6361: Waiting
3b9bc8b71e27: Waiting
b7c223c52431: Waiting
6c4b335ed2a4: Waiting
08fa02ce37eb: Waiting
a037458de4e0: Waiting
2483ffb6197e: Waiting
8d9a2be7195e: Waiting
147c3587f130: Waiting
e93827457889: Waiting
bafdbe68e4ae: Waiting
99a0b61c52ab: Waiting
d286824e12ea: Waiting
3476f89ccbcc: Waiting
3bc383470c05: Waiting
3bc383470c05: Layer already exists
e93827457889: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
9134a07564c1: Pushed
fce280a823ee: Pushed
6c4b335ed2a4: Pushed
c1e64df1ca95: Pushed
263924071382: Pushed
8d9a2be7195e: Pushed
99a0b61c52ab: Pushed
3b9bc8b71e27: Pushed
cb58087a8c26: Pushed
b7c223c52431: Pushed
d286824e12ea: Pushed
2483ffb6197e: Pushed
3476f89ccbcc: Pushed
0f7ebbbcd024: Pushed
147c3587f130: Pushed
20220612150806: digest: sha256:0765d875eea461af57ae6e6be0201917f6d6e22cf0e8945e2ac436ec43b30046 size: 4729
> Task :sdks:java:testing:load-tests:run
Jun 12, 2022 3:08:44 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
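
    (Aside, not part of the build output: a minimal sketch of how a staging location could be
    set explicitly so the fallback above does not kick in. It assumes Beam's standard
    DataflowPipelineOptions API; the bucket paths are taken from the staging paths seen later
    in this log and the transform is a placeholder, not the real load test.)

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class ExplicitStagingLocation {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");          // illustrative values
        options.setRegion("us-central1");
        options.setGcpTempLocation("gs://temp-storage-for-perf-tests/loadtests");
        // Setting stagingLocation explicitly avoids the gcpTempLocation fallback logged above.
        options.setStagingLocation("gs://temp-storage-for-perf-tests/loadtests/staging");

        Pipeline p = Pipeline.create(options);
        p.apply(Create.of("placeholder"));  // stand-in for the real load-test transforms
        p.run();
      }
    }
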
Jun 12, 2022 3:08:45 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 222 files. Enable logging at DEBUG level to see which
files will be staged.
Jun 12, 2022 3:08:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Jun 12, 2022 3:08:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
Jun 12, 2022 3:08:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0
seconds
Jun 12, 2022 3:08:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
Jun 12, 2022 3:08:48 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <98353 bytes, hash
161fab6ff6031050a34bf9e430b8d985c0db289a7f716154f1cb1b334fac8013> to
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Fh-rb_YDEFCjS_nkMLjZhcDbKJp_cWFU8csbM0-sgBM.pb
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as
step s1
Jun 12, 2022 3:08:50 PM
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0,
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000,
endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000,
endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000,
endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000,
endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000,
endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000,
endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000,
endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000,
endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000,
endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000,
endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000,
endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000,
endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000,
endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000,
endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000,
endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000,
endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000,
endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000,
endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000,
endOffset=20000000}]
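
    (Aside, not part of the build output: the split above is simply the [0, 20000000) offset
    range cut into 20 equal, contiguous bundles of 1,000,000 records each. A hypothetical
    stand-alone sketch of that arithmetic; OffsetRange here is an illustrative local type,
    not Beam's own class.)

    import java.util.ArrayList;
    import java.util.List;

    public class OffsetSplitSketch {
      record OffsetRange(long startOffset, long endOffset) {}

      static List<OffsetRange> split(long start, long end, int bundles) {
        long step = (end - start) / bundles;
        List<OffsetRange> result = new ArrayList<>();
        for (int i = 0; i < bundles; i++) {
          long s = start + i * step;
          long e = (i == bundles - 1) ? end : s + step;  // last bundle absorbs any remainder
          result.add(new OffsetRange(s, e));
        }
        return result;
      }

      public static void main(String[] args) {
        // Matches the log above: [0, 20000000) split into 20 bundles of 1,000,000.
        split(0L, 20_000_000L, 20).forEach(System.out::println);
      }
    }
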
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Jun 12, 2022 3:08:50 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Jun 12, 2022 3:08:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
Jun 12, 2022 3:08:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-06-12_08_08_50-27645284670861806?project=apache-beam-testing
Jun 12, 2022 3:08:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-06-12_08_08_50-27645284670861806
Jun 12, 2022 3:08:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-06-12_08_08_50-27645284670861806
Jun 12, 2022 3:09:02 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-06-12T15:08:56.162Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java170dataflow0v20streaming0gbk07-jenkins-0612-djw4. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
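
    (Aside, not part of the build output: one way to avoid the rewritten name in the warning
    above is to set a label-safe job name up front. A hedged sketch assuming Beam's standard
    PipelineOptions API; the name itself is illustrative.)

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class LabelSafeJobName {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        // Lowercase letters, digits and hyphens only, so the name is already a valid Cloud label.
        options.setJobName("load-tests-java17-dataflow-v2-streaming-gbk-7");
        System.out.println(options.getJobName());
      }
    }
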
Jun 12, 2022 3:09:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:06.394Z: Worker configuration: e2-standard-2 in
us-central1-b.
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.210Z: Expanding SplittableParDo operations into
optimizable parts.
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.259Z: Expanding CollectionToSingleton operations into
optimizable parts.
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.322Z: Expanding CoGroupByKey operations into
optimizable parts.
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.383Z: Expanding SplittableProcessKeyed operations
into optimizable parts.
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.454Z: Expanding GroupByKey operations into streaming
Read/Write steps
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.535Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.659Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.695Z: Fusing consumer Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
input/Impulse
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.720Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.741Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.774Z: Fusing consumer Read
input/ParDo(StripIds)/ParMultiDo(StripIds) into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.820Z: Fusing consumer Collect start time
metrics/ParMultiDo(TimeMonitor) into Read
input/ParDo(StripIds)/ParMultiDo(StripIds)
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.871Z: Fusing consumer Total bytes
monitor/ParMultiDo(ByteMonitor) into Collect start time
metrics/ParMultiDo(TimeMonitor)
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.936Z: Fusing consumer Window.Into()/Window.Assign
into Total bytes monitor/ParMultiDo(ByteMonitor)
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:07.974Z: Fusing consumer Group by key (0)/WriteStream
into Window.Into()/Window.Assign
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:08.011Z: Fusing consumer Group by key (0)/MergeBuckets
into Group by key (0)/ReadStream
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:08.046Z: Fusing consumer Ungroup and reiterate
(0)/ParMultiDo(UngroupAndReiterate) into Group by key (0)/MergeBuckets
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:08.069Z: Fusing consumer Collect end time metrics
(0)/ParMultiDo(TimeMonitor) into Ungroup and reiterate
(0)/ParMultiDo(UngroupAndReiterate)
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:08.149Z: Running job using Streaming Engine
Jun 12, 2022 3:09:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:08.361Z: Starting 5 workers in us-central1-b...
Jun 12, 2022 3:09:14 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:14.319Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jun 12, 2022 3:09:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:09:30.751Z: Autoscaling: Raised the number of workers to 5 so
that the pipeline can catch up with its backlog and keep up with its input rate.
Jun 12, 2022 3:10:37 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:10:36.647Z: Workers have started successfully.
Jun 12, 2022 3:15:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:15:21.764Z: Cleaning up.
Jun 12, 2022 3:15:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:15:21.838Z: Stopping worker pool...
Jun 12, 2022 3:15:23 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:15:21.904Z: Stopping worker pool...
Jun 12, 2022 3:15:55 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:15:53.224Z: Autoscaling: Reduced the number of workers to 0
based on low average worker CPU utilization, and the pipeline having sufficiently
low backlog and keeping up with input rate.
Jun 12, 2022 3:15:55 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-12T15:15:53.272Z: Worker pool stopped.
Jun 12, 2022 3:16:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
logTerminalState
INFO: Job 2022-06-12_08_08_50-27645284670861806 finished with status DONE.
Load test results for test (ID): 7f50ad1f-03f1-4c6f-a01c-e37b39a0bbfb and timestamp: 2022-06-12T15:08:45.633000000Z:

Metric:                                  Value:
dataflow_v2_java17_runtime_sec           161.869
dataflow_v2_java17_total_bytes_count     1.999998E9
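
(Rough sanity check, not part of the build output: 1.999998E9 bytes across the 20,000,000 synthetic records split earlier is about 100 bytes per record, and dividing by the 161.869 s runtime gives on the order of 12 MB/s through the GroupByKey, assuming the byte counter covers the full input.)
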
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220612150806
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0765d875eea461af57ae6e6be0201917f6d6e22cf0e8945e2ac436ec43b30046
Deleted: sha256:e1bc5b00ca0d3e921a30a069a6c519f3f7370696095203842124beb24832bd40
Deleted: sha256:70efca059d3eb61125674d961f46d09e2ada77964603f8f139da6f80e6ca84ec
Deleted: sha256:106e6df0f7b626bbd0f7ed593cb1db4e98ecda2f45bbfe74d74f2213ec44767b
Deleted: sha256:dd0b5ca6577154fa6247274117336a96267b274ecd1c9924b30fe9d9a38acca6
Deleted: sha256:a08e98681f1f30b78e7f96fbef79301c294865c49f63baa65efdac50a199fd69
Deleted: sha256:e64051bdf5abccf788ebfe427b11d1b83239779341dc8583ba87beb67f376888
Deleted: sha256:765642749830017b1fa070de96c97afabbe711235bccf3299481b61fe1bdd0f5
Deleted: sha256:31783d708abec46d85623d9497ff5aa85ef2045590d725a960b015412948f24b
Deleted: sha256:f37c4a0c820b39db09c33d30ca3366e0435e4982c9b03b023babf5e15158c77b
Deleted: sha256:4c2cd08b285657621d2ab0ffdcaedba8e07c93476a3061ffff92587db7407ac1
Deleted: sha256:1fcee3d0195732ecacbc8b935489bfed0b33583c3d23d41f58c1aff1d6ad5469
Deleted: sha256:19b9625391001a95290a3d05c52e1f16da3c69e727df6da362ab251aab089793
Deleted: sha256:ad828bad5f83f3fd5c2b38171ef5bcc7da9fbdff35b1978fba4bf8f7a448db17
Deleted: sha256:0fb3aa4ff6d17bb5ebfc7d99cb2ec5399f7fb21bc9ea17537a21ddc05c47e63a
Deleted: sha256:922ee62dce474012953342aa1a2a65ea7eeb9a8c79f91f84b51e10587e986de2
Deleted: sha256:f6d1813fc446ef970bc046ca8db62734a5b2043d55414656640245a7011d6131
Deleted: sha256:671a724941ec33db198518118b61b5dadafd3425f92f7de2ff83f91131df7786
Deleted: sha256:ba3657098a7941c2b1172e2fea324ace25ba86b9ceedf7ba5c3e1ec1c8acdcd9
Deleted: sha256:fb85925047b1519bf3d4fd3c4c32725c6c6d4174a8a3e503a3d31e32d4c44ab7
Deleted: sha256:23591b4939b2106be72b3deeffd236253e58073e78aafb829c0e8cc0d04052c0
Deleted: sha256:2c485670aa492ad1f9d5f13ef503763a08cde734f0f957844364af179f8349a0
Deleted: sha256:2dbd1848a329099f304de1298c22ebbaff1431aae2c321dd086869dde3ae7e2b
Deleted: sha256:14c8d37130d760d2bd79c4ae0cbea1c53f18afd9a8178d2c981915c4cb385415
Deleted: sha256:d13202e292f4712d087776f43c3e26aa4fd94d6d09d374df79dadc9504da647c
Deleted: sha256:e7f07e6fb4fa6ab1a971dc693263a76d3153c4ba2cefa21db5a9e3d74f22f17a
Deleted: sha256:6ab67cdf036c4e51b1217f6dda5f7e1f3e68333bd4659718640d57edff36c19b
Deleted: sha256:13ab250fe586cbac4f6eeb8b6b424294680cdc41530916c2081ee76a166b276c
Deleted: sha256:1776b51e2a0fa83c1a94873eaef5da0133840873769cf2ca3d62add9b9e50f00
Deleted: sha256:4a6ff91e7274968a6e12f6d541857ae2e42226d56444c7dd53e0b1bbc10ac5db
Deleted: sha256:9470ff10531f19db734374c8aedbe514340eb63ce77149d094829d274cd70d76
Deleted: sha256:f8f925f75ebf81c25c28cf2a6690661dbf0404f9fc3a32a983af61f8d5354a6a
Deleted: sha256:a65a13e902149d529abab0e1f38bf8f6271ccf36f635969691618636b11ff1b8
Deleted: sha256:49d0fd4509613ae6f618325a21d99ba444e32dd1d8029ebdaa71bf2c9a7973be
Deleted: sha256:02acb26b1508c0191bf3b1f983b03afe67f5f0678794c0369973940087bd6105
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220612150806] - referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0765d875eea461af57ae6e6be0201917f6d6e22cf0e8945e2ac436ec43b30046]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220612150806] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0765d875eea461af57ae6e6be0201917f6d6e22cf0e8945e2ac436ec43b30046])].
Removing untagged image
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45419480c969733a60534b38124b893b938345e1034c61cc95312bdfb3a0553a
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:45419480c969733a60534b38124b893b938345e1034c61cc95312bdfb3a0553a
ERROR: (gcloud.container.images.delete) Not found: response: {'docker-distribution-api-version': 'registry/2.0', 'content-type': 'application/json', 'date': 'Sun, 12 Jun 2022 15:16:08 GMT', 'server': 'Docker Registry', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '168', '-content-encoding': 'gzip'}
Failed to compute blob liveness for manifest: 'sha256:45419480c969733a60534b38124b893b938345e1034c61cc95312bdfb3a0553a': None
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_V2_Streaming_Java17/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 297
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 8m 3s
105 actionable tasks: 8 executed, 97 up-to-date
Publishing build scan...
https://gradle.com/s/3rokwyk6nflh6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]