See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/42/display/redirect?page=changes>

Changes:

[kawaigin] [BEAM-12506] Changed WindowedValueHolder into a Row type

[noreply] [BEAM-1833] Preserve inputs names at graph construction and through


------------------------------------------
[...truncated 66.42 KB...]
d804421bba51: Pushed
8dd76cebc218: Pushed
c1189207c0a6: Pushed
7b35519714b4: Pushed
79e60fbde3ba: Pushed
fcf12960317b: Pushed
f42aed5f7feb: Layer already exists
89819bafde36: Layer already exists
f3d5b8f65132: Layer already exists
ad83f0aa5c0a: Layer already exists
5a9a65095453: Layer already exists
8e6937c7b5f4: Pushed
4b0edb23340c: Layer already exists
afa3e488a0ee: Layer already exists
26adf5cfc60c: Pushed
c3ee2ce7c61f: Pushed
97fd01c7a8b7: Pushed
20210729123732: digest: sha256:c4bc1ace1afe78dd8745a3424b85178e2d647c09268eb8d233319c4f02f1f4e1 size: 4310

> Task :sdks:java:testing:load-tests:run
Jul 29, 2021 12:42:10 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 29, 2021 12:42:11 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 191 files. Enable logging at DEBUG level to see which 
files will be staged.
Jul 29, 2021 12:42:11 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: 
ParDo(TimeMonitor)
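The warning above fires because the same DoFn wrapper is applied twice without an explicit name (note the second instance later appears renamed as "ParDo(TimeMonitor)2"); Beam validation collects transform names and flags duplicates, since update compatibility keys on stable step names. A Beam-free sketch of that uniqueness check (class and method names here are illustrative assumptions, not the SDK's internals):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative sketch of the "stable unique names" validation the WARNING
// above refers to: collect transform names and report any that occur more
// than once. This mirrors the log, where two unnamed ParDo(TimeMonitor)
// applications collide and one is auto-renamed "ParDo(TimeMonitor)2".
public class UniqueNameCheck {
    public static Set<String> duplicateNames(List<String> transformNames) {
        Set<String> seen = new HashSet<>();
        Set<String> dups = new HashSet<>();
        for (String name : transformNames) {
            if (!seen.add(name)) {  // add() returns false on a repeat
                dups.add(name);
            }
        }
        return dups;
    }
}
```

The usual fix is to pass an explicit name when applying the transform (Beam's `apply` has an overload taking a name string), so each application gets a distinct, stable step name.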
Jul 29, 2021 12:42:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Jul 29, 2021 12:42:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Jul 29, 2021 12:42:14 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <107856 bytes, hash 
6256722a1c67131926e22b9d3b9512c7ee73c40f5373fe2835f950b3d542cd18> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YlZyKhxnExkm4iudO5USx-5zxA9Tc_4oNflQs9VCzRg.pb
Jul 29, 2021 12:42:16 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 191 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Jul 29, 2021 12:42:16 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 191 files cached, 0 files newly uploaded in 0 
seconds
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Jul 29, 2021 12:42:16 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ecec90d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@588f63c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a6fa56e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1981d861, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@118ffcfd, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53f4c1e6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74174a23, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6342d610, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dc4a691, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@784abd3e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@36c2b646, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@434514d8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b1dc20f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4613311f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6540cf1d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ec8f4b9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@bc042d5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5484117b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37df14d1, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7efb53af]
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Jul 29, 2021 12:42:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Jul 29, 2021 12:42:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
Jul 29, 2021 12:42:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-07-29_05_42_17-14165250001596814233?project=apache-beam-testing
Jul 29, 2021 12:42:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-07-29_05_42_17-14165250001596814233
Jul 29, 2021 12:42:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
> --region=us-central1 2021-07-29_05_42_17-14165250001596814233
Jul 29, 2021 12:42:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-07-29T12:42:25.155Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0pardo01-jenkins-07-b6ov. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
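The warning above is about Cloud Label restrictions: label values are limited to lowercase letters, digits, and hyphens, with a 63-character cap, which is why the underscores and uppercase letters in the job name were rewritten (underscores became '0' in the modified name shown). A minimal sketch of such a check, assuming the pattern below; the class and method names are illustrative, not Beam's or Dataflow's:

```java
import java.util.regex.Pattern;

// Hedged sketch of the label-value check implied by the WARNING above.
// Assumed pattern: starts with a lowercase letter, contains only lowercase
// letters, digits, and hyphens, ends with a letter or digit, at most 63
// characters. The exact rule Dataflow applies may differ in detail.
public class CloudLabelCheck {
    private static final Pattern LABEL_VALUE =
        Pattern.compile("[a-z]([-a-z0-9]{0,61}[a-z0-9])?");

    public static boolean isValidLabelValue(String value) {
        return LABEL_VALUE.matcher(value).matches();
    }
}
```

Under this pattern the modified name from the log passes, while a name containing underscores or uppercase letters (as the original workflow name presumably did) does not.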
Jul 29, 2021 12:42:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:29.168Z: Worker configuration: e2-standard-2 in 
us-central1-b.
Jul 29, 2021 12:42:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:29.831Z: Expanding SplittableParDo operations into 
optimizable parts.
Jul 29, 2021 12:42:29 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:29.900Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.009Z: Expanding CoGroupByKey operations into 
optimizable parts.
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.045Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.074Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.100Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.161Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.180Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.235Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.270Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.341Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.405Z: Fusing consumer 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.439Z: Fusing consumer 
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.471Z: Fusing consumer Step: 
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.504Z: Fusing consumer Step: 
1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.536Z: Fusing consumer Step: 
2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.573Z: Fusing consumer Step: 
3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.633Z: Fusing consumer Step: 
4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.721Z: Fusing consumer Step: 
5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.760Z: Fusing consumer Step: 
6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.794Z: Fusing consumer Step: 
7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.824Z: Fusing consumer Step: 
8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.860Z: Fusing consumer Step: 
9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:30.898Z: Fusing consumer 
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 
9/ParMultiDo(CounterOperation)
Jul 29, 2021 12:42:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:31.159Z: Starting 5 workers in us-central1-b...
Jul 29, 2021 12:42:55 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:42:55.113Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jul 29, 2021 12:43:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:43:15.288Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Jul 29, 2021 12:44:11 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:44:09.616Z: Workers have started successfully.
Jul 29, 2021 12:44:11 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T12:44:09.653Z: Workers have started successfully.
Jul 29, 2021 4:00:40 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T16:00:37.820Z: Cancel request is committed for workflow job: 
2021-07-29_05_42_17-14165250001596814233.
Jul 29, 2021 4:00:40 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T16:00:37.898Z: Cleaning up.
Jul 29, 2021 4:00:40 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T16:00:37.947Z: Stopping worker pool...
Jul 29, 2021 4:00:40 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T16:00:37.996Z: Stopping worker pool...
Jul 29, 2021 4:02:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T16:02:57.163Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Jul 29, 2021 4:02:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-07-29T16:02:57.196Z: Worker pool stopped.
Jul 29, 2021 4:03:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-07-29_05_42_17-14165250001596814233 finished with status 
CANCELLED.
Jul 29, 2021 4:03:03 PM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 8adde985-eb4a-47e9-ad81-6c049e3374f8 and 
timestamp: 2021-07-29T12:42:11.385000000Z:
                 Metric:                     Value:
dataflow_v2_java11_runtime_sec                 0.0
dataflow_v2_java11_total_bytes_count          -1.0
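The -1.0 in the results table reflects the SEVERE metric-lookup failure logged just above it: when the counter cannot be read from the cancelled job, the reader reports a sentinel rather than aborting metric collection. A hedged sketch of that fallback (names are illustrative; Beam's actual MetricsReader differs in detail):

```java
import java.util.Map;
import java.util.logging.Logger;

// Illustrative sketch of the fallback behavior suggested by the log: a
// counter metric that cannot be found ("totalBytes.count" here) is logged
// as an error and reported as -1, which surfaces as -1.0 in the results
// table. Names and structure are assumptions, not the Beam implementation.
public class CounterFallback {
    private static final Logger LOG = Logger.getLogger("CounterFallback");

    public static long getCounterOrSentinel(Map<String, Long> counters, String name) {
        Long value = counters.get(name);
        if (value == null) {
            LOG.severe("Failed to get metric " + name);
            return -1L;  // sentinel for "metric unavailable"
        }
        return value;
    }
}
```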
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at 
org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
        at 
org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210729123732
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c4bc1ace1afe78dd8745a3424b85178e2d647c09268eb8d233319c4f02f1f4e1
Deleted: sha256:8283706aabdf0cb16523b4282698c57e57e776bff67a70b8c89eda5bd1f481ed
Deleted: sha256:aee721aa4ce987da8b74c9b375ee9272d5c849f819380d46441275f45bb8ccc5
Deleted: sha256:24854ba7466520c9f471d30c39d622eda34e8e35977a1408a710012873acfa1c
Deleted: sha256:70189087da0d101969d7862b5b3d2e8fe0f624113bdde301200e10049f264082
Deleted: sha256:e7224d11b42011cdc39fc96353b5e49bff95f6736c0851d158598b2a9ca5a7cc
Deleted: sha256:fe46bc86337376a9f2610578daa0df30ccdcf3846ad105d3b4675555d779b7b1
Deleted: sha256:b738487d1974470ec41d008788906382aa3f38d98ba621e64fa0e8406f203e06
Deleted: sha256:9790f838058c3ed9cb1fe86ab6021503a4dea1c5b5677fbf633cc99db9dead58
Deleted: sha256:e1ef8ce7cfef6c0c3bd455244dfe82fef3da8298f9cbb818cf81b8e122ee81e8
Deleted: sha256:e8bd96fe4894f30edddfc8f9b22fc5a4165ddf52cca6e95fbc96362790d50a65
Deleted: sha256:c0c2535789e0a22cdec827d817d5bced6601b39d22b9cbca4dd7e0694fcb3c94
Deleted: sha256:ecc45aca258e3bebcae4940fb47bc626e7acf55372e2287b3dbce44ecf42f157
Deleted: sha256:5017a8d5eb3871088637280ca46b3f0afe89392b3e76f2d4a419c39713d22a3c
Deleted: sha256:98dac53c94e5f28b14b2f2490774546513c75f08ad8d621b608cfd36e84bc5f6
Deleted: sha256:27bde708752776ca75730c25f7433752f93b3baffbe548b6164e0a6264e010cb
Deleted: sha256:04d67b4e6a7445d7d852f043d3fe50e49bdf783e8634fcdee116fdb75077978d
Deleted: sha256:9a0cd6c84a410f1268dce7d413a583e90303b60798708a11e8e372650d4d719c
Deleted: sha256:cbdeccd6611baba82efd3bd32fad080b919926e446ce9f3b01d6ea69532591ba
Deleted: sha256:60ef9c4fdeb37b80b5d1da83092dec1fcf3b82354fd775e97098aadbfab4eb61
Deleted: sha256:481fd34158024957cf142251dd2e7331168bf066fd0addf249f634d94db3529a
Deleted: sha256:a2b9970ea78ca0db8959065f2bb44f7ba83874de6ff7bb1ae25d272659f1c6ae
Deleted: sha256:1d245c976ee97cb042ba3cf6003ec195b6aa812e352b70dbc927701eca6d9842
Deleted: sha256:ab34491ff9889d4a0cd2084ba24880ca33d5df69f0094005aa34efa88a200072
Deleted: sha256:9d4c7e1d12c5d23f9205dd1326f223eee8c3253a75b2161ae993b1577ae75b38
Deleted: sha256:5feb4f2fbd3afe73b9f5af79c1cffbd39357a65153fdea9c5fd198eca6f10f71
Deleted: sha256:812c1cb058fcd250b40d45d73a5d4922630d657a7ae9fcfaf9ff2d23fb2c7bbb
Deleted: sha256:93428f5d6e7146bb4085a9a19a0c88a231867208ff24bf33b9a9c504d6d1b3a4
Deleted: sha256:ec9cff72bafca6b1b2945fd6f24af8c7e0b6a726a3af2fcf1379eb8e46f4f360
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210729123732]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c4bc1ace1afe78dd8745a3424b85178e2d647c09268eb8d233319c4f02f1f4e1]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210729123732] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:c4bc1ace1afe78dd8745a3424b85178e2d647c09268eb8d233319c4f02f1f4e1])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26719ece63b94df0119620793bf549899243a0a5ce994b75b032028df3a42454
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:26719ece63b94df0119620793bf549899243a0a5ce994b75b032028df3a42454
ERROR: (gcloud.container.images.delete) Not found: response: {'status': '404', 
'content-length': '168', 'x-xss-protection': '0', 'transfer-encoding': 
'chunked', 'server': 'Docker Registry', '-content-encoding': 'gzip', 
'docker-distribution-api-version': 'registry/2.0', 'cache-control': 'private', 
'date': 'Thu, 29 Jul 2021 16:03:26 GMT', 'x-frame-options': 'SAMEORIGIN', 
'content-type': 'application/json'}
Failed to compute blob liveness for manifest: 
'sha256:26719ece63b94df0119620793bf549899243a0a5ce994b75b032028df3a42454': None

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle>'
 line: 284

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command './scripts/cleanup_untagged_gcr_images.sh'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 26m 9s
107 actionable tasks: 76 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/atfnqo5dcubb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
