See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/1/display/redirect>
Changes:
------------------------------------------
[...truncated 64.10 KB...]
6f4b8d0ccb33: Preparing
2f948c1ec50e: Preparing
91c39303ef95: Preparing
8f3d8e8fcbfb: Preparing
4a967249661f: Preparing
ec3443a6202a: Preparing
157a55002223: Preparing
fab4fcc1a455: Preparing
9ffdcecb36e4: Preparing
47ff11980281: Preparing
d32e0e5cb574: Preparing
15786a1cf1cb: Preparing
6f770cdc9ebf: Preparing
3fc095fab4a2: Preparing
685934357c89: Preparing
47ff11980281: Waiting
ccb9b68523fd: Preparing
157a55002223: Waiting
00bcea93703b: Preparing
fab4fcc1a455: Waiting
688e187d6c79: Preparing
ec3443a6202a: Waiting
9ffdcecb36e4: Waiting
4a967249661f: Waiting
6f770cdc9ebf: Waiting
d32e0e5cb574: Waiting
15786a1cf1cb: Waiting
685934357c89: Waiting
3fc095fab4a2: Waiting
688e187d6c79: Waiting
00bcea93703b: Waiting
ccb9b68523fd: Waiting
8f3d8e8fcbfb: Pushed
6f4b8d0ccb33: Pushed
2f948c1ec50e: Pushed
4a967249661f: Pushed
159070aa5f25: Pushed
91c39303ef95: Pushed
157a55002223: Pushed
fab4fcc1a455: Pushed
15786a1cf1cb: Layer already exists
6f770cdc9ebf: Layer already exists
3fc095fab4a2: Layer already exists
685934357c89: Layer already exists
ccb9b68523fd: Layer already exists
00bcea93703b: Layer already exists
47ff11980281: Pushed
688e187d6c79: Layer already exists
d32e0e5cb574: Pushed
ec3443a6202a: Pushed
9ffdcecb36e4: Pushed
20210618123733: digest: sha256:33f2ae36138cda6168586fe2a8a5cfa5330ddcb938fe0bce0f59da8e3121b29b size: 4310

> Task :sdks:java:testing:load-tests:run
Jun 18, 2021 12:46:21 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jun 18, 2021 12:46:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 190 files. Enable logging at DEBUG level to see which files will be staged.
Jun 18, 2021 12:46:22 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Jun 18, 2021 12:46:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jun 18, 2021 12:46:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jun 18, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <107487 bytes, hash 743670d64845c708916902d69f82f231b765968396369481baa782de52a6c3b1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dDZw1khFxwiRaQLWn4LyMbdlloOWNpSBuqeC3lKmw7E.pb
Jun 18, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 190 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 190 files cached, 0 files newly uploaded in 0 seconds
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jun 18, 2021 12:46:26 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4745e9c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2f2bff16, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75de29c0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fc807c1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@296e281a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59cda16e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5dd903be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12e0f1cb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4a163575, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7e642b88, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b350309, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ecec90d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@588f63c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5a6fa56e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1981d861, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@118ffcfd, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53f4c1e6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74174a23, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6342d610, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dc4a691]
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Jun 18, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Jun 18, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.32.0-SNAPSHOT
Jun 18, 2021 12:46:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-18_05_46_27-883050088939975825?project=apache-beam-testing
Jun 18, 2021 12:46:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-06-18_05_46_27-883050088939975825
Jun 18, 2021 12:46:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-06-18_05_46_27-883050088939975825
Jun 18, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-06-18T12:46:32.120Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo01-jenkins-06-sb5g. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jun 18, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:35.441Z: Worker configuration: n1-standard-2 in us-central1-f.
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.264Z: Expanding SplittableParDo operations into optimizable parts.
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.288Z: Expanding CollectionToSingleton operations into optimizable parts.
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.356Z: Expanding CoGroupByKey operations into optimizable parts.
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.386Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.418Z: Expanding GroupByKey operations into streaming Read/Write steps
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.453Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.515Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.543Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.576Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.598Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.629Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.664Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.697Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.732Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.776Z: Fusing consumer Step: 1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.801Z: Fusing consumer Step: 2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.836Z: Fusing consumer Step: 3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.859Z: Fusing consumer Step: 4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.880Z: Fusing consumer Step: 5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.912Z: Fusing consumer Step: 6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.952Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:36.987Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:37.041Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:37.083Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Jun 18, 2021 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:46:37.347Z: Starting 5 workers in us-central1-f...
Jun 18, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:47:10.638Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jun 18, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:47:22.122Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jun 18, 2021 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:47:55.192Z: Workers have started successfully.
Jun 18, 2021 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T12:47:55.226Z: Workers have started successfully.
Jun 18, 2021 3:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T15:00:40.560Z: Cleaning up.
Jun 18, 2021 3:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T15:00:40.706Z: Stopping worker pool...
Jun 18, 2021 3:00:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T15:00:40.787Z: Stopping worker pool...
Jun 18, 2021 3:01:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T15:01:48.086Z: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jun 18, 2021 3:01:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-06-18T15:01:48.116Z: Worker pool stopped.
Jun 18, 2021 3:01:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-06-18_05_46_27-883050088939975825 finished with status DONE.
Load test results for test (ID): 9e0a26c7-9edb-4ecf-a9a3-99155dc50e75 and timestamp: 2021-06-18T12:46:21.947000000Z:
                               Metric:           Value:
        dataflow_v2_java11_runtime_sec          7860.64
  dataflow_v2_java11_total_bytes_count    3.52237251E11

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210618123733
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:33f2ae36138cda6168586fe2a8a5cfa5330ddcb938fe0bce0f59da8e3121b29b
Deleted: sha256:78dfd13e477480d4912b053ab21d8b46e716de53a1012337858c26f2ba24e4b6
Deleted: sha256:485b1c691c7eac81c8ea3a4f42f61e85c6e94225844c211ec9e43b0cd1d8ed69
Deleted: sha256:9cc2990db5c8cc9e45ea51128eec47702f67789352391304273da65295335e9a
Deleted: sha256:64565ade0ee16a8825a7ab068ba55c2bf2363806307ae5883efa97bfa8d73e4a
Deleted: sha256:32a2df9323ab323b1ee9b91323265b19b712b7491c73105ef6c6a6f662c63ee7
Deleted: sha256:d28f211a7f10907ff8a552c18eaf35268d8dc94e2fdd4d34562b7a6db24d8e28
Deleted: sha256:af52b2364c2a534a3e8ebb4b1b8d20b607c24d8fbcf94a44b971c8d4228992ad
Deleted: sha256:0723458b0c2a9e18c7c0bc5b6e48f9a774c09197b2ab1a590bedcad3d583f065
Deleted: sha256:ee6ff8cbb907e69349fe1228aaaf27ef5c0fbc39b856568c356bb68293f00fd7
Deleted: sha256:acd6c616f79929af87894c0c659b9ed54d17301132d2e61f8d93d07a4db94cd2
Deleted: sha256:e032fa81b8d8e1a33b568d4e035ce0c0876896b396d33400244e991f45ebc4c2
Deleted: sha256:e4aa5ec75d03e22ddae65d0c494aa43209d01579f52252d13a577e741658ff2f
Deleted: sha256:7ac79c7f73ec89167d41f61c84fe1042c05a51fce47e536a0fa9264bf5c90f1d
Deleted: sha256:492584d54dfc0f8e6f0898a444601137ba940e369a02f922934c1383ac8e57a3
Deleted: sha256:a70aecc9da55eb7e6cd582bf73771e77e74ec00c46e3eef65537b2359fba70b0
Deleted: sha256:f13b53809af131c2fb7dc53e81c6299567cb10d8cddfd14663d010c40d9efa72
Deleted: sha256:9e1de6a2eaf42f0eb67486f74f67d7b1eea5844ca360bdf2bede569f3af170fc
Deleted: sha256:a5e7fd917e45b0077527732b8e757ba3f288ad34dec8ab815fbf63694ca503fc
Deleted: sha256:c8a9b4469dee14c78cf3d4cc4ec19b34cce747145ec5eb467acd360b8f9c5e50
Deleted: sha256:6b115d2cf0637062bab0fa08b2f5a0d4c273075b531e3e20281838a80c0802b4
Deleted: sha256:64e1e24e7f4260986880c81ed0918131222f5db6d697ce1709a4bda7f3662c23
Deleted: sha256:19d9ef9f9b779cf02230bc2ba3f3e458534639072506f752acbd09f121feef82
Deleted: sha256:e27cb608c2179fad3b8346234eb635fe058409b28cc16a546b408802ee6cd443
Deleted: sha256:528cc8889c87fe851f1f3153e2c7898964170fabe29592fde367a0d964e826d7
Deleted: sha256:f7dfb4083354caf732e378f219bbe05bbed080aa4a2ec254a333c16949a40ffe
Deleted: sha256:50b29db0aeeed29e6e805ab30342fc2cf5fd534bd33ce25c62fe8274fa1220fc
Deleted: sha256:d405273bd61f4325251de81427d577fb6c0eccb7326de9b190703e2b567003ba
Deleted: sha256:a34d5732eec6b014467edcc3661f2916e24af3f1dc2c7dff6c4d4fb59056d884
ERROR: (gcloud.container.images.delete) [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210618123733] is not a valid name. Expected tag in the form "base:tag" or "tag" or digest in the form "sha256:<digest>"
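For context on the error above: 'gcloud container images delete' accepts an image reference either with a tag (IMAGE:TAG) or with a digest (IMAGE@sha256:DIGEST). A minimal sketch of a delete-by-digest invocation that matches the second form, reusing the digest already printed earlier in this log; the --force-delete-tags and --quiet flags are standard gcloud options added only for illustration, and this is not necessarily the exact command the cleanUpDockerImages task at build.gradle line 279 runs:

    # Illustrative only: delete the test image by digest (the form the error message expects).
    gcloud container images delete \
      us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:33f2ae36138cda6168586fe2a8a5cfa5330ddcb938fe0bce0f59da8e3121b29b \
      --force-delete-tags --quiet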
FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 279

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 24m 36s
104 actionable tasks: 74 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w5a5ltjh2hgec

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
