See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/50/display/redirect?page=changes>

Changes:

[emilyye] bump FnAPI container

[noreply] [BEAM-12601] Add append-only option (#15257)

[noreply] Revert "[BEAM-11934] Remove Dataflow override of streaming WriteFiles

[andyxu] Add google cloud heap profiling support to beam java sdk container

[Ismaël Mejía] [BEAM-12628] Add Avro reflect-based Coder option


------------------------------------------
[...truncated 63.74 KB...]

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 
18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
08341044b606: Preparing
903d95bdf484: Preparing
6f1e82fceb00: Preparing
427373174f4b: Preparing
8d8e574b2551: Preparing
d209d6e30c68: Preparing
a7aa8d4ca0c3: Preparing
6e042233be87: Preparing
cd68adf8437f: Preparing
e8231fd8e488: Preparing
5fca4478b51f: Preparing
94cf2581f595: Preparing
f42aed5f7feb: Preparing
89819bafde36: Preparing
f3d5b8f65132: Preparing
ad83f0aa5c0a: Preparing
5a9a65095453: Preparing
4b0edb23340c: Preparing
afa3e488a0ee: Preparing
d209d6e30c68: Waiting
a7aa8d4ca0c3: Waiting
4b0edb23340c: Waiting
6e042233be87: Waiting
afa3e488a0ee: Waiting
f3d5b8f65132: Waiting
cd68adf8437f: Waiting
5a9a65095453: Waiting
ad83f0aa5c0a: Waiting
e8231fd8e488: Waiting
89819bafde36: Waiting
5fca4478b51f: Waiting
94cf2581f595: Waiting
903d95bdf484: Pushed
6f1e82fceb00: Pushed
8d8e574b2551: Pushed
08341044b606: Pushed
d209d6e30c68: Pushed
427373174f4b: Pushed
6e042233be87: Pushed
cd68adf8437f: Pushed
f42aed5f7feb: Layer already exists
5fca4478b51f: Pushed
89819bafde36: Layer already exists
f3d5b8f65132: Layer already exists
ad83f0aa5c0a: Layer already exists
5a9a65095453: Layer already exists
4b0edb23340c: Layer already exists
afa3e488a0ee: Layer already exists
94cf2581f595: Pushed
a7aa8d4ca0c3: Pushed
e8231fd8e488: Pushed
20210806125014: digest: 
sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36 size: 
4310

> Task :sdks:java:testing:load-tests:run
Aug 06, 2021 12:54:49 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
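A minimal sketch, not taken from this build: stagingLocation can be set
explicitly on DataflowPipelineOptions so the runner does not need this
fallback. The bucket paths below are illustrative.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StagingLocationExample {
      public static void main(String[] args) {
        // Parse command-line flags into Dataflow options.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // An explicit stagingLocation avoids the fallback to gcpTempLocation.
        options.setGcpTempLocation("gs://temp-storage-for-perf-tests/loadtests");
        options.setStagingLocation("gs://temp-storage-for-perf-tests/loadtests/staging");
      }
    }
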
Aug 06, 2021 12:54:50 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 191 files. Enable logging at DEBUG level to see which 
files will be staged.
Aug 06, 2021 12:54:50 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: 
ParDo(TimeMonitor)
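A minimal sketch, assuming a PCollection named input: applying each ParDo with
an explicit, unique name gives the transform a stable unique name, which clears
this warning and preserves update compatibility for streaming jobs. The name and
pass-through DoFn below are illustrative, not the load test's TimeMonitor.

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    public class UniqueNameExample {
      static PCollection<byte[]> addMonitor(PCollection<byte[]> input) {
        return input.apply(
            "Monitor time (start)",  // explicit name -> stable unique step name
            ParDo.of(
                new DoFn<byte[], byte[]>() {
                  @ProcessElement
                  public void processElement(
                      @Element byte[] element, OutputReceiver<byte[]> out) {
                    out.output(element);
                  }
                }));
      }
    }
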
Aug 06, 2021 12:54:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Aug 06, 2021 12:54:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 06, 2021 12:54:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <107918 bytes, hash 
20600528897a3f07b32e0466436a93dbc8ea7dfcf029676909534a6a03e33e85> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IGAFKIl6PwezLgRmQ2qT28jqffzwKWdpCVNKagPjPoU.pb
Aug 06, 2021 12:54:54 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 191 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Aug 06, 2021 12:54:55 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 191 files cached, 0 files newly uploaded in 0 
seconds
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Aug 06, 2021 12:54:55 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6342d610, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@dc4a691, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@784abd3e, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@36c2b646, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@434514d8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6b1dc20f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4613311f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6540cf1d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ec8f4b9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@bc042d5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5484117b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@37df14d1, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7efb53af, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7724704f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3dfa819, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4ce94d2f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68ab0936, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3cd9aa64, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@42b84286, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@443effcb]
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Aug 06, 2021 12:54:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
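Steps s5 through s14 above are ten chained ParDos named "Step: 0" through
"Step: 9". A hedged sketch of how such a chain might be built; the pass-through
DoFn is illustrative, not the load test's CounterOperation.

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    public class StepChainExample {
      static PCollection<KV<byte[], byte[]>> addSteps(
          PCollection<KV<byte[], byte[]>> input, int numberOfSteps) {
        PCollection<KV<byte[], byte[]>> output = input;
        for (int i = 0; i < numberOfSteps; i++) {
          // Each application gets a distinct name so step names stay unique.
          output =
              output.apply(
                  "Step: " + i,
                  ParDo.of(
                      new DoFn<KV<byte[], byte[]>, KV<byte[], byte[]>>() {
                        @ProcessElement
                        public void processElement(
                            @Element KV<byte[], byte[]> element,
                            OutputReceiver<KV<byte[], byte[]>> out) {
                          out.output(element);
                        }
                      }));
        }
        return output;
      }
    }
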
Aug 06, 2021 12:54:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
Aug 06, 2021 12:54:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-06_05_54_55-9626435663495136389?project=apache-beam-testing
Aug 06, 2021 12:54:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-08-06_05_54_55-9626435663495136389
Aug 06, 2021 12:54:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel 
> --region=us-central1 2021-08-06_05_54_55-9626435663495136389
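Besides the gcloud command above, the job can also be cancelled from code
through the PipelineResult handle returned by run(). A minimal sketch, assuming
an already-constructed Pipeline:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelExample {
      static void runAndCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // ... later, e.g. once a test-harness timeout has elapsed:
        result.cancel();
      }
    }
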
Aug 06, 2021 12:55:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-08-06T12:55:02.129Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0pardo01-jenkins-08-31p7. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
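A minimal sketch of naming the job so that it is already a valid Cloud Label
(lowercase letters, digits, and dashes only); the concrete name below is
illustrative.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class JobNameExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // A label-safe job name avoids the modified-name warning above.
        options.setJobName("load-tests-java11-dataflow-v2-streaming-pardo-1");
      }
    }
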
Aug 06, 2021 12:55:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:07.763Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Aug 06, 2021 12:55:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.396Z: Expanding SplittableParDo operations into 
optimizable parts.
Aug 06, 2021 12:55:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.425Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Aug 06, 2021 12:55:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.497Z: Expanding CoGroupByKey operations into 
optimizable parts.
Aug 06, 2021 12:55:08 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.531Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.568Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.601Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.685Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.728Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.773Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.811Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.849Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.882Z: Fusing consumer 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.920Z: Fusing consumer 
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.956Z: Fusing consumer Step: 
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:08.990Z: Fusing consumer Step: 
1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.023Z: Fusing consumer Step: 
2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.052Z: Fusing consumer Step: 
3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.081Z: Fusing consumer Step: 
4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.116Z: Fusing consumer Step: 
5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.149Z: Fusing consumer Step: 
6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.175Z: Fusing consumer Step: 
7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.220Z: Fusing consumer Step: 
8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.245Z: Fusing consumer Step: 
9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.279Z: Fusing consumer 
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 
9/ParMultiDo(CounterOperation)
Aug 06, 2021 12:55:09 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:09.648Z: Starting 5 workers in us-central1-a...

Aug 06, 2021 12:55:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:40.050Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 06, 2021 12:55:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:55:53.010Z: Autoscaling: Raised the number of workers to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Aug 06, 2021 12:56:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:56:47.306Z: Workers have started successfully.
Aug 06, 2021 12:56:48 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T12:56:47.345Z: Workers have started successfully.
Aug 06, 2021 4:01:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T16:01:01.207Z: Cancel request is committed for workflow job: 
2021-08-06_05_54_55-9626435663495136389.
Aug 06, 2021 4:01:02 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T16:01:02.093Z: Cleaning up.
Aug 06, 2021 4:01:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T16:01:02.219Z: Stopping worker pool...
Aug 06, 2021 4:01:04 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T16:01:02.299Z: Stopping worker pool...
Aug 06, 2021 4:03:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T16:03:13.125Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Aug 06, 2021 4:03:15 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-08-06T16:03:13.161Z: Worker pool stopped.
Aug 06, 2021 4:03:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-08-06_05_54_55-9626435663495136389 finished with status 
CANCELLED.
Aug 06, 2021 4:03:25 PM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): c77872c6-21af-408f-adc5-1907c7363b2b and 
timestamp: 2021-08-06T12:54:50.547000000Z:
                                 Metric:    Value:
          dataflow_v2_java11_runtime_sec       0.0
    dataflow_v2_java11_total_bytes_count      -1.0
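The -1.0 above is the load test's sentinel for a counter it could not read. A
minimal sketch of querying such a user counter from a PipelineResult with the
Beam metrics API; the namespace and name match the log, but the helper is
illustrative, not MetricsReader itself.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class CounterQueryExample {
      static long readTotalBytes(PipelineResult result) {
        MetricQueryResults metrics =
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named("pardo", "totalBytes.count"))
                        .build());
        long total = -1L;  // sentinel when the counter is missing
        for (MetricResult<Long> counter : metrics.getCounters()) {
          total = counter.getAttempted();
        }
        return total;
      }
    }
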
Exception in thread "main" java.lang.RuntimeException: Invalid job state: 
CANCELLED.
        at 
org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at 
org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
        at 
org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)
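A minimal sketch of the kind of terminal-state check that produces this failure:
any terminal state other than DONE (here CANCELLED, after the harness cancelled
the streaming job) is treated as a failed run. Illustrative only, not the
JobFailure source.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class TerminalStateCheckExample {
      static void runAndVerify(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish();
        if (state != PipelineResult.State.DONE) {
          throw new RuntimeException("Invalid job state: " + state);
        }
      }
    }
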

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210806125014
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210806125014]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210806125014] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:b8f27bcabc4f40fd4a2af5a1715c8e1a58ce2dfe129bb023f45b92647ba32a36].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 13m 28s
107 actionable tasks: 76 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ezdjlijm56fiu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
