See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/88/display/redirect>

Changes:


------------------------------------------
[...truncated 40.78 KB...]
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:container:java11:copyDockerfileDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:java11:copySdkHarnessLauncher
> Task :sdks:java:container:generateLicenseReport

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license>
 to 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. 
Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will 
drop support for Python 3.5 in January 2021. pip 21.0 will remove support for 
this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, 
beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 
six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py>
--license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json>
--output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
--dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml>
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 1.776828 seconds with 16 
threads.
Copying licenses from 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
 to 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses>.
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 
18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
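
A minimal sketch of the credential-helper flow the warning above recommends (illustrative only; the repository and tag are taken from this build's push below):

  gcloud auth configure-docker
  docker push us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334

Once the helper is configured, plain `docker` commands authenticate to GCR without the deprecated `gcloud docker` wrapper.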

The push refers to repository 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
cb48aa7c3179: Preparing
6e9c6659ed75: Preparing
89019dcc934a: Preparing
e500a7c139f7: Preparing
aad82327faab: Preparing
3a1781cb8501: Preparing
aa57da95a980: Preparing
8308d19d28ab: Preparing
eceaf9948f0b: Preparing
ca6d6093da44: Preparing
1646e7e61e2f: Preparing
a4dadbe25bfa: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
eceaf9948f0b: Waiting
3a1781cb8501: Waiting
ca6d6093da44: Waiting
1646e7e61e2f: Waiting
a4dadbe25bfa: Waiting
8308d19d28ab: Waiting
aa57da95a980: Waiting
3891808a925b: Waiting
8555e663f65b: Waiting
d402f4f1b906: Waiting
d00da3cd7763: Waiting
799760671c38: Waiting
aad82327faab: Pushed
6e9c6659ed75: Pushed
89019dcc934a: Pushed
cb48aa7c3179: Pushed
3a1781cb8501: Pushed
e500a7c139f7: Pushed
8308d19d28ab: Pushed
eceaf9948f0b: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
1646e7e61e2f: Pushed
d00da3cd7763: Layer already exists
a4dadbe25bfa: Pushed
799760671c38: Layer already exists
4e61e63529c2: Layer already exists
aa57da95a980: Pushed
ca6d6093da44: Pushed
20210913124334: digest: sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 13, 2021 12:45:19 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 13, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 195 files. Enable logging at DEBUG level to see which 
files will be staged.
Sep 13, 2021 12:45:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 13, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Sep 13, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Sep 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 
seconds
Sep 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 13, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <106569 bytes, hash fb1c024c1df5d5db05a96558d9dc52719b1cd78cffbc301084b5614ebc8c7438> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--xwCTB311dsFqWVY2dxScZsc14z_vDAQhLVhTryMdDg.pb
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Sep 13, 2021 12:45:25 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e7c351d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7b4a0aef]
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Sep 13, 2021 12:45:25 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@39bbd9e0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27fe9713]
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 13, 2021 12:45:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 13, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_45_25-8165500818705185032?project=apache-beam-testing
Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-13_05_45_25-8165500818705185032
Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_05_45_25-8165500818705185032
Sep 13, 2021 12:45:33 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-13T12:45:32.952Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-3zb5. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 13, 2021 12:45:39 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:38.295Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Sep 13, 2021 12:45:39 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-13T12:45:39.084Z: Workflow failed. Causes: Project 
apache-beam-testing has insufficient quota(s) to execute this workflow with 1 
instances in region us-central1. Quota summary (required/available): 1/24386 
instances, 2/0 CPUs, 30/183701 disk GB, 0/2397 SSD disk GB, 1/288 instance 
groups, 1/291 managed instance groups, 1/517 instance templates, 1/615 in-use 
IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about 
requesting more quota.
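
The summary above is required/available, so the blocking resource is CPUs: the workflow needs 2 but 0 are currently free in us-central1. A hedged sketch of how the regional quota could be inspected (illustrative command, not part of this job):

  gcloud compute regions describe us-central1 --project=apache-beam-testing --format="yaml(quotas)"

The CPUS entry in the output lists the region's limit and current usage.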
Sep 13, 2021 12:45:39 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:39.118Z: Cleaning up.
Sep 13, 2021 12:45:39 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:39.176Z: Worker pool stopped.
Sep 13, 2021 12:45:41 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-13T12:45:40.339Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
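
If those custom metric descriptors ever need pruning, the Monitoring API linked above can enumerate them first; a hedged sketch (authentication via gcloud is assumed for illustration):

  curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://monitoring.googleapis.com/v3/projects/apache-beam-testing/metricDescriptors"

Descriptors of the form custom.googleapis.com/* in the response are the ones the message refers to.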
Sep 13, 2021 12:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-09-13_05_45_25-8165500818705185032 failed with status FAILED.
Sep 13, 2021 12:45:44 PM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 81bc422d-745c-4dac-85e0-d7a45094d161 and 
timestamp: 2021-09-13T12:45:20.539000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210913124334] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:13c61f0a3c9b5ea8be327ef9eeb69ec575c0fd95c30c78307c7da6641ab9aae0].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
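
Following Gradle's suggestion, the failed task could be re-run with more diagnostics; a sketch (the load-test pipeline options the Jenkins job passes as project properties are not shown in this excerpt):

  ./gradlew :sdks:java:testing:load-tests:run --stacktrace --info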

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 29s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/klzclaj5rpfdc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
