See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/87/display/redirect>

Changes:


------------------------------------------
[...truncated 41.96 KB...]
Using base prefix '/usr'
New python executable in 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...
done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. 
Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will 
drop support for Python 3.5 in January 2021. pip 21.0 will remove support for 
this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, 
beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 
six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json> --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml>
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 1.993543 seconds with 16 
threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target/third_party_licenses>.

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:container:pullLicenses
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ go-licenses csv github.com/apache/beam/sdks/java/container
+ tee /output/licenses/list.csv
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 
18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
965419297464: Preparing
af934d8ebe65: Preparing
a7b8054e29c9: Preparing
098bbdafffcb: Preparing
a28016f210da: Preparing
01a67a58c544: Preparing
3dbf36d043e0: Preparing
428bd452b211: Preparing
e387380d47be: Preparing
84336d956992: Preparing
c88bf45cc2c4: Preparing
eb070d43fe59: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
428bd452b211: Waiting
84336d956992: Waiting
c88bf45cc2c4: Waiting
e387380d47be: Waiting
8555e663f65b: Waiting
01a67a58c544: Waiting
d00da3cd7763: Waiting
eb070d43fe59: Waiting
799760671c38: Waiting
4e61e63529c2: Waiting
3dbf36d043e0: Waiting
00ef5416d927: Waiting
d402f4f1b906: Waiting
a7b8054e29c9: Pushed
a28016f210da: Pushed
af934d8ebe65: Pushed
965419297464: Pushed
01a67a58c544: Pushed
098bbdafffcb: Pushed
428bd452b211: Pushed
e387380d47be: Pushed
3891808a925b: Layer already exists
d402f4f1b906: Layer already exists
00ef5416d927: Layer already exists
8555e663f65b: Layer already exists
3dbf36d043e0: Pushed
d00da3cd7763: Layer already exists
eb070d43fe59: Pushed
4e61e63529c2: Layer already exists
799760671c38: Layer already exists
c88bf45cc2c4: Pushed
84336d956992: Pushed
20210912124340: digest: sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 12, 2021 12:46:09 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 12, 2021 12:46:10 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 195 files. Enable logging at DEBUG level to see which 
files will be staged.
Sep 12, 2021 12:46:11 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 12, 2021 12:46:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Sep 12, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Sep 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 
seconds
Sep 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <106569 bytes, hash e9f936bb1b87ac265f45e5791c0ab45d144a7985ab86aa04cca515fdea242295> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6fk2uxuHrCZfReV5HAq0XRRKeYWrhqoEzKUV_eokIpU.pb
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Sep 12, 2021 12:46:16 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d64160c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5f254608, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2eeb0f9b, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1b1c538d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1645f294, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6325f352, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@15c4af7a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6cbd0674, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@55d58825, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@19a64eae, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@29a98d9f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2da3b078, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@544e8149, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fb66650, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a96d94c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a869a16, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae202c6, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46aa712c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ada9c0c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7412ed6b]
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s5
Sep 12, 2021 12:46:16 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@41853299, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@60d40ff4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2e5b7fba, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@27755487, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f0cab0a, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@fe7b6b0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab4ae59, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77681ce4, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5d96bdf8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f76c2cc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@306f6f1d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d7cac8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6fc6deb7, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@367f0121, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7da39774, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@441b8382, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1df1ced0, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5349b246, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32b0876c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2aaf152b]
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 12, 2021 12:46:16 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 12, 2021 12:46:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 12, 2021 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_46_16-1467752297453580481?project=apache-beam-testing
Sep 12, 2021 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-12_05_46_16-1467752297453580481
Sep 12, 2021 12:46:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_05_46_16-1467752297453580481
Sep 12, 2021 12:46:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-12T12:46:24.495Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0cogbk01-jenkins-09-4g7j. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 12, 2021 12:46:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:29.782Z: Worker configuration: e2-standard-2 in 
us-central1-a.
Sep 12, 2021 12:46:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-12T12:46:30.604Z: Workflow failed. Causes: Project 
apache-beam-testing has insufficient quota(s) to execute this workflow with 1 
instances in region us-central1. Quota summary (required/available): 1/24378 
instances, 2/0 CPUs, 30/186331 disk GB, 0/2397 SSD disk GB, 1/280 instance 
groups, 1/283 managed instance groups, 1/508 instance templates, 1/607 in-use 
IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about 
requesting more quota.
Sep 12, 2021 12:46:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:30.642Z: Cleaning up.
Sep 12, 2021 12:46:31 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:30.693Z: Worker pool stopped.
Sep 12, 2021 12:46:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-12T12:46:31.857Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 12, 2021 12:46:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2021-09-12_05_46_16-1467752297453580481 failed with status FAILED.
Sep 12, 2021 12:46:38 PM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk
Load test results for test (ID): 34b8d352-cbb2-47e4-9b8b-76c9e2782560 and timestamp: 2021-09-12T12:46:10.674000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                       0.0
dataflow_v2_java11_total_bytes_count                      -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912124340
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912124340]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210912124340] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:0e7f1ce6a58536211c674557a311d6d86359f8368b0b2861656162f876ada5f4].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 18s
101 actionable tasks: 71 executed, 28 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/az74l4n3vnsjm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
