See <https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/280/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12769] Adds support for expanding a Java cross-language transform


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.7.4'
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1455c545c2e7a4d89b949ebb75712c30fa996925 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1455c545c2e7a4d89b949ebb75712c30fa996925 # timeout=10
Commit message: "[BEAM-12769] Adds support for expanding a Java cross-language transform using the class name and builder methods (#15343)"
 > git rev-list --no-walk c4e0b4ac0777f37f5eb775a8a83c56f66b3baac3 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest
CLUSTER_NAME=beam-loadtests-go-gbk-flink-batch-280
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.12.3/flink-1.12.3-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-gbk-flink-batch-280
GCLOUD_ZONE=us-central1-a

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins4535990157567968465.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8485085226004641646.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=1.2
+ MASTER_NAME=beam-loadtests-go-gbk-flink-batch-280-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]                                                
/ [1 files][  2.3 KiB/  2.3 KiB]                                                
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]                                                
/ [2 files][  6.0 KiB/  6.0 KiB]                                                
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.5 KiB]                                                
/ [3 files][ 13.5 KiB/ 13.5 KiB]                                                
Operation completed over 3 objects/13.5 KiB.                                    
 
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.12.3/flink-1.12.3-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest
+ local image_version=1.2
+ echo 'Starting dataproc cluster. Dataproc version: 1.2'
Starting dataproc cluster. Dataproc version: 1.2
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-go-gbk-flink-batch-280 --region=global --num-workers=6 --initialization-actions gs://beam-flink-cluster/init-actions/docker.sh,gs://beam-flink-cluster/init-actions/beam.sh,gs://beam-flink-cluster/init-actions/flink.sh --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.12.3/flink-1.12.3-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest, --image-version=1.2 --zone=us-central1-a --quiet
Waiting on operation [projects/apache-beam-testing/regions/global/operations/a06b4599-ccb4-3e51-813a-b48d8ad9dc87].
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.......................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-go-gbk-flink-batch-280]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-gbk-flink-batch-280-m '--command=yarn application -list'
++ grep beam-loadtests-go-gbk-flink-batch-280
Warning: Permanently added 'compute.945524409045184194' (ECDSA) to the list of known hosts.
21/09/05 14:15:07 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-go-gbk-flink-batch-280-m/10.128.0.25:8032
+ read line
+ echo application_1630851216889_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
application_1630851216889_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
++ echo application_1630851216889_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
++ sed 's/ .*//'
+ application_ids[$i]=application_1630851216889_0001
++ echo application_1630851216889_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
++ sed 's/.*beam-loadtests-go-gbk-flink-batch-280/beam-loadtests-go-gbk-flink-batch-280/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
+ echo 'Using Yarn Application master: beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173'
Using Yarn Application master: beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-gbk-flink-batch-280-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest --flink-master=beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-go-gbk-flink-batch-280'
5d2be72d41fd81509bd76730bc1316641fdf2f2a36770f261719c702cfe846a4
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-go-gbk-flink-batch-280-m '--command=curl -s "http://beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173/jobmanager/config";'
+ local 
'job_server_config=[{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.12.3.jar"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1630851216889_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal"},{"key":"taskmanager.memory.jvm-metaspace.size","value":"512
 mb"},{"key":"taskmanager.memory.task.off-heap.size","value":"256 
mb"},{"key":"jobmanager.memory.jvm-overhead.min","value":"1073741824b"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1630851216889_0001"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.memory.process.size","value":"12
 
gb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"jobmanager.memory.off-heap.size","value":"134217728b"},{"key":"execution.target","value":"yarn-session"},{"key":"jobmanager.memory.process.size","value":"12
 
gb"},{"key":"web.tmpdir","value":"/tmp/flink-web-3377ea58-ff67-47ba-b168-acfcbf2d7f23"},{"key":"jobmanager.rpc.port","value":"35129"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"taskmanager.memory.managed.fraction","value":"0.5"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal"},{"key":"state.backend","value":"filesystem"},{"key":"jobmanager.memory.jvm-metaspace.size","value":"268435456b"},{"key":"$internal.deployment.config-dir","value":"/usr/lib/flink/conf"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.memory.heap.size","value":"11408506880b"},{"key":"state.checkpoints.dir","value":"gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/checkpoints"},{"key":"jobmanager.memory.jvm-overhead.max","value":"1073741824b"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo 
'[{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist_2.11-1.12.3.jar"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1630851216889_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal"},{"key":"taskmanager.memory.jvm-metaspace.size","value":"512'
 'mb"},{"key":"taskmanager.memory.task.off-heap.size","value":"256' 
'mb"},{"key":"jobmanager.memory.jvm-overhead.min","value":"1073741824b"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1630851216889_0001"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.memory.process.size","value":"12'
 
'gb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"jobmanager.memory.off-heap.size","value":"134217728b"},{"key":"execution.target","value":"yarn-session"},{"key":"jobmanager.memory.process.size","value":"12'
 
'gb"},{"key":"web.tmpdir","value":"/tmp/flink-web-3377ea58-ff67-47ba-b168-acfcbf2d7f23"},{"key":"jobmanager.rpc.port","value":"35129"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"taskmanager.memory.managed.fraction","value":"0.5"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal"},{"key":"state.backend","value":"filesystem"},{"key":"jobmanager.memory.jvm-metaspace.size","value":"268435456b"},{"key":"$internal.deployment.config-dir","value":"/usr/lib/flink/conf"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.memory.heap.size","value":"11408506880b"},{"key":"state.checkpoints.dir","value":"gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/checkpoints"},{"key":"jobmanager.memory.jvm-overhead.max","value":"1073741824b"}]'
+ local jobmanager_rpc_port=35129
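For readability: the shell-quoted python one-liner above amounts to the following standalone sketch of the same lookup against the /jobmanager/config JSON shown in the trace. The names get_flink_config_value and config_json are illustrative only and do not appear in flink_cluster.sh.

    # Sketch: pull one value out of Flink's /jobmanager/config response,
    # which is a JSON array of {"key": ..., "value": ...} entries.
    import json

    def get_flink_config_value(config_json, key):
        entries = json.loads(config_json)
        return [e["value"] for e in entries if e["key"] == key][0]

    # With the config dumped above, get_flink_config_value(config_json,
    # "jobmanager.rpc.port") returns "35129", matching jobmanager_rpc_port.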
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-gbk-flink-batch-280-m -- -L 8081:beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173 -L 35129:beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:35129 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-gbk-flink-batch-280-m -- -L 8081:beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173 -L 35129:beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:35129 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-gbk-flink-batch-280-m -- -L 8081:beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:37173 -L 35129:beam-loadtests-go-gbk-flink-batch-280-w-2.c.apache-beam-testing.internal:35129 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
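(The tunnel above forwards local port 8081 to the Flink REST endpoint on the YARN application master, local port 35129 to the jobmanager RPC port, and ports 8099/8098/8097 to the job-server container running on the cluster master node, which is why the load test below can target --endpoint=localhost:8099.)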
[beam_LoadTests_Go_GBK_Flink_Batch] $ /bin/bash -xe /tmp/jenkins8336154365133915913.sh
+ echo '*** Group By Key Go Load test: 2GB of 10B records ***'
*** Group By Key Go Load test: 2GB of 10B records ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=group_by_key -Prunner=FlinkRunner '-PloadTest.args=--job_name=load-tests-go-flink-batch-gbk-1-0905133525 --influx_namespace=flink --influx_measurement=go_batch_gbk_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size": 9}' --iterations=1 --fanout=1 --parallelism=5 --endpoint=localhost:8099 --environment_type=DOCKER --environment_config=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --runner=FlinkRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:go:test:load:run
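(For reference, the --input_options above match the advertised dataset size: 200,000,000 records of a 1-byte key plus a 9-byte value, i.e. 200,000,000 * 10 bytes = 2,000,000,000 bytes, the "2GB of 10B records" announced in the step above.)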
Starting a Gradle Daemon, 3 busy Daemons could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Task :sdks:go:test:load:goPrepare
Use project GOPATH: <https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/ws/src/sdks/go/test/load/.gogradle/project_gopath>

> Task :sdks:go:test:load:resolveBuildDependencies SKIPPED
> Task :sdks:go:test:load:installDependencies SKIPPED
> Task :sdks:go:test:load:buildLinuxAmd64

go: cloud.google.com/go/datastore@v1.5.0: Get "https://proxy.golang.org/cloud.google.com/go/datastore/@v/v1.5.0.mod": net/http: TLS handshake timeout

> Task :sdks:go:test:load:buildLinuxAmd64 FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:go:test:load:buildLinuxAmd64'.
> Build failed due to return code 1 of: 
  Command:
   /home/jenkins/.gradle/go/binary/1.16.5/go/bin/go build -o ./build/bin/linux_amd64/pardo github.com/apache/beam/sdks/v2/go/test/load/pardo
  Env:
   GOEXE=
   GOPATH=<https://ci-beam.apache.org/job/beam_LoadTests_Go_GBK_Flink_Batch/ws/src/sdks/go/test/load/.gogradle/project_gopath>
   GOROOT=/home/jenkins/.gradle/go/binary/1.16.5/go
   GOOS=linux
   GOARCH=amd64

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 54s
2 actionable tasks: 2 executed

Publishing build scan...
https://gradle.com/s/2owhkwca7sdty

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
