See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/737/display/redirect?page=changes>
Changes:

[Kenneth Knowles] Exclude IOs already split from Java Precommit job
[Kenneth Knowles] Move expansion services into appropriate precommits
[Kenneth Knowles] Split more IOs out of Java precommit
[Kenneth Knowles] Fix trigger paths for separated IOs
[Kenneth Knowles] Turn rawtype checking back on for core Java SDK
[noreply] [Tour Of Beam] Playground Router GRPC API host (#24542)
[noreply] Bump golang.org/x/net from 0.3.0 to 0.4.0 in /sdks (#24587)
[noreply] Replaced finalize with DoFn Teardown in Neo4jIO (#24571)
[Kenneth Knowles] Simplify bug report templates
[Kenneth Knowles] Fix bugs in issue template yml
[noreply] Fix issue templates (#24597)
[noreply] [#24024] Stop wrapping light weight functions with Contextful as they
[noreply] Sample window size as well (#24388)
[noreply] Implement Kafka Write Schema Transform (#24495)
[Kenneth Knowles] Eliminate null errors from JdbcIO
[noreply] docs(fix): Filter.whereFieldName(s?) ->
[hiandyzhang] ElasticsearchIO: Lower log level in flushBatch to avoid noisy log

------------------------------------------
[...truncated 604 B...]
Cloning repository https://github.com/apache/beam.git
 > git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 58b4d46655d94374f3d3564752dc12eb98b95456 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 58b4d46655d94374f3d3564752dc12eb98b95456 # timeout=10
Commit message: "Merge pull request #24574: Turn rawtype checking back on for core Java SDK"
 > git rev-list --no-walk 80980b8be48ece9c6d61dc28f429374b8e7a0e4b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1
[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-737
FLINK_NUM_WORKERS=5
DETACHED_MODE=true
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
GCS_BUCKET=gs://beam-flink-cluster
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
GCLOUD_ZONE=us-central1-a
FLINK_TASKMANAGER_SLOTS=1
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-737
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6000559029918395104.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3392836023063522059.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=preview-debian11
++ echo us-central1-a
++ sed -E 's/(-[a-z])?$//'
+ GCLOUD_REGION=us-central1
+ MASTER_NAME=beam-loadtests-go-combine-flink-batch-737-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][    0.0 B/  2.3 KiB]
/ [1 files][  2.3 KiB/  2.3 KiB]
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][  2.3 KiB/  6.0 KiB]
/ [2 files][  6.0 KiB/  6.0 KiB]
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][  6.0 KiB/ 13.5 KiB]
/ [3 files][ 13.5 KiB/ 13.5 KiB]
-
Operation completed over 3 objects/13.5 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
+ local image_version=preview-debian11
+ echo 'Starting dataproc cluster. Dataproc version: preview-debian11'
Starting dataproc cluster. Dataproc version: preview-debian11
+ gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-737 --region=us-central1 --num-****s=5 --master-machine-type=n1-standard-2 --****-machine-type=n1-standard-2 --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest, --image-version=preview-debian11 --zone=us-central1-a --optional-components=FLINK,DOCKER --quiet
Waiting on operation [projects/apache-beam-testing/regions/us-central1/operations/cf5406ff-6a24-381d-96c6-6199e0de6783].
Waiting for cluster creation operation...
WARNING: Consider using Auto Zone rather than selecting a zone manually. See https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/auto-zone
....................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/us-central1/clusters/beam-loadtests-go-combine-flink-batch-737]
Cluster placed in zone [us-central1-a].
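For anyone reproducing the cluster bring-up by hand, the following is a minimal bash sketch of the steps traced above (region derivation and init-action metadata assembly), assuming the same environment variables that the EnvInject step sets earlier in this log. It mirrors the `bash -x` trace rather than quoting flink_cluster.sh verbatim, and the worker-count and machine-type flags, which appear masked as `****` in the log, are omitted here:

# Hedged sketch: derive the Dataproc region from the zone and assemble the
# metadata string passed to the init actions, then create the cluster.
GCLOUD_ZONE=us-central1-a
GCLOUD_REGION=$(echo "$GCLOUD_ZONE" | sed -E 's/(-[a-z])?$//')   # -> us-central1
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-737

metadata="flink-snapshot-url=${FLINK_DOWNLOAD_URL},"
metadata+="flink-start-yarn-session=true,"
metadata+="flink-taskmanager-slots=${FLINK_TASKMANAGER_SLOTS},"
metadata+="hadoop-jar-url=${HADOOP_DOWNLOAD_URL}"
metadata+=",beam-sdk-harness-images-to-pull=${HARNESS_IMAGES_TO_PULL}"
metadata+=",beam-job-server-image=${JOB_SERVER_IMAGE}"

gcloud dataproc clusters create "$CLUSTER_NAME" \
  --region="$GCLOUD_REGION" --zone="$GCLOUD_ZONE" \
  --image-version=preview-debian11 \
  --optional-components=FLINK,DOCKER \
  --metadata "$metadata" --quiet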
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m '--command=yarn application -list'
++ grep 'Apache Flink'
Writing 3 keys to /home/jenkins/.ssh/google_compute_known_hosts
2022-12-09 08:44:20,200 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at beam-loadtests-go-combine-flink-batch-737-m.c.apache-beam-testing.internal./10.128.0.195:8032
2022-12-09 08:44:20,475 INFO client.AHSProxy: Connecting to Application History server at beam-loadtests-go-combine-flink-batch-737-m.c.apache-beam-testing.internal./10.128.0.195:10200
+ read line
+ echo application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
++ echo application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
++ sed 's/ .*//'
+ application_ids[$i]=application_1670575344313_0002
++ echo application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
++ sed -E 's#.*(https?://)##'
++ sed 's/ .*//'
+ application_masters[$i]=10.128.0.193:32803
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=10.128.0.193:32803
+ echo 'Using Yarn Application master: 10.128.0.193:32803'
Using Yarn Application master: 10.128.0.193:32803
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest --flink-master=10.128.0.193:32803 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-737'
Existing host keys found in /home/jenkins/.ssh/google_compute_known_hosts
Unable to find image 'gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest' locally
latest: Pulling from apache-beam-testing/beam_portability/beam_flink1.15_job_server
001c52e26ad5: Pulling fs layer
d9d4b9b6e964: Pulling fs layer
2068746827ec: Pulling fs layer
9daef329d350: Pulling fs layer
d85151f15b66: Pulling fs layer
52a8c426d30b: Pulling fs layer
8754a66e0050: Pulling fs layer
f84eb606444d: Pulling fs layer
0f3b111e627c: Pulling fs layer
6f880a280a05: Pulling fs layer
87c4199424f5: Pulling fs layer
4bfecfd5da75: Pulling fs layer
9daef329d350: Waiting
d85151f15b66: Waiting
52a8c426d30b: Waiting
8754a66e0050: Waiting
f84eb606444d: Waiting
0f3b111e627c: Waiting
6f880a280a05: Waiting
87c4199424f5: Waiting
4bfecfd5da75: Waiting
d9d4b9b6e964: Download complete
2068746827ec: Verifying Checksum
2068746827ec: Download complete
d85151f15b66: Verifying Checksum
d85151f15b66: Download complete
52a8c426d30b: Verifying Checksum
52a8c426d30b: Download complete
001c52e26ad5: Verifying Checksum
001c52e26ad5: Download complete
9daef329d350: Download complete
f84eb606444d: Verifying Checksum
f84eb606444d: Download complete
6f880a280a05: Verifying Checksum
6f880a280a05: Download complete
87c4199424f5: Verifying Checksum
87c4199424f5: Download complete
4bfecfd5da75: Verifying Checksum
4bfecfd5da75: Download complete
8754a66e0050: Verifying Checksum
8754a66e0050: Download complete
0f3b111e627c: Verifying Checksum
0f3b111e627c: Download complete
001c52e26ad5: Pull complete
d9d4b9b6e964: Pull complete
2068746827ec: Pull complete
9daef329d350: Pull complete
d85151f15b66: Pull complete
52a8c426d30b: Pull complete
8754a66e0050: Pull complete
f84eb606444d: Pull complete
0f3b111e627c: Pull complete
6f880a280a05: Pull complete
87c4199424f5: Pull complete
4bfecfd5da75: Pull complete
Digest: sha256:e00cc03108c819670154f58e0003f936edd94b0e359dd0891c812504a6b33b2c
Status: Downloaded newer image for gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
6e9f402ee4646f465178ee760ba506ee26c8a0f5bcf1e09f1ff6088eccb32886
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-go-combine-flink-batch-737-m '--command=curl -s "http://10.128.0.193:32803/jobmanager/config"'
Existing host keys found in /home/jenkins/.ssh/google_compute_known_hosts
+ local 'job_server_config=[{"key":"blob.server.port","value":"41871"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist-1.15.0.jar"},{"key":"classloader.check-leaked-classloader","value":"False"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1670575344313_0002"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.memory.jvm-overhead.min","value":"611948962b"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1670575344313_0002"},{"key":"taskmanager.network.numberOfBuffers","value":"4096"},{"key":"parallelism.default","value":"8"},{"key":"taskmanager.numberOfTaskSlots","value":"2"},{"key":"env.hadoop.conf.dir","value":"/etc/hadoop/conf"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"5836"},{"key":"taskmanager.memory.process.size","value":"5836 mb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"5836"},{"key":"jobmanager.memory.off-heap.size","value":"134217728b"},{"key":"execution.target","value":"yarn-session"},{"key":"jobmanager.memory.process.size","value":"5836 mb"},{"key":"web.tmpdir","value":"/tmp/flink-web-712a957a-e616-4cc7-9558-3667975ac359"},{"key":"jobmanager.rpc.port","value":"43491"},{"key":"rest.bind-address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"rest.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"jobmanager.memory.jvm-metaspace.size","value":"268435456b"},{"key":"$internal.deployment.config-dir","value":"/usr/lib/flink/conf"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.memory.heap.size","value":"5104887390b"},{"key":"jobmanager.memory.jvm-overhead.max","value":"611948962b"}]'
+ local key=jobmanager.rpc.port
++ echo 10.128.0.193:32803
++ cut -d : -f1
+ local yarn_application_master_host=10.128.0.193
++ echo '[{"key":"blob.server.port","value":"41871"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist-1.15.0.jar"},{"key":"classloader.check-leaked-classloader","value":"False"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1670575344313_0002"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.memory.jvm-overhead.min","value":"611948962b"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1670575344313_0002"},{"key":"taskmanager.network.numberOfBuffers","value":"4096"},{"key":"parallelism.default","value":"8"},{"key":"taskmanager.numberOfTaskSlots","value":"2"},{"key":"env.hadoop.conf.dir","value":"/etc/hadoop/conf"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"5836"},{"key":"taskmanager.memory.process.size","value":"5836' 'mb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"5836"},{"key":"jobmanager.memory.off-heap.size","value":"134217728b"},{"key":"execution.target","value":"yarn-session"},{"key":"jobmanager.memory.process.size","value":"5836' 'mb"},{"key":"web.tmpdir","value":"/tmp/flink-web-712a957a-e616-4cc7-9558-3667975ac359"},{"key":"jobmanager.rpc.port","value":"43491"},{"key":"rest.bind-address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"rest.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"jobmanager.memory.jvm-metaspace.size","value":"268435456b"},{"key":"$internal.deployment.config-dir","value":"/usr/lib/flink/conf"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.memory.heap.size","value":"5104887390b"},{"key":"jobmanager.memory.jvm-overhead.max","value":"611948962b"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=43491
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m -- -L 8081:10.128.0.193:32803 -L 43491:10.128.0.193:43491 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m -- -L 8081:10.128.0.193:32803 -L 43491:10.128.0.193:43491 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m -- -L 8081:10.128.0.193:32803 -L 43491:10.128.0.193:43491 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2278042159123416236.sh
+ echo '*** Combine Go Load test: 2GB of 10B records ***'
*** Combine Go Load test: 2GB of 10B records ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=combine -Prunner=FlinkRunner '-PloadTest.args=--job_name=load-tests-go-flink-batch-combine-1-1209065329 --influx_namespace=flink --influx_measurement=go_batch_combine_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size": 9}' --fanout=1 --top_count=20 --parallelism=5 --endpoint=localhost:8099 --environment_type=DOCKER --environment_config=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --runner=FlinkRunner' --continue --max-****s=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:go:test:load:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
System Go installation: /snap/bin/go is go version go1.16.15 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.19.3
go install golang.org/dl/go1.19.3@latest: no matching versions for query "latest"

FAILURE: Build failed with an exception.

* What went wrong:
Could not determine the dependencies of task ':sdks:go:test:load:goBuild'.
> Could not create task ':sdks:go:test:load:goPrepare'.
   > Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org

BUILD FAILED in 2m 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4dlrc2bl2s5ww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
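The failure above appears to come from the Go toolchain bootstrap rather than from the load test itself: goPrepare shells out to install the pinned Go 1.19.3 toolchain, and that `go install` is what returned the "no matching versions" error. A minimal sketch to rerun just that bootstrap step outside Gradle, assuming a system Go on PATH and the default GOPATH of ~/go as in this log (the `download`/`version` steps are the standard golang.org/dl wrapper usage, not commands taken from this log):

# Hedged sketch: reproduce the toolchain bootstrap that goPrepare attempts.
go version                                  # system Go; /snap/bin/go (go1.16.15) in this log
go install golang.org/dl/go1.19.3@latest    # the command that failed here with
                                            # 'no matching versions for query "latest"'
~/go/bin/go1.19.3 download                  # on success, fetches the Go 1.19.3 SDK
~/go/bin/go1.19.3 version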
