See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/8/display/redirect?page=changes>

Changes:

[chadrik] Add attributes defined in operations.pxd but missing in operations.py

[robertwb] Minor FnAPI proto cleanups.

[je.ik] [BEAM-9273] Explicitly disable @RequiresTimeSortedInput on unsupported

[je.ik] [BEAM-9273] code review - to be squashed

[kcweaver] [BEAM-9212] fix zetasql struct exception

[kcweaver] [BEAM-9211] Spark reuse Flink portable jar test script

[kcweaver] test_pipeline_jar Use single jar arg for both Flink and Spark.

[iemejia] Pin Avro dependency in Python SDK to be consistent with Avro versioning

[apilloud] [BEAM-9311] ZetaSQL Named Parameters are case-insensitive

[github] Bump dataflow container version (#10861)

[github] [BEAM-8335] Update StreamingCache with new Protos (#10856)

[github] [BEAM-9317] Fix portable test executions to specify the beam_fn_api

[je.ik] [BEAM-9265] @RequiresTimeSortedInput respects allowedLateness

[github] [BEAM-9289] Improve performance for metrics update of samza runner

[github] = instead of -eq

[iemejia] [BEAM-6857] Classify unbounded dynamic timers tests in the

[iemejia] Exclude Unbounded PCollection tests from Flink Portable runner batch

[github] [BEAM-9317] Fix Dataflow tests to not perform SplittableDoFn expansion

[iemejia] [BEAM-9315] Allow multiple paths via HADOOP_CONF_DIR in

[github] Update container images used by Dataflow runner with unreleased SDKs.

[github] [BEAM-9314] Make dot output deterministic (#10864)

[ccy] [BEAM-9277] Fix exception when running in IPython notebook.

[github] Remove experimental parallelization (-j 8) flags from sphinx

[iemejia] [BEAM-9301] Checkout the hash of master instead of the branch in beam

[github] [BEAM-8399] Add --hdfs_full_urls option (#10223)

[iemejia] Fix typo on runners/extensions-java label for github PR autolabeler

[github] Merge pull request #10862: [BEAM-9320] Add AlwaysFetched annotation


------------------------------------------
[...truncated 75.40 KB...]
Waiting for cluster creation operation...
WARNING: For PD-Standard without local SSDs, we strongly recommend provisioning 1TB or larger to ensure consistently high I/O performance. See https://cloud.google.com/compute/docs/disks/performance for information on disk I/O performance.
.......................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/global/clusters/beam-loadtests-java-portable-flink-batch-8]
Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m '--command=yarn application -list'
++ grep beam-loadtests-java-portable-flink-batch-8
Warning: Permanently added 'compute.8564509543116519414' (ECDSA) to the list of known hosts.
20/02/17 12:40:18 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-java-portable-flink-batch-8-m/10.128.0.19:8032
+ read line
+ echo application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ echo application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ sed 's/ .*//'
+ application_ids[$i]=application_1581943162552_0001
++ echo application_1581943162552_0001 flink-dataproc Apache Flink yarn default RUNNING UNDEFINED 100% http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ sed 's/.*beam-loadtests-java-portable-flink-batch-8/beam-loadtests-java-portable-flink-batch-8/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
+ echo 'Using Yarn Application master: beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449'
Using Yarn Application master: beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
+ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
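The grep/sed pipeline above can be sketched as a standalone Python snippet; the sample row is copied from this build's `yarn application -list` output, and the field handling is an illustrative equivalent of the script's sed expressions rather than the script itself:

```python
# One row of `yarn application -list` output from this run.
line = ("application_1581943162552_0001 flink-dataproc Apache Flink yarn default "
        "RUNNING UNDEFINED 100% "
        "http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449")

fields = line.split()
application_id = fields[0]            # sed 's/ .*//' keeps only the first column
tracking_url = fields[-1]             # the last column is the AM tracking URL
application_master = tracking_url.split("://", 1)[1]  # drop the http:// scheme

print(application_id)
print(application_master)
```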
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest --flink-master=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-java-portable-flink-batch-8'
38504cbe34588d820e4f9ffd8fa64e1cccde09f57c964e1d68f700d308074f4e
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-java-portable-flink-batch-8-m '--command=curl -s "http://beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449/jobmanager/config";'
+ local 'job_server_config=[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581943162552_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-be019c21-1ebe-4485-91f0-3282c77c44ab"},{"key":"jobmanager.rpc.port","value":"40523"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581943162552_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local key=jobmanager.rpc.port
++ echo beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449
++ cut -d : -f1
+ local yarn_application_master_host=beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
++ echo '[{"key":"web.port","value":"0"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1581943162552_0001"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"jobmanager.heap.mb","value":"12288"},{"key":"FLINK_PLUGINS_DIR","value":"/usr/lib/flink/plugins"},{"key":"web.tmpdir","value":"/tmp/flink-web-be019c21-1ebe-4485-91f0-3282c77c44ab"},{"key":"jobmanager.rpc.port","value":"40523"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/yarn/appcache/application_1581943162552_0001"},{"key":"taskmanager.network.numberOfBuffers","value":"2048"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"parallelism.default","value":"5"},{"key":"taskmanager.numberOfTaskSlots","value":"1"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"rest.address","value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},{"key":"taskmanager.heap.mb","value":"12288"},{"key":"taskmanager.heap.size","value":"12288m"},{"key":"jobmanager.heap.size","value":"12288m"}]'
+ local jobmanager_rpc_port=40523
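The inline `python -c` lookup above reads the Flink REST `/jobmanager/config` response, which is a JSON list of `{"key", "value"}` pairs. A standalone sketch of the same lookup, with the payload abbreviated to the two entries relevant here (values taken from this run):

```python
import json

# Abbreviated /jobmanager/config payload; the real response from this run
# contains many more key/value pairs.
job_server_config = json.loads("""
[{"key":"jobmanager.rpc.address",
  "value":"beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal"},
 {"key":"jobmanager.rpc.port","value":"40523"}]
""")

key = "jobmanager.rpc.port"
# Same list-comprehension lookup the build script pipes through `python -c`.
jobmanager_rpc_port = [e["value"] for e in job_server_config if e["key"] == key][0]
print(jobmanager_rpc_port)
```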
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/flink1.9_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m -- -L 8081:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 -L 40523:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:40523 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080  -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m -- -L 8081:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 -L 40523:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:40523 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-java-portable-flink-batch-8-m -- -L 8081:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:35449 -L 40523:beam-loadtests-java-portable-flink-batch-8-w-1.c.apache-beam-testing.internal:40523 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
[beam_LoadTests_Java_Combine_Portable_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5886835649746084439.sh
+ echo src Load test: 2GB of 10B records on Flink in Portable mode src
src Load test: 2GB of 10B records on Flink in Portable mode src
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:portability:java '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_Portable_Flink_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_portable_flink_batch_Combine_1 --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --sdkWorkerParallelism=5 --perKeyCombiner=TOP_LARGEST --streaming=false --jobEndpoint=localhost:8099 --defaultEnvironmentConfig=gcr.io/apache-beam-testing/beam_portability/java_sdk:latest --defaultEnvironmentType=DOCKER --runner=PortableRunner' --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:portability:java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :runners:portability:java:compileJava FROM-CACHE
> Task :runners:portability:java:classes UP-TO-DATE
> Task :runners:portability:java:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :runners:direct-java:shadowJar

> Task :sdks:java:io:synthetic:compileJava
Note: <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Portable_Flink_Batch/ws/src/sdks/java/io/synthetic/src/main/java/org/apache/beam/sdk/io/synthetic/SyntheticBoundedSource.java> uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

> Task :sdks:java:io:synthetic:classes
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:io:kinesis:compileJava
> Task :sdks:java:io:kinesis:classes
> Task :sdks:java:io:kinesis:jar

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
Exception in thread "main" java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
        at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:98)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:99)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1928)
        at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:90)
        ... 3 more
Caused by: java.lang.RuntimeException: The Runner experienced the following error during execution:
java.lang.IllegalArgumentException: GreedyPipelineFuser requires all root nodes to be runner-implemented beam:transform:impulse:v1 or beam:transform:read:v1 primitives, but transform Read input executes in environment Optional[urn: "beam:env:docker:v1"
payload: "\n;gcr.io/apache-beam-testing/beam_portability/java_sdk:latest"
]
        at org.apache.beam.runners.portability.JobServicePipelineResult.propagateErrors(JobServicePipelineResult.java:165)
        at org.apache.beam.runners.portability.JobServicePipelineResult.waitUntilFinish(JobServicePipelineResult.java:110)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
        at java.util.concurrent.CompletableFuture$AsyncSupply.exec(CompletableFuture.java:1596)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
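The invariant behind the GreedyPipelineFuser error above can be illustrated with a simplified sketch (these are stand-in names, not Beam's actual classes): a root transform of a portable pipeline must be a runner-implemented primitive with no SDK environment attached, so a Read that still targets the `beam:env:docker:v1` environment is rejected.

```python
# Hedged illustration of the fuser's root-node check; the URN and environment
# strings are taken from the error message, the check itself is a stand-in.
RUNNER_IMPLEMENTED_ROOT_URNS = {
    "beam:transform:impulse:v1",
    "beam:transform:read:v1",
}

def check_root(name, urn, environment):
    """Reject a root transform that would execute inside an SDK harness."""
    if urn not in RUNNER_IMPLEMENTED_ROOT_URNS or environment is not None:
        raise ValueError(
            f"root transform {name!r} executes in environment {environment!r}")

# The failing pipeline's root resembled this: a Read that was never expanded
# into runner-side primitives, so it still carries the Docker SDK environment.
err_msg = None
try:
    check_root("Read input", "beam:transform:read:v1", "beam:env:docker:v1")
except ValueError as exc:
    err_msg = str(exc)
print(err_msg)
```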

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 30s
61 actionable tasks: 11 executed, 4 from cache, 46 up-to-date

Publishing build scan...
https://gradle.com/s/wk2sl6swex6u6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
