See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_SparkStructuredStreaming_Batch/98/display/redirect?page=changes>

Changes:

[noreply] Remove permitAll flag from seed & dependency check jenkins jobs (#12319)

[noreply] [BEAM-7390] Add combineperkey code snippets (#12277)

[noreply] Move more files to impl sub-directory (#12302)

[Robert Bradshaw] Update portability status and add some more documentation.

[noreply] [BEAM-10411] Adds an example that uses Python cross-language Kafka

[noreply] [BEAM-10274] Fix translation of json pipeline options. (#12333)

[noreply] [BEAM-10545] Initialize an empty extension (#12327)

[Kenneth Knowles] Add analyzer-friendly checkArgumentNotNull

[Kenneth Knowles] Fix typo in error message in RowWithGetters

[Kenneth Knowles] Improve error message in ApiSurface tests

[Kenneth Knowles] Skip nullness analysis of AutoValue_ classes

[Kenneth Knowles] [BEAM-10547][BEAM-10548] Schema support for all sorts of Nullable and on

[Kenneth Knowles] Migrate to checkerframework nullness annotations

[Kenneth Knowles] [BEAM-10540] Fix nullability in equals methods globally

[noreply] [BEAM-10551] Implement Navigation Functions FIRST_VALUE and LAST_VALUE


------------------------------------------
[...truncated 59.83 KB...]
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :runners:direct-java:compileJava UP-TO-DATE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:java-job-service:compileJava UP-TO-DATE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar UP-TO-DATE
> Task :runners:direct-java:shadowJar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :runners:spark:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :runners:spark:classes UP-TO-DATE
> Task :runners:spark:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
20/07/22 14:09:13 WARN org.apache.beam.sdk.Pipeline: The following transforms do not have stable unique names: Window.Into()
20/07/22 14:09:13 INFO org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner: *** SparkStructuredStreamingRunner is based on the Spark structured streaming framework and is no longer based on the RDD/DStream API. See https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html. It is still experimental; its coverage of the Beam model is partial. ***
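For reference, opting into the runner named above goes through Beam's standard pipeline options. A minimal sketch — the class name and wiring are illustrative, not taken from the load-test harness, and assume the beam-runners-spark dependency is on the classpath:

    import org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Hypothetical driver class, for illustration only.
    public class RunnerSelectionSketch {
      public static void main(String[] args) {
        // Parse options from the command line; setting the runner here is
        // equivalent to passing --runner=SparkStructuredStreamingRunner.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        options.setRunner(SparkStructuredStreamingRunner.class);
        Pipeline pipeline = Pipeline.create(options);
        pipeline.run().waitUntilFinish();
      }
    }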
20/07/22 14:09:14 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/07/22 14:09:15 INFO org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator: Instantiated metrics accumulator: {
  "metrics": {
  }
}
20/07/22 14:09:15 INFO org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:
20/07/22 14:10:03 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: No space left on device
        at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
        at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
        at sun.nio.ch.IOUtil.write(IOUtil.java:51)
        at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
        at sun.nio.ch.FileChannelImpl.transferToTrustedChannel(FileChannelImpl.java:516)
        at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:609)
        at org.apache.spark.util.Utils$.copyFileStreamNIO(Utils.scala:389)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply$mcJ$sp(Utils.scala:354)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:348)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:348)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.util.Utils$.copyStream(Utils.scala:369)
        at org.apache.spark.util.Utils.copyStream(Utils.scala)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.writePartitionedFile(BypassMergeSortShuffleWriter.java:201)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:163)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
20/07/22 14:10:03 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: No space left on device
        at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
        at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
        at sun.nio.ch.IOUtil.write(IOUtil.java:51)
        at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
        at sun.nio.ch.FileChannelImpl.transferToTrustedChannel(FileChannelImpl.java:516)
        at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:609)
        at org.apache.spark.util.Utils$.copyFileStreamNIO(Utils.scala:389)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply$mcJ$sp(Utils.scala:354)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:348)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:348)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.util.Utils$.copyStream(Utils.scala:369)
        at org.apache.spark.util.Utils.copyStream(Utils.scala)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.writePartitionedFile(BypassMergeSortShuffleWriter.java:201)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:163)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

20/07/22 14:10:03 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
20/07/22 14:10:03 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 4.0 in stage 0.0 (TID 4, localhost, executor driver): TaskKilled (Stage cancelled)
20/07/22 14:10:03 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost, executor driver): TaskKilled (Stage cancelled)
20/07/22 14:10:04 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/16/temp_shuffle_f70a8b22-9f69-4a05-a2dc-a19a47890517
java.io.FileNotFoundException: /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/16/temp_shuffle_f70a8b22-9f69-4a05-a2dc-a19a47890517 (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1369)
        at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:105)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
20/07/22 14:10:04 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/16/temp_shuffle_f70a8b22-9f69-4a05-a2dc-a19a47890517
20/07/22 14:10:04 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/2c/temp_shuffle_d0592e8e-3667-43ff-8760-db22cdf0e2c6
java.io.FileNotFoundException: /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/2c/temp_shuffle_d0592e8e-3667-43ff-8760-db22cdf0e2c6 (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1369)
        at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:105)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
20/07/22 14:10:04 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/2c/temp_shuffle_d0592e8e-3667-43ff-8760-db22cdf0e2c6
20/07/22 14:10:04 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/0b/temp_shuffle_bc447ba1-ad54-4e46-a775-9bb4df320ce7
java.io.FileNotFoundException: /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/0b/temp_shuffle_bc447ba1-ad54-4e46-a775-9bb4df320ce7 (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method)
        at java.io.FileOutputStream.open(FileOutputStream.java:270)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
        at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1369)
        at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:105)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
20/07/22 14:10:04 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7/0b/temp_shuffle_bc447ba1-ad54-4e46-a775-9bb4df320ce7
20/07/22 14:10:04 WARN org.apache.spark.network.util.JavaUtils: Attempt to delete using native Unix OS command failed for path = /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7. Falling back to Java IO way
java.io.IOException: Failed to delete: /tmp/blockmgr-26d7ac1b-2fd7-4132-b36e-d2758cbbcbd7
        at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingUnixNative(JavaUtils.java:171)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:110)
        at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
        at org.apache.spark.storage.DiskBlockManager$$anonfun$org$apache$spark$storage$DiskBlockManager$$doStop$1.apply(DiskBlockManager.scala:178)
        at org.apache.spark.storage.DiskBlockManager$$anonfun$org$apache$spark$storage$DiskBlockManager$$doStop$1.apply(DiskBlockManager.scala:174)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:174)
        at org.apache.spark.storage.DiskBlockManager.stop(DiskBlockManager.scala:169)
        at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1629)
        at org.apache.spark.SparkEnv.stop(SparkEnv.scala:90)
        at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1974)
        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1973)
        at org.apache.spark.sql.SparkSession.stop(SparkSession.scala:713)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.stop(SparkStructuredStreamingPipelineResult.java:89)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.offerNewState(SparkStructuredStreamingPipelineResult.java:148)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.waitUntilFinish(SparkStructuredStreamingPipelineResult.java:123)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:127)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: No space left on device
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.beamExceptionFrom(SparkStructuredStreamingPipelineResult.java:71)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.waitUntilFinish(SparkStructuredStreamingPipelineResult.java:124)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:127)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
        at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
Caused by: java.io.IOException: No space left on device
        at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
        at sun.nio.ch.FileDispatcherImpl.write(FileDispatcherImpl.java:60)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
        at sun.nio.ch.IOUtil.write(IOUtil.java:51)
        at sun.nio.ch.FileChannelImpl.write(FileChannelImpl.java:211)
        at sun.nio.ch.FileChannelImpl.transferToTrustedChannel(FileChannelImpl.java:516)
        at sun.nio.ch.FileChannelImpl.transferTo(FileChannelImpl.java:609)
        at org.apache.spark.util.Utils$.copyFileStreamNIO(Utils.scala:389)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply$mcJ$sp(Utils.scala:354)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:348)
        at org.apache.spark.util.Utils$$anonfun$copyStream$1.apply(Utils.scala:348)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.util.Utils$.copyStream(Utils.scala:369)
        at org.apache.spark.util.Utils.copyStream(Utils.scala)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.writePartitionedFile(BypassMergeSortShuffleWriter.java:201)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:163)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
        at org.apache.spark.scheduler.Task.run(Task.scala:123)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
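
The root cause above is shuffle spill exhausting the executor's scratch volume, which defaults to /tmp. A hedged mitigation sketch, relying on Spark's documented spark.local.dir property (the directory path below is an assumption for illustration, not from this job's configuration):

    // Hypothetical driver class; spark.* system properties set before the
    // runner builds its SparkConf/SparkSession are picked up automatically,
    // redirecting shuffle and block-manager files to a larger disk.
    public class LocalDirSketch {
      public static void main(String[] args) {
        System.setProperty("spark.local.dir", "/mnt/scratch/spark-tmp");  // illustrative path
        // ... build and run the Beam pipeline as usual ...
      }
    }

The same effect can come from passing -Dspark.local.dir=... on the JVM command line; either way the property must be set before the Spark context is created.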

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57s
66 actionable tasks: 1 executed, 65 up-to-date

Publishing build scan...
https://gradle.com/s/a5eeforn5rlje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
