See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch/960/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Exclude IOs already split from Java Precommit job

[Kenneth Knowles] Move expansion services into appropriate precommits

[Kenneth Knowles] Split more IOs out of Java precommit

[Kenneth Knowles] Fix trigger paths for separated IOs

[Kenneth Knowles] Turn rawtype checking back on for core Java SDK

[noreply] [Tour Of Beam] Playground Router GRPC API host (#24542)

[noreply] Bump golang.org/x/net from 0.3.0 to 0.4.0 in /sdks (#24587)

[noreply] Replaced finalize with DoFn Teardown in Neo4jIO (#24571)

[Kenneth Knowles] Simplify bug report templates

[Kenneth Knowles] Fix bugs in issue template yml

[noreply] Fix issue templates (#24597)

[noreply] [#24024] Stop wrapping light weight functions with Contextful as they

[noreply] Sample window size as well (#24388)

[noreply] Implement Kafka Write Schema Transform (#24495)

[Kenneth Knowles] Eliminate null errors from JdbcIO

[noreply] docs(fix): Filter.whereFieldName(s?) ->

[hiandyzhang] ElasticsearchIO: Lower log level in flushBatch to avoid noisy log


------------------------------------------
[...truncated 122.85 KB...]
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :runners:java-job-service:compileJava UP-TO-DATE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:spark:3:compileJava UP-TO-DATE
> Task :runners:spark:3:classes UP-TO-DATE
> Task :runners:spark:3:jar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.30/c21f55139d8141d2231214fb1feaf50a1edca95e/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-reload4j/1.7.36/db708f7d959dee1857ac524636e85ecf2e1781c1/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
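[Editor's note: the two bindings above mean slf4j-log4j12 and slf4j-reload4j are both on the test classpath, and SLF4J picks one of them arbitrarily (here Log4jLoggerFactory). A minimal diagnostic sketch, not part of the Beam load test, to confirm at runtime which binding won:

    import org.slf4j.LoggerFactory;

    // Hypothetical helper: prints the ILoggerFactory implementation SLF4J
    // bound to, which identifies the winning binding when several
    // StaticLoggerBinders are on the classpath.
    public class WhichSlf4jBinding {
        public static void main(String[] args) {
            System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
        }
    }
]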
22/12/09 12:37:41 WARN org.apache.beam.sdk.Pipeline: The following transforms do not have stable unique names: Collect end time metric
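[Editor's note: this Pipeline warning typically means a transform name was reused or auto-generated, so metrics keyed by transform name can collide. The usual fix is to pass an explicit, distinct name to each apply(). A self-contained sketch, with illustrative names and inputs not taken from the load test:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Illustrative only: giving each application of a transform its own
    // explicit name keeps names stable and unique, which is what the
    // Pipeline warning above is asking for.
    public class StableTransformNames {
        public static void main(String[] args) {
            Pipeline p = Pipeline.create();
            p.apply("Create input", Create.of(1, 2, 3))
                .apply("Collect start time metric", MapElements
                    .into(TypeDescriptors.integers()).via((Integer x) -> x))
                .apply("Collect end time metric", MapElements
                    .into(TypeDescriptors.integers()).via((Integer x) -> x));
            p.run().waitUntilFinish();
        }
    }
]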
22/12/09 12:37:41 INFO org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner: *** SparkStructuredStreamingRunner is based on spark structured streaming framework and is no more based on RDD/DStream API. See https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html It is still experimental, its coverage of the Beam model is partial. ***
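[Editor's note: the runner exercised here is selected through standard pipeline options. A hedged sketch of the equivalent programmatic selection, equivalent to passing --runner=SparkStructuredStreamingRunner on the command line as the load-test job does:

    import org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Illustrative: select the experimental structured-streaming runner
    // explicitly instead of via the --runner flag.
    public class SelectRunner {
        public static void main(String[] args) {
            PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
            options.setRunner(SparkStructuredStreamingRunner.class);
            System.out.println(options.getRunner());
        }
    }
]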
22/12/09 12:37:41 INFO org.apache.beam.runners.spark.structuredstreaming.translation.SparkSessionFactory: Configured `spark.serializer` to use KryoSerializer [unsafe=true]
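[Editor's note: the serializer choice reported above can be approximated with plain Spark session configuration. A sketch assuming a local master; the runner builds its session internally, so this is illustrative only:

    import org.apache.spark.sql.SparkSession;

    // Roughly what the log line reports: Kryo serialization with
    // unsafe-based IO instead of Java serialization. Master and app
    // name here are placeholders.
    public class KryoSessionSketch {
        public static void main(String[] args) {
            SparkSession session = SparkSession.builder()
                .appName("kryo-sketch")
                .master("local[*]")
                .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .config("spark.kryo.unsafe", "true")
                .getOrCreate();
            System.out.println(session.conf().get("spark.serializer"));
            session.stop();
        }
    }
]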
22/12/09 12:37:42 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/12/09 12:37:43 INFO org.sparkproject.jetty.util.log: Logging initialized @6400ms to org.sparkproject.jetty.util.log.Slf4jLog
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.Server: jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git: b881a572662e1943a14ae12e7e1207989f218b74; jvm 1.8.0_352-8u352-ga-1~20.04-b08
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.Server: Started @6568ms
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@529d2998{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@290e8cab{/jobs,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@470d183{/jobs/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@ea52184{/jobs/job,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ddeaa5f{/jobs/job/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7f608e21{/stages,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@210d2a6c{/stages/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4086d8fb{/stages/stage,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@176555c{/stages/stage/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@795f8317{/stages/pool,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@355c94be{/stages/pool/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c386958{/storage,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@44d64d4e{/storage/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@526a9908{/storage/rdd,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@47ac613b{/storage/rdd/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66f28a1f{/environment,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@44a085e5{/environment/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@619f2afc{/executors,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4db60246{/executors/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3902bd2c{/executors/threadDump,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@eb6ec6{/executors/threadDump/json,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@18137eab{/static,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@73971965{/,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@17410c07{/api,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64f981e2{/jobs/job/kill,null,AVAILABLE,@Spark}
22/12/09 12:37:43 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@575b5f7d{/stages/stage/kill,null,AVAILABLE,@Spark}
22/12/09 12:37:44 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@493da830{/metrics/json,null,AVAILABLE,@Spark}
22/12/09 12:37:44 INFO org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
22/12/09 12:37:44 INFO org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:
22/12/09 12:37:46 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Read input
22/12/09 12:37:50 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63ed2024{/SQL,null,AVAILABLE,@Spark}
22/12/09 12:37:50 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7c2f2db9{/SQL/json,null,AVAILABLE,@Spark}
22/12/09 12:37:50 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5657d231{/SQL/execution,null,AVAILABLE,@Spark}
22/12/09 12:37:50 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@559b2eb9{/SQL/execution/json,null,AVAILABLE,@Spark}
22/12/09 12:37:50 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5d1f9907{/static/sql,null,AVAILABLE,@Spark}
22/12/09 12:37:55 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect start time metric/ParMultiDo(TimeMonitor)
22/12/09 12:37:55 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect metrics/ParMultiDo(ByteMonitor)
22/12/09 12:37:55 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Window.Into()/Window.Assign
22/12/09 12:37:55 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Dataset Window.Into()/Window.Assign.out will be cached.
22/12/09 12:37:57 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 0/Map/ParMultiDo(Anonymous)
22/12/09 12:37:58 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 0
22/12/09 12:37:59 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric/ParMultiDo(TimeMonitor)
22/12/09 12:37:59 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 1/Map/ParMultiDo(Anonymous)
22/12/09 12:37:59 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 1
22/12/09 12:37:59 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric2/ParMultiDo(TimeMonitor)
22/12/09 12:38:00 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 2/Map/ParMultiDo(Anonymous)
22/12/09 12:38:00 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 2
22/12/09 12:38:00 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric3/ParMultiDo(TimeMonitor)
22/12/09 12:38:00 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 3/Map/ParMultiDo(Anonymous)
22/12/09 12:38:00 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 3
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric4/ParMultiDo(TimeMonitor)
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 4/Map/ParMultiDo(Anonymous)
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 4
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric5/ParMultiDo(TimeMonitor)
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 5/Map/ParMultiDo(Anonymous)
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 5
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric6/ParMultiDo(TimeMonitor)
22/12/09 12:38:01 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 6/Map/ParMultiDo(Anonymous)
22/12/09 12:38:02 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 6
22/12/09 12:38:02 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric7/ParMultiDo(TimeMonitor)
22/12/09 12:38:02 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Convert to Long: 7/Map/ParMultiDo(Anonymous)
22/12/09 12:38:02 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating composite: Combine: 7
22/12/09 12:38:02 INFO org.apache.beam.runners.spark.structuredstreaming.translation.PipelineTranslator: Translating primitive: Collect end time metric8/ParMultiDo(TimeMonitor)
22/12/09 12:38:29 INFO org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext: Evaluated dataset Collect end time metric/ParMultiDo(TimeMonitor).output in 26.4 s
22/12/09 12:38:37 INFO org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext: Evaluated dataset Collect end time metric6/ParMultiDo(TimeMonitor).output in 8.5 s
22/12/09 12:38:48 INFO org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext: Evaluated dataset Collect end time metric2/ParMultiDo(TimeMonitor).output in 10.6 s
22/12/09 12:38:58 INFO org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext: Evaluated dataset Collect end time metric3/ParMultiDo(TimeMonitor).output in 9.8 s
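[Editor's note: the out-of-order dataset names and overlapping timings above show the runner evaluating independent leaf datasets concurrently; the trace below confirms this with SparkStructuredStreamingRunner.lambda$run$0 running on a pool thread. A rough, illustrative sketch of that fan-out pattern, not the runner's actual code:

    import java.util.Arrays;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    // Illustrative fan-out: each leaf dataset evaluation is submitted to
    // a thread pool, so several "Evaluated dataset ... in N s" lines can
    // interleave in the log.
    public class ConcurrentEvaluationSketch {
        public static void main(String[] args) {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            List<String> leaves = Arrays.asList("metric", "metric2", "metric3");
            for (String leaf : leaves) {
                pool.submit(() -> System.out.println("evaluated " + leaf));
            }
            pool.shutdown();
        }
    }
]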
22/12/09 12:38:58 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@529d2998{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
Exception in thread "main" java.lang.RuntimeException: org.apache.spark.SparkException: Job 4 cancelled because SparkContext was shut down
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.runtimeExceptionFrom(SparkStructuredStreamingPipelineResult.java:57)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.beamExceptionFrom(SparkStructuredStreamingPipelineResult.java:74)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.waitUntilFinish(SparkStructuredStreamingPipelineResult.java:104)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:132)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:172)
Caused by: org.apache.spark.SparkException: Job 4 cancelled because SparkContext was shut down
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$cleanUpAfterSchedulerStop$1(DAGScheduler.scala:1085)
        at org.apache.spark.scheduler.DAGScheduler.$anonfun$cleanUpAfterSchedulerStop$1$adapted(DAGScheduler.scala:1083)
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
        at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:1083)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2463)
        at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
        at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2369)
        at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2069)
        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1419)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:2069)
        at org.apache.spark.SparkContext.$anonfun$new$37(SparkContext.scala:661)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
        at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.util.Try$.apply(Try.scala:213)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:868)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2196)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2217)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2236)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2261)
        at org.apache.spark.rdd.RDD.$anonfun$foreach$1(RDD.scala:1012)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
        at org.apache.spark.rdd.RDD.foreach(RDD.scala:1010)
        at org.apache.spark.sql.Dataset.$anonfun$foreach$1(Dataset.scala:2887)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at org.apache.spark.sql.Dataset.$anonfun$withNewRDDExecutionId$1(Dataset.scala:3676)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
        at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3674)
        at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2887)
        at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2897)
        at org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext.evaluate(EvaluationContext.java:86)
        at org.apache.beam.runners.spark.structuredstreaming.translation.EvaluationContext.evaluate(EvaluationContext.java:74)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner.lambda$run$0(SparkStructuredStreamingRunner.java:163)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
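[Editor's note: reading the two trace halves together, a pool thread was blocked in Dataset.foreach when a shutdown hook stopped the SparkContext, so the DAGScheduler cancelled Job 4 and waitUntilFinish() rethrew it to main(). That surfacing path looks roughly like this; a sketch of the calling pattern, not the actual LoadTest source:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    // Sketch: run() returns immediately, and waitUntilFinish() is where a
    // job cancelled by SparkContext shutdown is rethrown to the caller as
    // the RuntimeException seen above.
    public class WaitUntilFinishSketch {
        static void runAndWait(Pipeline pipeline) {
            PipelineResult result = pipeline.run();
            result.waitUntilFinish(); // RuntimeException surfaces here
        }
    }
]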

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143
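[Editor's note: exit value 143 follows the usual POSIX shell convention of 128 + signal number, i.e. SIGTERM (15): the JVM was terminated externally (for example by a build timeout or the CI agent), which also matches the shutdown hook in the trace above. A trivial check of that arithmetic:

    // 128 + N conventionally means "killed by signal N";
    // 143 - 128 = 15 = SIGTERM.
    public class ExitCode143 {
        public static void main(String[] args) {
            int exitValue = 143;
            System.out.println("signal = " + (exitValue - 128)); // signal = 15
        }
    }
]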

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 36s
92 actionable tasks: 1 executed, 91 up-to-date

Publishing build scan...
https://gradle.com/s/suinhbglbmuvs

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=7e7e8f1b-a679-44f2-97d2-702e8befc563, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 3271861
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-3271861.out.log
----- Last 20 lines from daemon log file - daemon-3271861.out.log -----

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 36s
92 actionable tasks: 1 executed, 91 up-to-date

Publishing build scan...
https://gradle.com/s/suinhbglbmuvs

Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
