See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch/409/display/redirect?page=changes>

Changes:

[Ismaël Mejía] [BEAM-12277] Update Flink 1.13 version to 1.13.1

[Ismaël Mejía] [BEAM-12424] Update Flink 1.12 to version 1.12.4


------------------------------------------
[...truncated 9.65 KB...]
> Task :sdks:java:harness:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :model:pipeline:createCheckerFrameworkManifest
> Task :model:fn-execution:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :runners:java-job-service:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:spark:2:copyResourcesOverrides NO-SOURCE
> Task :runners:java-job-service:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :sdks:java:core:processResources
> Task :runners:spark:2:copySourceOverrides
> Task :runners:spark:2:copyTestResourcesOverrides NO-SOURCE
> Task :runners:spark:2:createCheckerFrameworkManifest
> Task :runners:spark:2:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto FROM-CACHE
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:spark:2:compileJava FROM-CACHE
> Task :runners:spark:2:classes
> Task :runners:spark:2:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
21/05/31 12:30:34 INFO org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner:
 *** SparkStructuredStreamingRunner is based on spark structured streaming framework and is no more based on RDD/DStream API. See
 https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
 It is still experimental, its coverage of the Beam model is partial. ***
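
For context, the banner above is printed when a pipeline is configured to use this runner. A minimal sketch of how a Beam Java pipeline might opt in, assuming the standard PipelineOptionsFactory API (the class name and empty pipeline body are illustrative, not the load test's actual setup):

    import org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineOptions;
    import org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RunnerSelectionSketch {
      public static void main(String[] args) {
        // Illustrative minimal setup; the real load test wires in many more options.
        SparkStructuredStreamingPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(SparkStructuredStreamingPipelineOptions.class);
        options.setRunner(SparkStructuredStreamingRunner.class);

        Pipeline pipeline = Pipeline.create(options);
        // ... apply the Combine load-test transforms here ...
        pipeline.run().waitUntilFinish();
      }
    }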
21/05/31 12:30:35 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/05/31 12:30:36 INFO org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
21/05/31 12:30:36 INFO org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator: 
Exception in thread "main" java.lang.RuntimeException: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.runtimeExceptionFrom(SparkStructuredStreamingPipelineResult.java:62)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.beamExceptionFrom(SparkStructuredStreamingPipelineResult.java:79)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.waitUntilFinish(SparkStructuredStreamingPipelineResult.java:128)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:130)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:172)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:954)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:952)
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
        at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:952)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2164)
        at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
        at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2077)
        at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
        at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
        at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
        at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:759)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2067)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2088)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2107)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2132)
        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:972)
        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:970)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
        at org.apache.spark.rdd.RDD.foreach(RDD.scala:970)
        at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply$mcV$sp(Dataset.scala:2722)
        at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply(Dataset.scala:2722)
        at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply(Dataset.scala:2722)
        at org.apache.spark.sql.Dataset$$anonfun$withNewRDDExecutionId$1.apply(Dataset.scala:3354)
        at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
        at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3350)
        at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2721)
        at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2732)
        at org.apache.beam.runners.spark.structuredstreaming.translation.TranslationContext.startPipeline(TranslationContext.java:228)
        at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner.lambda$run$0(SparkStructuredStreamingRunner.java:145)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
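
Reading the trace: the pipeline thread fails inside TranslationContext.startPipeline (Dataset.foreach) when the SparkContext shutdown hook cancels Job 0, and the main thread surfaces that failure from SparkStructuredStreamingPipelineResult.waitUntilFinish, wrapped in a RuntimeException. A sketch of the call pattern implied by LoadTest.run (names other than the Beam classes are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    final class WaitUntilFinishSketch {
      // run() hands the translated pipeline to a background executor (see
      // SparkStructuredStreamingRunner.lambda$run$0 in the trace above);
      // waitUntilFinish() blocks the calling thread and rethrows any failure
      // from that executor, wrapped in a RuntimeException as seen here.
      static PipelineResult.State runAndWait(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        return result.waitUntilFinish();
      }
    }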

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143
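
(For reference: exit value 143 is 128 + 15, i.e. the JVM received SIGTERM and was terminated externally rather than exiting on its own; that is consistent with the SparkContext shutdown hook in the trace above cancelling Job 0, typically the result of a job timeout or the worker process being killed.)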

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 16s
81 actionable tasks: 52 executed, 29 from cache

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=3fe5a509-0302-4e3a-b06f-6021b61ab563, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 3867
  log file: /home/jenkins/.gradle/daemon/6.8.3/daemon-3867.out.log
----- Last  20 lines from daemon log file - daemon-3867.out.log -----
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 16s
81 actionable tasks: 52 executed, 29 from cache

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
