See <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch/141/display/redirect?page=changes>
Changes:

[samuelw] [BEAM-10808] Health checks streaming rpcs to streaming engine backend
[srohde] Gracefully shutdown the channel reader in the test_stream_impl
[ningk] [BEAM-10545] Added the inspector of PCollections and pipelines
[ningk] Simplified some of the code based on comments
[ningk] Fixed lint issues.
[srohde] Add CANCELLED to non error codes for test stream events from grpc
[Thomas Weise] [BEAM-10760] Optimize state cleanup for global window in portable Flink
[noreply] [BEAM-10773] Use the image flag before the default environment. (#12697)
[noreply] [BEAM-10807] Add scheduled mail with metrics report (#12685)
[Boyuan Zhang] Refactor split logic to reuse common logic.
[noreply] [BEAM-10567] Add LICENSE for pbr (#12765)

------------------------------------------
[...truncated 1.24 KB...]
> git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d48a4d6294a0a5db5ceae3ba79376abe412d0103 (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f d48a4d6294a0a5db5ceae3ba79376abe412d0103 # timeout=10
Commit message: "[BEAM-10567] Add LICENSE for pbr (#12765)"
> git rev-list --no-walk ca34a21c246ce41dca20a2c2276cfcb1fb45e119 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib

[EnvInject] - Variables injected successfully.
[beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch] $ /bin/bash -xe /tmp/jenkins5825178776194861477.sh
+ echo '*** Load test: 2GB of 10B records ***'
*** Load test: 2GB of 10B records ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Java_Combine_SparkStructuredStreaming_Batch/ws/src/gradlew> -PloadTest.mainClass=org.apache.beam.sdk.loadtests.CombineLoadTest -Prunner=:runners:spark -PwithDataflowWorkerJar=false '-PloadTest.args=--project=apache-beam-testing --appName=load_tests_Java_SparkStructuredStreaming_batch_Combine_1 --tempLocation=gs://temp-storage-for-perf-tests/loadtests --publishToBigQuery=true --bigQueryDataset=load_test --bigQueryTable=java_sparkstructuredstreaming_batch_Combine_1 --influxMeasurement=java_batch_combine_1 --publishToInfluxDB=true --sourceOptions={"numRecords":200000000,"keySizeBytes":1,"valueSizeBytes":9} --fanout=1 --iterations=1 --topCount=20 --perKeyCombiner=TOP_LARGEST --streaming=false --influxDatabase=beam_test_metrics --influxHost=http://10.128.0.96:8086 --runner=SparkStructuredStreamingRunner' --continue --max-****s=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g -Pdocker-pull-licenses :sdks:java:testing:load-tests:run
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.
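A note on the '-PloadTest.args' payload above: the --sourceOptions JSON is what pins the synthetic input to the "2GB of 10B records" announced by the job banner. A minimal sanity check of that arithmetic (plain Java; the class name is illustrative and not part of the Beam codebase):

    // SourceSizeCheck.java -- illustrative only, not from the Beam repo.
    public class SourceSizeCheck {
      public static void main(String[] args) {
        // Values copied from --sourceOptions in the Gradle invocation above.
        long numRecords = 200_000_000L; // "numRecords": 200000000
        int keyBytes = 1;               // "keySizeBytes": 1
        int valueBytes = 9;             // "valueSizeBytes": 9
        long totalBytes = numRecords * (keyBytes + valueBytes);
        // 200,000,000 records * 10 B/record = 2,000,000,000 B, i.e. 2 GB (decimal).
        System.out.printf("%,d records x %d B = %,d B (~%.0f GB)%n",
            numRecords, keyBytes + valueBytes, totalBytes, totalBytes / 1e9);
      }
    }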
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :runners:java-job-service:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :model:fn-execution:extractProto
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:extractProto
> Task :runners:spark:processResources
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources
> Task :model:fn-execution:processResources
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto
> Task :model:fn-execution:generateProto
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar
> Task :model:job-management:shadowJar
> Task :model:fn-execution:shadowJar
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar
> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:local-java:jar
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:fn-execution:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:spark:compileJava FROM-CACHE
> Task :runners:spark:classes
> Task :runners:spark:jar
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:direct-java:shadowJar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :sdks:java:testing:load-tests:run
20/09/03 12:30:48 INFO org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner: *** SparkStructuredStreamingRunner is based on spark structured streaming framework and is no more based on RDD/DStream API. See https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html It is still experimental, its coverage of the Beam model is partial. ***
20/09/03 12:30:49 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/09/03 12:30:50 INFO org.apache.beam.runners.spark.structuredstreaming.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
20/09/03 12:30:50 INFO org.apache.beam.runners.spark.structuredstreaming.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:
Exception in thread "main" java.lang.RuntimeException: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.runtimeExceptionFrom(SparkStructuredStreamingPipelineResult.java:58)
	at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.beamExceptionFrom(SparkStructuredStreamingPipelineResult.java:75)
	at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingPipelineResult.waitUntilFinish(SparkStructuredStreamingPipelineResult.java:124)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:127)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
	at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:933)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:931)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:931)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2130)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2043)
	at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:738)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:972)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:970)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.foreach(RDD.scala:970)
	at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply$mcV$sp(Dataset.scala:2722)
	at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply(Dataset.scala:2722)
	at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply(Dataset.scala:2722)
	at org.apache.spark.sql.Dataset$$anonfun$withNewRDDExecutionId$1.apply(Dataset.scala:3355)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
	at org.apache.spark.sql.Dataset.withNewRDDExecutionId(Dataset.scala:3351)
	at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2721)
	at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2732)
	at org.apache.beam.runners.spark.structuredstreaming.translation.TranslationContext.startPipeline(TranslationContext.java:224)
	at org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner.lambda$run$0(SparkStructuredStreamingRunner.java:155)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
67 actionable tasks: 43 executed, 24 from cache

Publishing build scan...
https://gradle.com/s/vdapopmepxo7a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
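Triage note (an inference from the exit code, not something stated in the log): Gradle reports the JVM finished with exit value 143, and the Spark trace above shows the job was cancelled from a shutdown hook. By the usual POSIX convention a process killed by signal n exits with status 128 + n, so 143 = 128 + 15 (SIGTERM); that suggests the JVM was terminated externally (for example a Jenkins abort or build timeout) rather than failing on a pipeline-level error. A trivial check of that arithmetic (illustrative Java, not from the log):

    // ExitCode143.java -- illustrative only; assumes the 128 + signal-number
    // convention for processes killed by a signal.
    public class ExitCode143 {
      public static void main(String[] args) {
        int exitValue = 143;
        int sigterm = 15; // SIGTERM signal number
        System.out.println("SIGTERM? " + (exitValue == 128 + sigterm)); // prints: SIGTERM? true
      }
    }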
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
