See 
<https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/114/display/redirect>

------------------------------------------
[...truncated 1.85 MB...]
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 22.0 (TID 104) in 141 ms on localhost 
(executor driver) (3/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 22.0 (TID 107). 13259 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 22.0 (TID 107) in 144 ms on localhost 
(executor driver) (4/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 22.0, whose tasks have all completed, from pool 
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 22 (repartition at GroupCombineFunctions.java:242) 
finished in 0.114 s
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 23)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 23 (MapPartitionsRDD[186] at map at 
TranslationUtils.java:129), which has no missing parents
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_10 stored as values in memory (estimated size 82.6 KB, 
free 1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_10_piece0 stored as bytes in memory (estimated size 21.9 
KB, free 1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_10_piece0 in memory on 127.0.0.1:38620 (size: 21.9 KB, 
free: 1818.0 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 10 from broadcast at DAGScheduler.scala:1006
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 23 (MapPartitionsRDD[186] at 
map at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 
1, 2, 3))
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 23.0 with 4 tasks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 23.0 (TID 108, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 23.0 (TID 109, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 23.0 (TID 110, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 23.0 (TID 111, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 23.0 (TID 109)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 23.0 (TID 110)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 23.0 (TID 111)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 23.0 (TID 108)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_183_2 stored as bytes in memory (estimated size 4.0 B, free 
1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_183_2 in memory on 127.0.0.1:38620 (size: 4.0 B, free: 1818.0 
MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 23.0 (TID 110). 13025 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 23.0 (TID 110) in 19 ms on localhost (executor 
driver) (1/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_183_1 stored as bytes in memory (estimated size 4.0 B, free 
1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_183_3 stored as bytes in memory (estimated size 4.0 B, free 
1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_183_3 in memory on 127.0.0.1:38620 (size: 4.0 B, free: 1818.0 
MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_183_1 in memory on 127.0.0.1:38620 (size: 4.0 B, free: 1818.0 
MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 23.0 (TID 111). 13025 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_183_0 stored as bytes in memory (estimated size 4.0 B, free 
1817.2 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 23.0 (TID 109). 13025 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 23.0 (TID 111) in 29 ms on localhost (executor 
driver) (2/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_183_0 in memory on 127.0.0.1:38620 (size: 4.0 B, free: 1818.0 
MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 23.0 (TID 109) in 33 ms on localhost (executor 
driver) (3/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 23.0 (TID 108). 13025 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 23.0 (TID 108) in 37 ms on localhost (executor 
driver) (4/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 23.0, whose tasks have all completed, from pool 
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 23 (foreach at UnboundedDataset.java:81) finished in 0.034 s
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 4 finished: foreach at UnboundedDataset.java:81, took 0.488617 s
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished job streaming job 1520481191000 ms.2 from job set of time 
1520481191000 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job streaming job 1520481191000 ms.3 from job set of time 
1520481191000 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: foreach at UnboundedDataset.java:81
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Size of output statuses for shuffle 1 is 83 bytes
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Size of output statuses for shuffle 0 is 83 bytes
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Size of output statuses for shuffle 7 is 193 bytes
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Size of output statuses for shuffle 6 is 195 bytes
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Size of output statuses for shuffle 5 is 152 bytes
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 5 (foreach at UnboundedDataset.java:81) with 4 output partitions
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 29 (foreach at UnboundedDataset.java:81)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List(ShuffleMapStage 28)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Missing parents: List()
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 29 (MapPartitionsRDD[188] at map at 
TranslationUtils.java:129), which has no missing parents
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_11 stored as values in memory (estimated size 82.6 KB, 
free 1817.1 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_11_piece0 stored as bytes in memory (estimated size 21.9 
KB, free 1817.1 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_11_piece0 in memory on 127.0.0.1:38620 (size: 21.9 KB, 
free: 1817.9 MB)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 11 from broadcast at DAGScheduler.scala:1006
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 29 (MapPartitionsRDD[188] at 
map at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 
1, 2, 3))
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 29.0 with 4 tasks
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 29.0 (TID 112, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 29.0 (TID 113, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 29.0 (TID 114, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 29.0 (TID 115, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 4897 bytes)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 29.0 (TID 114)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 29.0 (TID 112)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 29.0 (TID 115)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 29.0 (TID 113)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Found block rdd_183_2 locally
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Found block rdd_183_3 locally
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 29.0 (TID 114). 12129 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 29.0 (TID 114) in 17 ms on localhost (executor 
driver) (1/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Found block rdd_183_1 locally
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 29.0 (TID 113). 12129 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 29.0 (TID 113) in 21 ms on localhost (executor 
driver) (2/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 29.0 (TID 115). 12129 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Found block rdd_183_0 locally
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 29.0 (TID 115) in 23 ms on localhost (executor 
driver) (3/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 29.0 (TID 112). 12129 bytes result sent to 
driver
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 29.0 (TID 112) in 28 ms on localhost (executor 
driver) (4/4)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 29.0, whose tasks have all completed, from pool 
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 29 (foreach at UnboundedDataset.java:81) finished in 0.026 s
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 5 finished: foreach at UnboundedDataset.java:81, took 0.056195 s
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished job streaming job 1520481191000 ms.3 from job set of time 
1520481191000 ms
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Total delay: 23.185 s for time 1520481191000 ms (execution: 2.139 s)
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped JobScheduler
Mar 08, 2018 3:53:34 AM org.spark_project.jetty.server.handler.ContextHandler 
doStop
INFO: Stopped 
o.s.j.s.ServletContextHandler@63518a57{/streaming,null,UNAVAILABLE,@Spark}
Mar 08, 2018 3:53:34 AM org.spark_project.jetty.server.handler.ContextHandler 
doStop
INFO: Stopped 
o.s.j.s.ServletContextHandler@6e1786f7{/streaming/batch,null,UNAVAILABLE,@Spark}
Mar 08, 2018 3:53:34 AM org.spark_project.jetty.server.handler.ContextHandler 
doStop
INFO: Stopped 
o.s.j.s.ServletContextHandler@7534dbd6{/static/streaming,null,UNAVAILABLE,@Spark}
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: StreamingContext stopped successfully
Mar 08, 2018 3:53:34 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@68c1e134{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 08, 2018 3:53:34 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java 
(default-cli) on project word-count-beam: An exception occured while executing 
the Java class. Failed to wait the pipeline until finish: 
org.apache.beam.runners.spark.SparkPipelineResult$StreamingMode@693d1112 -> 
[Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
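
For context, the "Failed to wait the pipeline until finish" message above
presumably wraps the example pipeline's blocking wait on the Spark runner. A
minimal sketch of that pattern in the Beam Java SDK follows; the class name is
hypothetical and this is not the code of the failing example itself:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RunOnSparkSketch {  // hypothetical name, for illustration only
      public static void main(String[] args) {
        // The runner is selected on the command line, e.g. --runner=SparkRunner.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline pipeline = Pipeline.create(options);

        // ... build the pipeline here (reads, transforms, writes) ...

        // run() submits the pipeline and returns a PipelineResult;
        // waitUntilFinish() blocks until it terminates. The error above
        // appears to report a failure from this blocking wait (an assumption
        // based on the message wording, not on the test harness code).
        PipelineResult result = pipeline.run();
        result.waitUntilFinish();
      }
    }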

***********************************************************
***********************************************************
*************************Tear Down*************************
The Pub/Sub topic has been deleted: 
projects/apache-beam-testing/topics/leaderboard-jenkins-0308035304-6f139be4
The Pub/Sub subscription has been deleted: 
projects/apache-beam-testing/subscriptions/leaderboard-jenkins-0308035304-6f139be4
***********************************************************
***********************************************************
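
The teardown above removes the temporary Pub/Sub topic and subscription that
the leaderboard test created. The harness most likely does this via gcloud or
the Pub/Sub admin API; a rough Java equivalent using the google-cloud-pubsub
client is sketched below (resource names are placeholders, and this is not the
harness code itself):

    import com.google.cloud.pubsub.v1.SubscriptionAdminClient;
    import com.google.cloud.pubsub.v1.TopicAdminClient;

    public class PubSubTeardownSketch {  // hypothetical helper class
      public static void main(String[] args) throws Exception {
        // Fully qualified resource names, in the same format the teardown prints.
        String topic = "projects/my-project/topics/my-leaderboard-topic";
        String subscription = "projects/my-project/subscriptions/my-leaderboard-sub";

        try (SubscriptionAdminClient subs = SubscriptionAdminClient.create();
             TopicAdminClient topics = TopicAdminClient.create()) {
          subs.deleteSubscription(subscription);  // delete the subscription first
          topics.deleteTopic(topic);              // then delete the topic
        }
      }
    }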
[ERROR] Failed command
:runners:spark:runMobileGamingJavaSpark FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:spark:runMobileGamingJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' 
> finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
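
For anyone reproducing this locally, the failing task can be re-run with the
extra logging Gradle suggests, along the lines of the command below (assuming
the standard Gradle wrapper in the Beam checkout; the exact invocation may
differ):

    ./gradlew :runners:spark:runMobileGamingJavaSpark --stacktrace --info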

BUILD FAILED in 3m 44s
2 actionable tasks: 2 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
