See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1165/display/redirect>

Changes:


------------------------------------------
[...truncated 3.78 MB...]
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 4.0 (TID 20) in 15 ms on localhost (executor 
driver) (3/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 4.0 (TID 23) in 15 ms on localhost (executor 
driver) (4/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 4.0, whose tasks have all completed, from pool 
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 4 (repartition at GroupCombineFunctions.java:191) 
finished in 0.033 s
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ShuffleMapStage 5, ResultStage 6)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ShuffleMapStage 5 (MapPartitionsRDD[106] at repartition at 
GroupCombineFunctions.java:191), which has no missing parents
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_6 stored as values in memory (estimated size 26.3 KB, 
free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_6_piece0 stored as bytes in memory (estimated size 11.5 
KB, free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_6_piece0 in memory on localhost:39483 (size: 11.5 KB, 
free: 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 6 from broadcast at DAGScheduler.scala:1184
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ShuffleMapStage 5 (MapPartitionsRDD[106] 
at repartition at GroupCombineFunctions.java:191) (first 15 tasks are for 
partitions Vector(0, 1, 2, 3))
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 5.0 with 4 tasks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 5.0 (TID 24, localhost, executor driver, 
partition 0, NODE_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 5.0 (TID 25, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 5.0 (TID 26, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 5.0 (TID 27, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 7927 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 5.0 (TID 24)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 5.0 (TID 25)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 5.0 (TID 26)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 5.0 (TID 27)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26). 5149 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27). 5149 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25). 5149 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 5.0 (TID 25) in 41 ms on localhost (executor 
driver) (1/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 5.0 (TID 26) in 40 ms on localhost (executor 
driver) (2/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 5.0 (TID 27) in 40 ms on localhost (executor 
driver) (3/4)
Nov 09, 2020 11:08:57 AM 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/a2c4021d-e627-44b3-9aa2-0b7946612187,
 shard=0, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00000-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/eda1c5db-edda-4457-8b52-031831b617da,
 shard=1, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00001-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/ab5e0046-01ea-4b20-8624-33be88fa1967,
 shard=2, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00002-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/46b90119-9cf9-4201-936a-541ccff84789,
 shard=3, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7cb5b771, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/counts-00003-of-00004
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/46b90119-9cf9-4201-936a-541ccff84789
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/eda1c5db-edda-4457-8b52-031831b617da
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/ab5e0046-01ea-4b20-8624-33be88fa1967
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/a2c4021d-e627-44b3-9aa2-0b7946612187
Nov 09, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
WARNING: Failed to match temporary files under: 
[/tmp/groovy-generated-1893103289173540452-tmpdir/word-count-beam/.temp-beam-e2069e59-61a8-4264-b4c2-dff7fba8604d/].
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12857 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 104 ms on localhost (executor 
driver) (4/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) 
finished in 0.114 s
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output
 MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no 
missing parents
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, 
free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 
KB, free 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:39483 (size: 7.3 KB, 
free: 13.5 GB)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output
 MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 
tasks are for partitions Vector(0, 1, 2, 3))
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, 
partition 0, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, 
partition 1, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, 
partition 2, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, 
partition 3, NODE_LOCAL, 7938 bytes)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6453 bytes result sent to driver
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 14 ms on localhost (executor 
driver) (1/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 14 ms on localhost (executor 
driver) (2/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 16 ms on localhost (executor 
driver) (3/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 17 ms on localhost (executor 
driver) (4/4)
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.026 s
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.189524 s
Nov 09, 2020 11:08:57 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 09, 2020 11:08:57 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@2221df2b{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 09, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with 
> non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 40s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/2yolp7tbrnlko

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
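
For local reproduction, the three failing tasks can be re-run individually with the flags Gradle suggests above. A sketch, assuming a checkout of the apache/beam repository with its Gradle wrapper (the CI job may also supply additional -P properties, such as GCP project settings, that are not shown here):

  ./gradlew :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow --stacktrace --info
  ./gradlew :runners:direct-java:runMobileGamingJavaDirect --stacktrace --info
  ./gradlew :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow --stacktrace --info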
