See 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/1277/display/redirect?page=changes>

Changes:

[herohde] [BEAM-3286] Add Go support for side input

[herohde] CR: address review comments

------------------------------------------
[...truncated 27.11 MB...]
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_95_piece0 in memory on localhost:35939 (size: 53.4 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 95 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ResultStage 438 (MapPartitionsRDD[2458] at map 
at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 
2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 438.0 with 4 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 438.0 (TID 411, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8248 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 438.0 (TID 412, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8248 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 438.0 (TID 413, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8248 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 438.0 (TID 414, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8248 bytes)
    [Executor task launch worker for task 412] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 438.0 (TID 412)
    [Executor task launch worker for task 414] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 438.0 (TID 414)
    [Executor task launch worker for task 411] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 438.0 (TID 411)
    [Executor task launch worker for task 413] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 438.0 (TID 413)
    [Executor task launch worker for task 412] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 413] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 412] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 413] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 413] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2126_2 locally
    [Executor task launch worker for task 412] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2126_1 locally
    [Executor task launch worker for task 414] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 414] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 1 ms
    [Executor task launch worker for task 414] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2126_3 locally
    [Executor task launch worker for task 412] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2441_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 413] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2441_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 414] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2441_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2441_1 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2441_3 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2441_2 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 413] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 438.0 (TID 
413). 59881 bytes result sent to driver
    [Executor task launch worker for task 412] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 438.0 (TID 
412). 59881 bytes result sent to driver
    [Executor task launch worker for task 414] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 438.0 (TID 
414). 59881 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 438.0 (TID 412) in 15 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 438.0 (TID 413) in 15 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 438.0 (TID 414) in 15 ms on localhost (executor 
driver) (3/4)
    [Executor task launch worker for task 411] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 411] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 1 ms
    [Executor task launch worker for task 411] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2126_0 locally
    [Executor task launch worker for task 411] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2441_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2441_0 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 411] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 438.0 (TID 
411). 59881 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 438.0 (TID 411) in 24 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 438.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ResultStage 438 (foreach at UnboundedDataset.java:80) finished in 0.037 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - 
Job 30 finished: foreach at UnboundedDataset.java:80, took 0.117262 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Finished job streaming job 1534636630500 ms.2 from job set of time 
1534636630500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Starting job streaming job 1534636630500 ms.3 from job set of time 
1534636630500 ms
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting 
job: foreach at UnboundedDataset.java:80
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2472 (mapToPair at GroupCombineFunctions.java:54)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2500 (mapToPair at GroupCombineFunctions.java:54)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 31 (foreach at UnboundedDataset.java:80) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 463 (foreach at UnboundedDataset.java:80)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 452, ShuffleMapStage 462, 
ShuffleMapStage 459, ShuffleMapStage 460, ShuffleMapStage 461, ShuffleMapStage 
453, ShuffleMapStage 447, ShuffleMapStage 454)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 459)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 457 (MapPartitionsRDD[2472] at mapToPair at 
GroupCombineFunctions.java:54), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_96 stored as values in memory (estimated size 158.7 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_96_piece0 stored as bytes in memory (estimated size 34.2 KB, 
free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_96_piece0 in memory on localhost:35939 (size: 34.2 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 96 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 457 (MapPartitionsRDD[2472] at 
mapToPair at GroupCombineFunctions.java:54) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 457.0 with 4 
tasks
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 457.0 (TID 415, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8237 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 457.0 (TID 416, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8237 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 457.0 (TID 417, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8237 bytes)
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 457.0 (TID 418, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8237 bytes)
    [Executor task launch worker for task 415] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 457.0 (TID 415)
    [Executor task launch worker for task 416] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 457.0 (TID 416)
    [Executor task launch worker for task 418] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 457.0 (TID 418)
    [Executor task launch worker for task 417] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 457.0 (TID 417)
    [Executor task launch worker for task 416] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_1 locally
    [Executor task launch worker for task 418] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_3 locally
    [Executor task launch worker for task 415] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_0 locally
    [Executor task launch worker for task 417] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_2 locally
    [Executor task launch worker for task 416] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 457.0 (TID 
416). 59466 bytes result sent to driver
    [Executor task launch worker for task 418] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 457.0 (TID 
418). 59466 bytes result sent to driver
    [Executor task launch worker for task 415] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 457.0 (TID 
415). 59466 bytes result sent to driver
    [Executor task launch worker for task 417] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 457.0 (TID 
417). 59509 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 457.0 (TID 418) in 13 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 457.0 (TID 416) in 13 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 457.0 (TID 417) in 15 ms on localhost (executor 
driver) (3/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 457.0 (TID 415) in 15 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 457.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 457 (mapToPair at GroupCombineFunctions.java:54) finished in 
0.025 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ShuffleMapStage 459, ResultStage 463)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 459 (MapPartitionsRDD[2500] at mapToPair at 
GroupCombineFunctions.java:54), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_97 stored as values in memory (estimated size 192.1 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_97_piece0 stored as bytes in memory (estimated size 42.7 KB, 
free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_97_piece0 in memory on localhost:35939 (size: 42.7 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 97 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 5 missing tasks from ShuffleMapStage 459 (MapPartitionsRDD[2500] at 
mapToPair at GroupCombineFunctions.java:54) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 459.0 with 5 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 459.0 (TID 419, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 459.0 (TID 420, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 459.0 (TID 421, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8376 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 459.0 (TID 422, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8376 bytes)
    [Executor task launch worker for task 421] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 459.0 (TID 421)
    [Executor task launch worker for task 420] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 459.0 (TID 420)
    [Executor task launch worker for task 422] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 459.0 (TID 422)
    [Executor task launch worker for task 419] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 459.0 (TID 419)
    [Executor task launch worker for task 419] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 420] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 419] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 422] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 422] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 420] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 419] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2169_0 locally
    [Executor task launch worker for task 422] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2169_3 locally
    [Executor task launch worker for task 420] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2169_1 locally
    [Executor task launch worker for task 421] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 421] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 419] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 422] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 420] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 421] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2169_2 locally
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2484_0 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2484_1 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2484_3 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 421] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2484_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2484_2 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 422] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 459.0 (TID 
422). 59939 bytes result sent to driver
    [Executor task launch worker for task 419] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 459.0 (TID 
419). 59939 bytes result sent to driver
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 4.0 in stage 459.0 (TID 423, localhost, executor driver, 
partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 423] INFO 
org.apache.spark.executor.Executor - Running task 4.0 in stage 459.0 (TID 423)
    [Executor task launch worker for task 421] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 459.0 (TID 
421). 59939 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 459.0 (TID 421) in 16 ms on localhost (executor 
driver) (1/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 459.0 (TID 422) in 16 ms on localhost (executor 
driver) (2/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 459.0 (TID 419) in 16 ms on localhost (executor 
driver) (3/5)
    [Executor task launch worker for task 420] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 459.0 (TID 
420). 59939 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 459.0 (TID 420) in 19 ms on localhost (executor 
driver) (4/5)
    [Executor task launch worker for task 423] INFO 
org.apache.spark.executor.Executor - Finished task 4.0 in stage 459.0 (TID 
423). 59466 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 4.0 in stage 459.0 (TID 423) in 23 ms on localhost (executor 
driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 459.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 459 (mapToPair at GroupCombineFunctions.java:54) finished in 
0.054 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ResultStage 463)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ResultStage 463 (MapPartitionsRDD[2529] at map at 
TranslationUtils.java:129), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_98 stored as values in memory (estimated size 226.2 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_98_piece0 stored as bytes in memory (estimated size 53.5 KB, 
free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_98_piece0 in memory on localhost:35939 (size: 53.5 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 98 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ResultStage 463 (MapPartitionsRDD[2529] at map 
at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 
2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 463.0 with 4 
tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 463.0 (TID 424, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8248 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 463.0 (TID 425, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8248 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 463.0 (TID 426, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8248 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 463.0 (TID 427, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8248 bytes)
    [Executor task launch worker for task 424] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 463.0 (TID 424)
    [Executor task launch worker for task 425] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 463.0 (TID 425)
    [Executor task launch worker for task 426] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 463.0 (TID 426)
    [Executor task launch worker for task 427] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 463.0 (TID 427)
    [Executor task launch worker for task 426] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 424] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 426] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 426] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2197_2 locally
    [Executor task launch worker for task 424] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 1 ms
    [Executor task launch worker for task 426] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 424] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2197_0 locally
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2512_2 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 424] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 427] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 427] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 427] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2197_3 locally
    [Executor task launch worker for task 425] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 425] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 1 ms
    [Executor task launch worker for task 425] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2197_1 locally
    [Executor task launch worker for task 427] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2512_3 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 425] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2512_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2512_1 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2512_0 in memory on localhost:35939 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 425] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 463.0 (TID 
425). 59881 bytes result sent to driver
    [Executor task launch worker for task 424] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 463.0 (TID 
424). 59881 bytes result sent to driver
    [Executor task launch worker for task 427] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 463.0 (TID 
427). 59881 bytes result sent to driver
    [Executor task launch worker for task 426] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 463.0 (TID 
426). 59881 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 463.0 (TID 427) in 27 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 463.0 (TID 424) in 29 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 463.0 (TID 425) in 29 ms on localhost (executor 
driver) (3/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 463.0 (TID 426) in 28 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 463.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ResultStage 463 (foreach at UnboundedDataset.java:80) finished in 0.060 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - 
Job 31 finished: foreach at UnboundedDataset.java:80, took 0.150650 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Finished job streaming job 1534636630500 ms.3 from job set of time 
1534636630500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Total delay: 9.352 s for time 1534636630500 ms (execution: 0.534 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@4d83387a{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@126c1134{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@35ed6f03{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - 
StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - 
Stopped Spark@58b8b4ef{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at 
http://localhost:4040
    [dispatcher-event-loop-1] INFO 
org.apache.spark.MapOutputTrackerMasterEndpoint - 
MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - 
MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager 
stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - 
BlockManagerMaster stopped
    [dispatcher-event-loop-2] INFO 
org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
 - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped 
SparkContext

Gradle Test Executor 278 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory 
/tmp/spark-afb2bffe-5c12-4322-a5dc-9f70e8b4dfc2

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest 
> testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - 
Stopped Spark@7e341a29{HTTP/1.1,[http/1.1]}{127.0.0.1:4042}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at 
http://localhost:4042
    [dispatcher-event-loop-2] INFO 
org.apache.spark.MapOutputTrackerMasterEndpoint - 
MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - 
MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager 
stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - 
BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO 
org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
 - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped 
SparkContext

Gradle Test Executor 280 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory 
/tmp/spark-b02889d8-e3b5-46e8-b0af-6eda18bc8974
Finished generating test XML results (0.103 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.089 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 
7,5,main]) completed. Took 10 mins 14.37 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
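For reference, a failure like this can usually be reproduced locally by re-running only the failing task with the diagnostic flags Gradle suggests above. The task name is taken from the log; a local checkout of apache/beam at the revision under test is assumed:

```shell
# Assumes a local clone of apache/beam checked out at the revision under test.
# Re-run just the failing task with a stack trace and verbose logging:
./gradlew :beam-runners-spark:validatesRunnerBatch --stacktrace --info

# Optionally publish a build scan for full insights:
./gradlew :beam-runners-spark:validatesRunnerBatch --scan
```

The HTML test report linked in the "What went wrong" section identifies the individual failing tests.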

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 5.0.
See 
https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 47s
40 actionable tasks: 32 executed, 8 from cache

Publishing build scan...
https://gradle.com/s/qdqgrjnhgthns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
