See 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/1707/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-5355] Add GroupByKeyLoadTest

------------------------------------------
[...truncated 29.39 MB...]
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ResultStage 341 (MapPartitionsRDD[2143] at map 
at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 
2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 341.0 with 4 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 341.0 (TID 359, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8188 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 341.0 (TID 360, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8188 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 341.0 (TID 361, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8188 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 341.0 (TID 362, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8188 bytes)
    [Executor task launch worker for task 359] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 341.0 (TID 359)
    [Executor task launch worker for task 361] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 341.0 (TID 361)
    [Executor task launch worker for task 360] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 341.0 (TID 360)
    [Executor task launch worker for task 362] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 341.0 (TID 362)
    [Executor task launch worker for task 359] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 359] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 359] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1811_0 locally
    [Executor task launch worker for task 359] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2126_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2126_0 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 360] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 360] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 360] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1811_1 locally
    [Executor task launch worker for task 360] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2126_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2126_1 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 361] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 361] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 361] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1811_2 locally
    [Executor task launch worker for task 362] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 361] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2126_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 362] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2126_2 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 359] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 341.0 (TID 
359). 59881 bytes result sent to driver
    [Executor task launch worker for task 362] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1811_3 locally
    [Executor task launch worker for task 362] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2126_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2126_3 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 360] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 341.0 (TID 
360). 59881 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 341.0 (TID 359) in 18 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 341.0 (TID 360) in 18 ms on localhost (executor 
driver) (2/4)
    [Executor task launch worker for task 361] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 341.0 (TID 
361). 59881 bytes result sent to driver
    [Executor task launch worker for task 362] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 341.0 (TID 
362). 59881 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 341.0 (TID 361) in 19 ms on localhost (executor 
driver) (3/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 341.0 (TID 362) in 21 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 341.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ResultStage 341 (foreach at UnboundedDataset.java:80) finished in 0.035 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - 
Job 26 finished: foreach at UnboundedDataset.java:80, took 0.108219 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Finished job streaming job 1538753791500 ms.2 from job set of time 
1538753791500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Starting job streaming job 1538753791500 ms.3 from job set of time 
1538753791500 ms
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting 
job: foreach at UnboundedDataset.java:80
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2157 (mapToPair at GroupCombineFunctions.java:56)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2185 (mapToPair at GroupCombineFunctions.java:56)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 27 (foreach at UnboundedDataset.java:80) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 363 (foreach at UnboundedDataset.java:80)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 350, ShuffleMapStage 361, 
ShuffleMapStage 362, ShuffleMapStage 359, ShuffleMapStage 351, ShuffleMapStage 
360, ShuffleMapStage 352)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 359)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 357 (MapPartitionsRDD[2157] at mapToPair at 
GroupCombineFunctions.java:56), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_84 stored as values in memory (estimated size 156.8 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_84_piece0 stored as bytes in memory (estimated size 33.8 KB, 
free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_84_piece0 in memory on localhost:40535 (size: 33.8 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 84 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 357 (MapPartitionsRDD[2157] at 
mapToPair at GroupCombineFunctions.java:56) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 357.0 with 4 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 357.0 (TID 363, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8177 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 357.0 (TID 364, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8177 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 357.0 (TID 365, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8177 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 357.0 (TID 366, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8177 bytes)
    [Executor task launch worker for task 366] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 357.0 (TID 366)
    [Executor task launch worker for task 365] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 357.0 (TID 365)
    [Executor task launch worker for task 364] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 357.0 (TID 364)
    [Executor task launch worker for task 363] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 357.0 (TID 363)
    [Executor task launch worker for task 365] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1929_2 locally
    [Executor task launch worker for task 364] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1929_1 locally
    [Executor task launch worker for task 366] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1929_3 locally
    [Executor task launch worker for task 363] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1929_0 locally
    [Executor task launch worker for task 365] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 357.0 (TID 
365). 59509 bytes result sent to driver
    [Executor task launch worker for task 366] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 357.0 (TID 
366). 59509 bytes result sent to driver
    [Executor task launch worker for task 364] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 357.0 (TID 
364). 59509 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 357.0 (TID 365) in 11 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 357.0 (TID 366) in 12 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 357.0 (TID 364) in 12 ms on localhost (executor 
driver) (3/4)
    [Executor task launch worker for task 363] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 357.0 (TID 
363). 59509 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 357.0 (TID 363) in 16 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 357.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 357 (mapToPair at GroupCombineFunctions.java:56) finished in 
0.025 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ShuffleMapStage 359, ResultStage 363)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 359 (MapPartitionsRDD[2185] at mapToPair at 
GroupCombineFunctions.java:56), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_85 stored as values in memory (estimated size 186.7 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_85_piece0 stored as bytes in memory (estimated size 41.2 KB, 
free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_85_piece0 in memory on localhost:40535 (size: 41.2 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 85 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 5 missing tasks from ShuffleMapStage 359 (MapPartitionsRDD[2185] at 
mapToPair at GroupCombineFunctions.java:56) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 359.0 with 5 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 359.0 (TID 367, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8316 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 359.0 (TID 368, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8316 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 359.0 (TID 369, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8316 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 359.0 (TID 370, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8316 bytes)
    [Executor task launch worker for task 367] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 359.0 (TID 367)
    [Executor task launch worker for task 368] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 359.0 (TID 368)
    [Executor task launch worker for task 369] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 359.0 (TID 369)
    [Executor task launch worker for task 370] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 359.0 (TID 370)
    [Executor task launch worker for task 367] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 368] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 370] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 367] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 370] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 368] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 370] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1854_3 locally
    [Executor task launch worker for task 367] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1854_0 locally
    [Executor task launch worker for task 368] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1854_1 locally
    [Executor task launch worker for task 370] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2169_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 368] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2169_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 367] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2169_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2169_3 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2169_1 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2169_0 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 369] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 369] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 369] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1854_2 locally
    [Executor task launch worker for task 369] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2169_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2169_2 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 370] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 359.0 (TID 
370). 59939 bytes result sent to driver
    [dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 4.0 in stage 359.0 (TID 371, localhost, executor driver, 
partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 371] INFO 
org.apache.spark.executor.Executor - Running task 4.0 in stage 359.0 (TID 371)
    [Executor task launch worker for task 367] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 359.0 (TID 
367). 59939 bytes result sent to driver
    [Executor task launch worker for task 368] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 359.0 (TID 
368). 59939 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 359.0 (TID 370) in 13 ms on localhost (executor 
driver) (1/5)
    [Executor task launch worker for task 369] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 359.0 (TID 
369). 59939 bytes result sent to driver
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 359.0 (TID 367) in 15 ms on localhost (executor 
driver) (2/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 359.0 (TID 368) in 15 ms on localhost (executor 
driver) (3/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 359.0 (TID 369) in 16 ms on localhost (executor 
driver) (4/5)
    [Executor task launch worker for task 371] INFO 
org.apache.spark.executor.Executor - Finished task 4.0 in stage 359.0 (TID 
371). 59466 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 4.0 in stage 359.0 (TID 371) in 12 ms on localhost (executor 
driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 359.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 359 (mapToPair at GroupCombineFunctions.java:56) finished in 
0.031 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ResultStage 363)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ResultStage 363 (MapPartitionsRDD[2214] at map at 
TranslationUtils.java:129), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_86 stored as values in memory (estimated size 218.0 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_86_piece0 stored as bytes in memory (estimated size 51.2 KB, 
free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_86_piece0 in memory on localhost:40535 (size: 51.2 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 86 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ResultStage 363 (MapPartitionsRDD[2214] at map 
at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 
2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 363.0 with 4 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 363.0 (TID 372, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8188 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 363.0 (TID 373, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8188 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 363.0 (TID 374, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8188 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 363.0 (TID 375, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8188 bytes)
    [Executor task launch worker for task 372] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 363.0 (TID 372)
    [Executor task launch worker for task 373] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 363.0 (TID 373)
    [Executor task launch worker for task 374] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 363.0 (TID 374)
    [Executor task launch worker for task 375] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 363.0 (TID 375)
    [Executor task launch worker for task 372] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 372] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 372] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1882_0 locally
    [Executor task launch worker for task 372] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2197_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2197_0 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 373] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 373] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 374] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 375] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 373] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1882_1 locally
    [Executor task launch worker for task 374] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 375] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 375] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1882_3 locally
    [Executor task launch worker for task 374] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_1882_2 locally
    [Executor task launch worker for task 373] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2197_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 375] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2197_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 374] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2197_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2197_1 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2197_3 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 372] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 363.0 (TID 
372). 59881 bytes result sent to driver
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2197_2 in memory on localhost:40535 (size: 4.0 B, free: 13.5 GB)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 363.0 (TID 372) in 16 ms on localhost (executor 
driver) (1/4)
    [Executor task launch worker for task 373] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 363.0 (TID 
373). 59881 bytes result sent to driver
    [Executor task launch worker for task 375] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 363.0 (TID 
375). 59881 bytes result sent to driver
    [Executor task launch worker for task 374] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 363.0 (TID 
374). 59881 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 363.0 (TID 375) in 18 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 363.0 (TID 373) in 20 ms on localhost (executor 
driver) (3/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 363.0 (TID 374) in 19 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 363.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ResultStage 363 (foreach at UnboundedDataset.java:80) finished in 0.038 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - 
Job 27 finished: foreach at UnboundedDataset.java:80, took 0.104737 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Finished job streaming job 1538753791500 ms.3 from job set of time 
1538753791500 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Total delay: 9.237 s for time 1538753791500 ms (execution: 0.456 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@2152afcd{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@dc948c1{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@20836caa{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - 
StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - 
Stopped Spark@7482bdaf{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at 
http://localhost:4040
    [dispatcher-event-loop-2] INFO 
org.apache.spark.MapOutputTrackerMasterEndpoint - 
MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - 
MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager 
stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - 
BlockManagerMaster stopped
    [dispatcher-event-loop-0] INFO 
org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
 - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped 
SparkContext

Gradle Test Executor 286 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory 
/tmp/spark-e205209d-4e38-4755-a486-535703b568bf

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest 
> testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - 
Stopped Spark@3ee51d2c{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at 
http://localhost:4041
    [dispatcher-event-loop-3] INFO 
org.apache.spark.MapOutputTrackerMasterEndpoint - 
MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - 
MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager 
stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - 
BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO 
org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
 - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped 
SparkContext

Gradle Test Executor 288 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory 
/tmp/spark-c8b34092-2531-4c99-8558-e02d4406714c
Finished generating test XML results (0.106 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.106 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
Invalidating in-memory cache of 
/home/jenkins/.gradle/caches/journal-1/file-access.bin
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 
4,5,main]) completed. Took 10 mins 34.713 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 17m 33s
40 actionable tasks: 36 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/vvepurj2cxcak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure