See 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/2179/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

------------------------------------------
[...truncated 29.79 MB...]
    [streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting 
job: foreach at UnboundedDataset.java:79
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2824 (mapPartitionsToPair at 
SparkGroupAlsoByWindowViaWindowSet.java:564)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 832, ShuffleMapStage 818, 
ShuffleMapStage 836, ShuffleMapStage 822, ShuffleMapStage 816, ShuffleMapStage 
834, ShuffleMapStage 820, ShuffleMapStage 838, ShuffleMapStage 824)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 832)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 829 (MapPartitionsRDD[2787] at mapToPair at 
GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.6 KB, 
free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_143_piece0 in memory on localhost:44567 (size: 54.6 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 143 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 829 (MapPartitionsRDD[2787] at 
mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 829.0 with 4 
tasks
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 829.0 (TID 642, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 829.0 (TID 643, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 829.0 (TID 644, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8165 bytes)
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 829.0 (TID 645, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8165 bytes)
    [Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 829.0 (TID 643)
    [Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 829.0 (TID 644)
    [Executor task launch worker for task 642] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 829.0 (TID 642)
    [Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 829.0 (TID 645)
    [Executor task launch worker for task 645] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
    [Executor task launch worker for task 642] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
    [Executor task launch worker for task 643] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
    [Executor task launch worker for task 644] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
    [Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 829.0 (TID 
645). 59466 bytes result sent to driver
    [Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 829.0 (TID 
644). 59466 bytes result sent to driver
    [Executor task launch worker for task 642] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 829.0 (TID 
642). 59466 bytes result sent to driver
    [Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 829.0 (TID 
643). 59466 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 829.0 (TID 645) in 12 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 829.0 (TID 642) in 13 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 829.0 (TID 644) in 12 ms on localhost (executor 
driver) (3/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 829.0 (TID 643) in 12 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 829.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 829 (mapToPair at GroupCombineFunctions.java:57) finished in 
0.020 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ShuffleMapStage 832, ResultStage 839, ShuffleMapStage 831)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 831 (MapPartitionsRDD[2815] at mapToPair at 
GroupCombineFunctions.java:57), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_144 stored as values in memory (estimated size 216.3 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_144_piece0 stored as bytes in memory (estimated size 64.1 KB, 
free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_144_piece0 in memory on localhost:44567 (size: 64.1 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 144 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 5 missing tasks from ShuffleMapStage 831 (MapPartitionsRDD[2815] at 
mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 831.0 with 5 
tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 831.0 (TID 646, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 831.0 (TID 647, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 831.0 (TID 648, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8436 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 831.0 (TID 649, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8436 bytes)
    [Executor task launch worker for task 647] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 831.0 (TID 647)
    [Executor task launch worker for task 648] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 831.0 (TID 648)
    [Executor task launch worker for task 649] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 831.0 (TID 649)
    [Executor task launch worker for task 646] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 831.0 (TID 646)
    [Executor task launch worker for task 647] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 648] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 646] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 647] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 646] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 648] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 648] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
    [Executor task launch worker for task 647] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
    [Executor task launch worker for task 646] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
    [Executor task launch worker for task 647] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 648] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 646] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2799_2 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 4 blocks
    [Executor task launch worker for task 649] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2799_1 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2799_0 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 649] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2799_3 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 648] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 831.0 (TID 
648). 59940 bytes result sent to driver
    [Executor task launch worker for task 647] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 831.0 (TID 
647). 59940 bytes result sent to driver
    [Executor task launch worker for task 646] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 831.0 (TID 
646). 59940 bytes result sent to driver
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 4.0 in stage 831.0 (TID 650, localhost, executor driver, 
partition 4, PROCESS_LOCAL, 7968 bytes)
    [Executor task launch worker for task 650] INFO 
org.apache.spark.executor.Executor - Running task 4.0 in stage 831.0 (TID 650)
    [Executor task launch worker for task 649] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 831.0 (TID 
649). 59940 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 831.0 (TID 648) in 15 ms on localhost (executor 
driver) (1/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 831.0 (TID 647) in 15 ms on localhost (executor 
driver) (2/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 831.0 (TID 646) in 15 ms on localhost (executor 
driver) (3/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 831.0 (TID 649) in 15 ms on localhost (executor 
driver) (4/5)
    [Executor task launch worker for task 650] INFO 
org.apache.spark.executor.Executor - Finished task 4.0 in stage 831.0 (TID 
650). 59467 bytes result sent to driver
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 4.0 in stage 831.0 (TID 650) in 13 ms on localhost (executor 
driver) (5/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 831.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 831 (mapToPair at GroupCombineFunctions.java:57) finished in 
0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ShuffleMapStage 832, ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 832 (MapPartitionsRDD[2824] at mapPartitionsToPair 
at SparkGroupAlsoByWindowViaWindowSet.java:564), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_145 stored as values in memory (estimated size 217.5 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_145_piece0 stored as bytes in memory (estimated size 64.1 KB, 
free 13.5 GB)
    [dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_145_piece0 in memory on localhost:44567 (size: 64.1 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 145 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 5 missing tasks from ShuffleMapStage 832 (MapPartitionsRDD[2824] at 
mapPartitionsToPair at SparkGroupAlsoByWindowViaWindowSet.java:564) (first 15 
tasks are for partitions Vector(0, 1, 2, 3, 4))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 832.0 with 5 
tasks
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 832.0 (TID 651, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 832.0 (TID 652, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 832.0 (TID 653, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 7638 bytes)
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 832.0 (TID 654, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 651] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 832.0 (TID 651)
    [Executor task launch worker for task 652] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 832.0 (TID 652)
    [Executor task launch worker for task 653] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 832.0 (TID 653)
    [Executor task launch worker for task 654] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 832.0 (TID 654)
    [Executor task launch worker for task 654] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 651] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 654] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 651] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 652] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 652] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 653] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 653] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 651] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 832.0 (TID 
651). 59896 bytes result sent to driver
    [Executor task launch worker for task 652] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 832.0 (TID 
652). 59853 bytes result sent to driver
    [Executor task launch worker for task 654] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 832.0 (TID 
654). 59896 bytes result sent to driver
    [dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 4.0 in stage 832.0 (TID 655, localhost, executor driver, 
partition 4, PROCESS_LOCAL, 7638 bytes)
    [Executor task launch worker for task 655] INFO 
org.apache.spark.executor.Executor - Running task 4.0 in stage 832.0 (TID 655)
    [Executor task launch worker for task 653] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 832.0 (TID 
653). 59853 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 832.0 (TID 651) in 14 ms on localhost (executor 
driver) (1/5)
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 832.0 (TID 652) in 14 ms on localhost (executor 
driver) (2/5)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 832.0 (TID 654) in 14 ms on localhost (executor 
driver) (3/5)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 832.0 (TID 653) in 15 ms on localhost (executor 
driver) (4/5)
    [Executor task launch worker for task 655] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 655] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 655] INFO 
org.apache.spark.executor.Executor - Finished task 4.0 in stage 832.0 (TID 
655). 59896 bytes result sent to driver
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 4.0 in stage 832.0 (TID 655) in 13 ms on localhost (executor 
driver) (5/5)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 832.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ShuffleMapStage 832 (mapPartitionsToPair at 
SparkGroupAlsoByWindowViaWindowSet.java:564) finished in 0.034 s
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
looking for newly runnable stages
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
running: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
waiting: Set(ResultStage 839)
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
failed: Set()
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ResultStage 839 (MapPartitionsRDD[2844] at map at 
TranslationUtils.java:128), which has no missing parents
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_146 stored as values in memory (estimated size 188.2 KB, free 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_146_piece0 stored as bytes in memory (estimated size 58.1 KB, 
free 13.5 GB)
    [dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_146_piece0 in memory on localhost:44567 (size: 58.1 KB, free: 
13.5 GB)
    [dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 146 from broadcast at DAGScheduler.scala:1039
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ResultStage 839 (MapPartitionsRDD[2844] at map 
at TranslationUtils.java:128) (first 15 tasks are for partitions Vector(0, 1, 
2, 3))
    [dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 839.0 with 4 
tasks
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 839.0 (TID 656, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 839.0 (TID 657, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 839.0 (TID 658, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8132 bytes)
    [dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 839.0 (TID 659, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8132 bytes)
    [Executor task launch worker for task 657] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 839.0 (TID 657)
    [Executor task launch worker for task 656] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 839.0 (TID 656)
    [Executor task launch worker for task 658] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 839.0 (TID 658)
    [Executor task launch worker for task 659] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 839.0 (TID 659)
    [Executor task launch worker for task 657] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 656] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 657] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 656] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 658] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 658] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 1 ms
    [Executor task launch worker for task 656] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
    [Executor task launch worker for task 657] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
    [Executor task launch worker for task 658] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
    [Executor task launch worker for task 659] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty 
blocks out of 5 blocks
    [Executor task launch worker for task 659] INFO 
org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches 
in 0 ms
    [Executor task launch worker for task 659] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
    [Executor task launch worker for task 656] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 657] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [Executor task launch worker for task 658] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2827_1 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 659] INFO 
org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes 
in memory (estimated size 4.0 B, free 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2827_0 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2827_2 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added rdd_2827_3 in memory on localhost:44567 (size: 4.0 B, free: 13.5 GB)
    [Executor task launch worker for task 656] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 839.0 (TID 
656). 59881 bytes result sent to driver
    [Executor task launch worker for task 657] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 839.0 (TID 
657). 59881 bytes result sent to driver
    [Executor task launch worker for task 658] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 839.0 (TID 
658). 59881 bytes result sent to driver
    [Executor task launch worker for task 659] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 839.0 (TID 
659). 59881 bytes result sent to driver
    [task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 0.0 in stage 839.0 (TID 656) in 15 ms on localhost (executor 
driver) (1/4)
    [task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 1.0 in stage 839.0 (TID 657) in 15 ms on localhost (executor 
driver) (2/4)
    [task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 2.0 in stage 839.0 (TID 658) in 15 ms on localhost (executor 
driver) (3/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Finished task 3.0 in stage 839.0 (TID 659) in 15 ms on localhost (executor 
driver) (4/4)
    [task-result-getter-0] INFO org.apache.spark.scheduler.TaskSchedulerImpl - 
Removed TaskSet 839.0, whose tasks have all completed, from pool 
    [dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
ResultStage 839 (foreach at UnboundedDataset.java:79) finished in 0.023 s
    [streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - 
Job 35 finished: foreach at UnboundedDataset.java:79, took 0.121113 s
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Finished job streaming job 1542061732000 ms.3 from job set of time 
1542061732000 ms
    [JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Total delay: 6.420 s for time 1542061732000 ms (execution: 0.565 s)
    [Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - 
Stopped JobScheduler
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@c79524a{/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@3f5303bf{/streaming/batch,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - 
Stopped 
o.s.j.s.ServletContextHandler@2eb7f2ac{/static/streaming,null,UNAVAILABLE,@Spark}
    [Test worker] INFO org.apache.spark.streaming.StreamingContext - 
StreamingContext stopped successfully
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - 
Stopped Spark@7179880a{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at 
http://localhost:4040
    [dispatcher-event-loop-0] INFO 
org.apache.spark.MapOutputTrackerMasterEndpoint - 
MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - 
MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager 
stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - 
BlockManagerMaster stopped
    [dispatcher-event-loop-1] INFO 
org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
 - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped 
SparkContext

Gradle Test Executor 291 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory 
/tmp/spark-16463eff-0a71-4aa1-8204-b689e79e3631

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest 
> testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - 
Stopped Spark@61ffbeff{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at 
http://localhost:4041
    [dispatcher-event-loop-3] INFO 
org.apache.spark.MapOutputTrackerMasterEndpoint - 
MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - 
MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager 
stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - 
BlockManagerMaster stopped
    [dispatcher-event-loop-3] INFO 
org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint
 - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped 
SparkContext

Gradle Test Executor 293 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory 
/tmp/spark-9b90677a-c0f0-4d5b-9f57-360ebfc356f5
Finished generating test XML results (0.13 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/test-results/validatesRunnerStreaming>
Generating HTML test report...
Finished generating test html results (0.109 secs) into: 
<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerStreaming>
Packing task ':beam-runners-spark:validatesRunnerStreaming'
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 
3,5,main]) completed. Took 10 mins 26.931 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerBatch'.
> There were failing tests. See the report at: 
> <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark_Gradle/ws/src/runners/spark/build/reports/tests/validatesRunnerBatch/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 2s
43 actionable tasks: 39 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/s4xckma4gkvws

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
