See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/961/display/redirect>
------------------------------------------
[...truncated 1.24 MB...]
19/09/16 12:17:20 INFO TaskSetManager: Finished task 0.0 in stage 128.0 (TID 154) in 3493 ms on localhost (executor driver) (1/1)
19/09/16 12:17:20 INFO TaskSchedulerImpl: Removed TaskSet 128.0, whose tasks have all completed, from pool
19/09/16 12:17:20 INFO DAGScheduler: ShuffleMapStage 128 (repartition at GroupCombineFunctions.java:191) finished in 3.501 s
19/09/16 12:17:20 INFO DAGScheduler: looking for newly runnable stages
19/09/16 12:17:20 INFO DAGScheduler: running: Set()
19/09/16 12:17:20 INFO DAGScheduler: waiting: Set(ShuffleMapStage 129, ShuffleMapStage 130, ResultStage 131)
19/09/16 12:17:20 INFO DAGScheduler: failed: Set()
19/09/16 12:17:20 INFO DAGScheduler: Submitting ShuffleMapStage 129 (MapPartitionsRDD[871] at mapToPair at GroupCombineFunctions.java:55), which has no missing parents
19/09/16 12:17:20 INFO MemoryStore: Block broadcast_127 stored as values in memory (estimated size 24.6 KB, free 13.5 GB)
19/09/16 12:17:20 INFO MemoryStore: Block broadcast_127_piece0 stored as bytes in memory (estimated size 10.7 KB, free 13.5 GB)
19/09/16 12:17:20 INFO BlockManagerInfo: Added broadcast_127_piece0 in memory on localhost:41091 (size: 10.7 KB, free: 13.5 GB)
19/09/16 12:17:20 INFO SparkContext: Created broadcast 127 from broadcast at DAGScheduler.scala:1161
19/09/16 12:17:20 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 129 (MapPartitionsRDD[871] at mapToPair at GroupCombineFunctions.java:55) (first 15 tasks are for partitions Vector(0))
19/09/16 12:17:20 INFO TaskSchedulerImpl: Adding task set 129.0 with 1 tasks
19/09/16 12:17:20 INFO TaskSetManager: Starting task 0.0 in stage 129.0 (TID 155, localhost, executor driver, partition 0, NODE_LOCAL, 7927 bytes)
19/09/16 12:17:20 INFO Executor: Running task 0.0 in stage 129.0 (TID 155)
19/09/16 12:17:20 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST
19/09/16 12:17:20 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST -> 0 artifacts
19/09/16 12:17:23 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/16 12:17:23 INFO main: Logging handler created.
19/09/16 12:17:23 INFO main: semi_persistent_directory: /tmp
19/09/16 12:17:23 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/09/16 12:17:23 WARN get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1568636233.77_8985134c-6ece-4723-8d4d-af93e9d0d55a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/16 12:17:23 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1568636233.77', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint': u'localhost:47445'}
19/09/16 12:17:23 INFO __init__: Creating insecure control channel for localhost:44135.
19/09/16 12:17:23 INFO start: Status HTTP server running at localhost:38313
19/09/16 12:17:23 INFO __init__: Control channel established.
19/09/16 12:17:23 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/16 12:17:23 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 158-1
19/09/16 12:17:23 INFO create_state_handler: Creating insecure state channel for localhost:34023.
19/09/16 12:17:23 INFO create_state_handler: State channel established.
19/09/16 12:17:23 INFO create_data_channel: Creating client data channel for localhost:38481
19/09/16 12:17:23 INFO GrpcDataService: Beam Fn Data client connected.
19/09/16 12:17:23 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/09/16 12:17:23 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/09/16 12:17:24 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/09/16 12:17:24 INFO run: No more requests from control plane
19/09/16 12:17:24 INFO run: SDK Harness waiting for in-flight requests to complete
19/09/16 12:17:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:24 INFO close: Closing all cached grpc data channels.
19/09/16 12:17:24 INFO close: Closing all cached gRPC state handlers.
19/09/16 12:17:24 INFO run: Done consuming work.
19/09/16 12:17:24 INFO main: Python sdk harness exiting.
19/09/16 12:17:24 INFO GrpcLoggingService: Logging client hanged up.
19/09/16 12:17:24 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:25 INFO Executor: Finished task 0.0 in stage 129.0 (TID 155). 12763 bytes result sent to driver
19/09/16 12:17:25 INFO TaskSetManager: Finished task 0.0 in stage 129.0 (TID 155) in 4774 ms on localhost (executor driver) (1/1)
19/09/16 12:17:25 INFO TaskSchedulerImpl: Removed TaskSet 129.0, whose tasks have all completed, from pool
19/09/16 12:17:25 INFO DAGScheduler: ShuffleMapStage 129 (mapToPair at GroupCombineFunctions.java:55) finished in 4.798 s
19/09/16 12:17:25 INFO DAGScheduler: looking for newly runnable stages
19/09/16 12:17:25 INFO DAGScheduler: running: Set()
19/09/16 12:17:25 INFO DAGScheduler: waiting: Set(ShuffleMapStage 130, ResultStage 131)
19/09/16 12:17:25 INFO DAGScheduler: failed: Set()
19/09/16 12:17:25 INFO DAGScheduler: Submitting ShuffleMapStage 130 (MapPartitionsRDD[882] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/09/16 12:17:25 INFO MemoryStore: Block broadcast_128 stored as values in memory (estimated size 57.2 KB, free 13.5 GB)
19/09/16 12:17:25 INFO MemoryStore: Block broadcast_128_piece0 stored as bytes in memory (estimated size 22.9 KB, free 13.5 GB)
19/09/16 12:17:25 INFO BlockManagerInfo: Added broadcast_128_piece0 in memory on localhost:41091 (size: 22.9 KB, free: 13.5 GB)
19/09/16 12:17:25 INFO SparkContext: Created broadcast 128 from broadcast at DAGScheduler.scala:1161
19/09/16 12:17:25 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 130 (MapPartitionsRDD[882] at flatMapToPair at GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions Vector(0, 1))
19/09/16 12:17:25 INFO TaskSchedulerImpl: Adding task set 130.0 with 2 tasks
19/09/16 12:17:25 INFO TaskSetManager: Starting task 1.0 in stage 130.0 (TID 156, localhost, executor driver, partition 1, NODE_LOCAL, 7760 bytes)
19/09/16 12:17:25 INFO Executor: Running task 1.0 in stage 130.0 (TID 156)
19/09/16 12:17:25 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
19/09/16 12:17:25 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/09/16 12:17:25 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST
19/09/16 12:17:25 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST -> 0 artifacts
19/09/16 12:17:30 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/16 12:17:30 INFO main: Logging handler created.
19/09/16 12:17:30 INFO main: semi_persistent_directory: /tmp
19/09/16 12:17:30 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/09/16 12:17:30 WARN get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1568636233.77_8985134c-6ece-4723-8d4d-af93e9d0d55a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/16 12:17:30 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1568636233.77', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint': u'localhost:47445'}
19/09/16 12:17:30 INFO __init__: Creating insecure control channel for localhost:45331.
19/09/16 12:17:30 INFO start: Status HTTP server running at localhost:36679
19/09/16 12:17:30 INFO __init__: Control channel established.
19/09/16 12:17:30 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/16 12:17:30 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 159-1
19/09/16 12:17:30 INFO create_state_handler: Creating insecure state channel for localhost:38245.
19/09/16 12:17:30 INFO create_state_handler: State channel established.
19/09/16 12:17:30 INFO create_data_channel: Creating client data channel for localhost:35215
19/09/16 12:17:30 INFO GrpcDataService: Beam Fn Data client connected.
19/09/16 12:17:30 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/09/16 12:17:30 INFO run: No more requests from control plane
19/09/16 12:17:30 INFO run: SDK Harness waiting for in-flight requests to complete
19/09/16 12:17:30 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:30 INFO close: Closing all cached grpc data channels.
19/09/16 12:17:30 INFO close: Closing all cached gRPC state handlers.
19/09/16 12:17:30 INFO run: Done consuming work.
19/09/16 12:17:30 INFO main: Python sdk harness exiting.
19/09/16 12:17:30 INFO GrpcLoggingService: Logging client hanged up.
19/09/16 12:17:31 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:31 INFO Executor: Finished task 1.0 in stage 130.0 (TID 156). 15229 bytes result sent to driver
19/09/16 12:17:31 INFO TaskSetManager: Starting task 0.0 in stage 130.0 (TID 157, localhost, executor driver, partition 0, PROCESS_LOCAL, 7977 bytes)
19/09/16 12:17:31 INFO Executor: Running task 0.0 in stage 130.0 (TID 157)
19/09/16 12:17:31 INFO TaskSetManager: Finished task 1.0 in stage 130.0 (TID 156) in 6323 ms on localhost (executor driver) (1/2)
19/09/16 12:17:31 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST
19/09/16 12:17:31 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST -> 0 artifacts
19/09/16 12:17:35 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/16 12:17:35 INFO main: Logging handler created.
19/09/16 12:17:35 INFO main: semi_persistent_directory: /tmp
19/09/16 12:17:35 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/09/16 12:17:35 WARN get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1568636233.77_8985134c-6ece-4723-8d4d-af93e9d0d55a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/16 12:17:35 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1568636233.77', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint': u'localhost:47445'}
19/09/16 12:17:35 INFO start: Status HTTP server running at localhost:45819
19/09/16 12:17:35 INFO __init__: Creating insecure control channel for localhost:37659.
19/09/16 12:17:35 INFO __init__: Control channel established.
19/09/16 12:17:35 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/16 12:17:35 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 160-1
19/09/16 12:17:35 INFO create_state_handler: Creating insecure state channel for localhost:46507.
19/09/16 12:17:35 INFO create_state_handler: State channel established.
19/09/16 12:17:35 INFO create_data_channel: Creating client data channel for localhost:44353
19/09/16 12:17:35 INFO GrpcDataService: Beam Fn Data client connected.
19/09/16 12:17:35 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/09/16 12:17:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:35 INFO run: No more requests from control plane
19/09/16 12:17:35 INFO run: SDK Harness waiting for in-flight requests to complete
19/09/16 12:17:35 INFO close: Closing all cached grpc data channels.
19/09/16 12:17:35 INFO close: Closing all cached gRPC state handlers.
19/09/16 12:17:35 INFO run: Done consuming work.
19/09/16 12:17:35 INFO main: Python sdk harness exiting.
19/09/16 12:17:35 INFO GrpcLoggingService: Logging client hanged up.
19/09/16 12:17:35 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:35 INFO Executor: Finished task 0.0 in stage 130.0 (TID 157). 13753 bytes result sent to driver
19/09/16 12:17:35 INFO TaskSetManager: Finished task 0.0 in stage 130.0 (TID 157) in 4612 ms on localhost (executor driver) (2/2)
19/09/16 12:17:35 INFO TaskSchedulerImpl: Removed TaskSet 130.0, whose tasks have all completed, from pool
19/09/16 12:17:35 INFO DAGScheduler: ShuffleMapStage 130 (flatMapToPair at GroupNonMergingWindowsFunctions.java:115) finished in 10.943 s
19/09/16 12:17:35 INFO DAGScheduler: looking for newly runnable stages
19/09/16 12:17:35 INFO DAGScheduler: running: Set()
19/09/16 12:17:35 INFO DAGScheduler: waiting: Set(ResultStage 131)
19/09/16 12:17:35 INFO DAGScheduler: failed: Set()
19/09/16 12:17:35 INFO DAGScheduler: Submitting ResultStage 131 (EmptyOutputSink_0 MapPartitionsRDD[887] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/09/16 12:17:35 INFO MemoryStore: Block broadcast_129 stored as values in memory (estimated size 26.1 KB, free 13.5 GB)
19/09/16 12:17:35 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/09/16 12:17:35 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory on localhost:41091 (size: 12.4 KB, free: 13.5 GB)
19/09/16 12:17:35 INFO SparkContext: Created broadcast 129 from broadcast at DAGScheduler.scala:1161
19/09/16 12:17:35 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 131 (EmptyOutputSink_0 MapPartitionsRDD[887] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0))
19/09/16 12:17:35 INFO TaskSchedulerImpl: Adding task set 131.0 with 1 tasks
19/09/16 12:17:35 INFO TaskSetManager: Starting task 0.0 in stage 131.0 (TID 158, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/09/16 12:17:35 INFO Executor: Running task 0.0 in stage 131.0 (TID 158)
19/09/16 12:17:36 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
19/09/16 12:17:36 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/09/16 12:17:36 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST
19/09/16 12:17:36 INFO BeamFileSystemArtifactRetrievalService: GetManifest for /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST -> 0 artifacts
19/09/16 12:17:38 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/16 12:17:38 INFO main: Logging handler created.
19/09/16 12:17:38 INFO main: semi_persistent_directory: /tmp
19/09/16 12:17:38 WARN _load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/09/16 12:17:38 WARN get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1568636233.77_8985134c-6ece-4723-8d4d-af93e9d0d55a', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/16 12:17:38 INFO main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1568636233.77', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint': u'localhost:47445'}
19/09/16 12:17:38 INFO __init__: Creating insecure control channel for localhost:41127.
19/09/16 12:17:38 INFO start: Status HTTP server running at localhost:37601
19/09/16 12:17:38 INFO __init__: Control channel established.
19/09/16 12:17:38 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/16 12:17:38 INFO FnApiControlClientPoolService: Beam Fn Control client connected with id 161-1
19/09/16 12:17:38 INFO create_state_handler: Creating insecure state channel for localhost:35451.
19/09/16 12:17:38 INFO create_state_handler: State channel established.
19/09/16 12:17:39 INFO create_data_channel: Creating client data channel for localhost:45059
19/09/16 12:17:39 INFO GrpcDataService: Beam Fn Data client connected.
19/09/16 12:17:39 INFO DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/09/16 12:17:39 INFO run: No more requests from control plane
19/09/16 12:17:39 INFO run: SDK Harness waiting for in-flight requests to complete
19/09/16 12:17:39 INFO close: Closing all cached grpc data channels.
19/09/16 12:17:39 INFO close: Closing all cached gRPC state handlers.
19/09/16 12:17:39 INFO run: Done consuming work.
19/09/16 12:17:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:39 INFO main: Python sdk harness exiting.
19/09/16 12:17:39 INFO GrpcLoggingService: Logging client hanged up.
19/09/16 12:17:39 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/09/16 12:17:39 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158). 11970 bytes result sent to driver
19/09/16 12:17:39 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID 158) in 3934 ms on localhost (executor driver) (1/1)
19/09/16 12:17:39 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks have all completed, from pool
19/09/16 12:17:39 INFO DAGScheduler: ResultStage 131 (foreach at BoundedDataset.java:124) finished in 3.962 s
19/09/16 12:17:39 INFO DAGScheduler: Job 47 finished: foreach at BoundedDataset.java:124, took 23.215366 s
19/09/16 12:17:39 INFO SparkPipelineRunner: Job test_windowing_1568636233.77_8985134c-6ece-4723-8d4d-af93e9d0d55a finished.
19/09/16 12:17:39 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/09/16 12:17:39 INFO BeamFileSystemArtifactRetrievalService: Manifest at /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/MANIFEST has 0 artifact locations
19/09/16 12:17:39 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestEpGCVv/job_fe4af1a1-ca76-4dba-8f6d-9f6bd9f55308/
INFO:root:Job state changed to DONE
.==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140312453064448)>

# Thread: <Thread(Thread-39, started daemon 140312125331200)>

# Thread: <_MainThread(MainThread, started 140312921802496)>

======================================================================
ERROR: test_flattened_side_input (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/portability/spark_runner_test.py", line 128, in test_flattened_side_input
    with_transcoding=False)
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 236, in test_flattened_side_input
    label='CheckFlattenOfSideInput')
  File "apache_beam/pipeline.py", line 427, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 429, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 367, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 349, in _next
    self._state.condition.wait()
  File "/usr/lib/python2.7/threading.py", line 340, in wait
    waiter.acquire()
  File "apache_beam/runners/portability/portable_runner_test.py", line 71, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

----------------------------------------------------------------------
Ran 38 tests in 615.101s

FAILED (errors=1, skipped=10)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 186

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 34s

60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/noqc4dnb75wvg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
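[Editor's note on the failure mode: the `handler` frame in the traceback above shows the test harness interrupting a blocked gRPC state-stream wait by raising `BaseException('Timed out after 60 seconds.')` from a timeout handler. A minimal sketch of that signal-based timeout pattern follows; the function name `run_with_timeout` and the hard-coded 60-second value are illustrative assumptions, not the actual `portable_runner_test.py` code.]

    # Sketch of a SIGALRM-based test timeout (Unix only), assuming a 60s deadline.
    import signal

    TIMEOUT_SECS = 60  # assumed; matches the "Timed out after 60 seconds" message

    def run_with_timeout(fn):  # hypothetical helper, for illustration
        def handler(signum, frame):
            msg = 'Timed out after %s seconds.' % TIMEOUT_SECS
            # BaseException rather than Exception, so a broad `except Exception`
            # inside the test body cannot swallow the timeout.
            raise BaseException(msg)

        signal.signal(signal.SIGALRM, handler)  # install the alarm handler
        signal.alarm(TIMEOUT_SECS)              # deliver SIGALRM after the deadline
        try:
            return fn()
        finally:
            signal.alarm(0)                     # cancel any pending alarm

Because SIGALRM interrupts the main thread wherever it happens to be blocked, the exception here surfaces from deep inside `threading.py` / `grpc/_channel.py` while `wait_until_finish` is waiting on the job's state stream, which is exactly the shape of the traceback above.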