See
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/872/display/redirect>
------------------------------------------
[...truncated 1.23 MB...]
19/09/07 00:20:01 INFO close: Closing all cached grpc data channels.
19/09/07 00:20:01 INFO close: Closing all cached gRPC state handlers.
19/09/07 00:20:01 INFO run: Done consuming work.
19/09/07 00:20:01 INFO main: Python sdk harness exiting.
19/09/07 00:20:01 INFO GrpcLoggingService: Logging client hanged up.
19/09/07 00:20:01 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:01 INFO Executor: Finished task 0.0 in stage 128.0 (TID 154).
10234 bytes result sent to driver
19/09/07 00:20:01 INFO TaskSetManager: Finished task 0.0 in stage 128.0 (TID
154) in 3552 ms on localhost (executor driver) (1/1)
19/09/07 00:20:01 INFO TaskSchedulerImpl: Removed TaskSet 128.0, whose tasks
have all completed, from pool
19/09/07 00:20:01 INFO DAGScheduler: ShuffleMapStage 128 (repartition at
GroupCombineFunctions.java:191) finished in 3.572 s
19/09/07 00:20:01 INFO DAGScheduler: looking for newly runnable stages
19/09/07 00:20:01 INFO DAGScheduler: running: Set()
19/09/07 00:20:01 INFO DAGScheduler: waiting: Set(ShuffleMapStage 129,
ShuffleMapStage 130, ResultStage 131)
19/09/07 00:20:01 INFO DAGScheduler: failed: Set()
19/09/07 00:20:01 INFO DAGScheduler: Submitting ShuffleMapStage 129
(MapPartitionsRDD[871] at mapToPair at GroupCombineFunctions.java:55), which
has no missing parents
19/09/07 00:20:01 INFO MemoryStore: Block broadcast_127 stored as values in
memory (estimated size 24.8 KB, free 13.5 GB)
19/09/07 00:20:01 INFO MemoryStore: Block broadcast_127_piece0 stored as bytes
in memory (estimated size 10.8 KB, free 13.5 GB)
19/09/07 00:20:01 INFO BlockManagerInfo: Added broadcast_127_piece0 in memory
on localhost:43481 (size: 10.8 KB, free: 13.5 GB)
19/09/07 00:20:01 INFO SparkContext: Created broadcast 127 from broadcast at
DAGScheduler.scala:1161
19/09/07 00:20:01 INFO DAGScheduler: Submitting 1 missing tasks from
ShuffleMapStage 129 (MapPartitionsRDD[871] at mapToPair at
GroupCombineFunctions.java:55) (first 15 tasks are for partitions Vector(0))
19/09/07 00:20:01 INFO TaskSchedulerImpl: Adding task set 129.0 with 1 tasks
19/09/07 00:20:01 INFO TaskSetManager: Starting task 0.0 in stage 129.0 (TID
155, localhost, executor driver, partition 0, NODE_LOCAL, 7927 bytes)
19/09/07 00:20:01 INFO Executor: Running task 0.0 in stage 129.0 (TID 155)
19/09/07 00:20:02 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST
19/09/07 00:20:02 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST -> 0
artifacts
19/09/07 00:20:04 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/07 00:20:04 INFO main: Logging handler created.
19/09/07 00:20:04 INFO main: semi_persistent_directory: /tmp
19/09/07 00:20:04 WARN _load_main_session: No session file found:
/tmp/staged/pickled_main_session. Functions defined in __main__ (interactive
session) may fail.
19/09/07 00:20:04 WARN get_all_options: Discarding unparseable args:
[u'--app_name=test_windowing_1567815593.36_a82e963d-8c14-4319-afce-cc9b23ef0a26',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/07 00:20:04 INFO main: Python sdk harness started with pipeline_options:
{'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'environment_type': u'PROCESS',
'sdk_location': u'container', 'job_name': u'test_windowing_1567815593.36',
'environment_config': u'{"command":
"<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',>
'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint':
u'localhost:34695'}
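For reference, the logged pipeline_options above correspond to a portable-runner invocation along these lines (a minimal sketch: the flag values are taken from the log, while the option construction and pipeline body are illustrative, not the test suite's actual code):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Flag values mirror the logged pipeline_options; the test suite itself
    # builds these programmatically rather than from the command line.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:34695',
        '--environment_type=PROCESS',
        # environment_config tells the runner how to launch the SDK harness;
        # the command path is elided here (see the log line above).
        '--environment_config={"command": ".../sdk_worker.sh"}',
        '--experiments=beam_fn_api',
    ])
    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3])  # stand-in for the actual test pipeline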
19/09/07 00:20:04 INFO __init__: Creating insecure control channel for
localhost:46681.
19/09/07 00:20:04 INFO start: Status HTTP server running at localhost:35899
19/09/07 00:20:04 INFO __init__: Control channel established.
19/09/07 00:20:04 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/07 00:20:04 INFO FnApiControlClientPoolService: Beam Fn Control client
connected with id 158-1
19/09/07 00:20:04 INFO create_state_handler: Creating insecure state channel
for localhost:41193.
19/09/07 00:20:04 INFO create_state_handler: State channel established.
19/09/07 00:20:04 INFO create_data_channel: Creating client data channel for
localhost:36411
19/09/07 00:20:04 INFO GrpcDataService: Beam Fn Data client connected.
19/09/07 00:20:04 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks
including 1 local blocks and 0 remote blocks
19/09/07 00:20:04 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in
1 ms
19/09/07 00:20:05 INFO DefaultJobBundleFactory: Closing environment urn:
"beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">
19/09/07 00:20:05 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:05 INFO run: No more requests from control plane
19/09/07 00:20:05 INFO run: SDK Harness waiting for in-flight requests to
complete
19/09/07 00:20:05 INFO close: Closing all cached grpc data channels.
19/09/07 00:20:05 INFO close: Closing all cached gRPC state handlers.
19/09/07 00:20:05 INFO run: Done consuming work.
19/09/07 00:20:05 INFO main: Python sdk harness exiting.
19/09/07 00:20:05 INFO GrpcLoggingService: Logging client hanged up.
19/09/07 00:20:05 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:05 INFO Executor: Finished task 0.0 in stage 129.0 (TID 155).
12763 bytes result sent to driver
19/09/07 00:20:05 INFO TaskSetManager: Finished task 0.0 in stage 129.0 (TID
155) in 3215 ms on localhost (executor driver) (1/1)
19/09/07 00:20:05 INFO TaskSchedulerImpl: Removed TaskSet 129.0, whose tasks
have all completed, from pool
19/09/07 00:20:05 INFO DAGScheduler: ShuffleMapStage 129 (mapToPair at
GroupCombineFunctions.java:55) finished in 3.235 s
19/09/07 00:20:05 INFO DAGScheduler: looking for newly runnable stages
19/09/07 00:20:05 INFO DAGScheduler: running: Set()
19/09/07 00:20:05 INFO DAGScheduler: waiting: Set(ShuffleMapStage 130,
ResultStage 131)
19/09/07 00:20:05 INFO DAGScheduler: failed: Set()
19/09/07 00:20:05 INFO DAGScheduler: Submitting ShuffleMapStage 130
(MapPartitionsRDD[882] at flatMapToPair at
GroupNonMergingWindowsFunctions.java:115), which has no missing parents
19/09/07 00:20:05 INFO MemoryStore: Block broadcast_128 stored as values in
memory (estimated size 57.6 KB, free 13.5 GB)
19/09/07 00:20:05 INFO MemoryStore: Block broadcast_128_piece0 stored as bytes
in memory (estimated size 22.1 KB, free 13.5 GB)
19/09/07 00:20:05 INFO BlockManagerInfo: Added broadcast_128_piece0 in memory
on localhost:43481 (size: 22.1 KB, free: 13.5 GB)
19/09/07 00:20:05 INFO SparkContext: Created broadcast 128 from broadcast at
DAGScheduler.scala:1161
19/09/07 00:20:05 INFO DAGScheduler: Submitting 2 missing tasks from
ShuffleMapStage 130 (MapPartitionsRDD[882] at flatMapToPair at
GroupNonMergingWindowsFunctions.java:115) (first 15 tasks are for partitions
Vector(0, 1))
19/09/07 00:20:05 INFO TaskSchedulerImpl: Adding task set 130.0 with 2 tasks
19/09/07 00:20:05 INFO TaskSetManager: Starting task 0.0 in stage 130.0 (TID
156, localhost, executor driver, partition 0, NODE_LOCAL, 7760 bytes)
19/09/07 00:20:05 INFO Executor: Running task 0.0 in stage 130.0 (TID 156)
19/09/07 00:20:05 INFO ShuffleBlockFetcherIterator: Getting 1 non-empty blocks
including 1 local blocks and 0 remote blocks
19/09/07 00:20:05 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in
1 ms
19/09/07 00:20:05 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST
19/09/07 00:20:05 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST -> 0
artifacts
19/09/07 00:20:08 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/07 00:20:08 INFO main: Logging handler created.
19/09/07 00:20:08 INFO main: semi_persistent_directory: /tmp
19/09/07 00:20:08 WARN _load_main_session: No session file found:
/tmp/staged/pickled_main_session. Functions defined in __main__ (interactive
session) may fail.
19/09/07 00:20:08 WARN get_all_options: Discarding unparseable args:
[u'--app_name=test_windowing_1567815593.36_a82e963d-8c14-4319-afce-cc9b23ef0a26',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/07 00:20:08 INFO start: Status HTTP server running at localhost:39911
19/09/07 00:20:08 INFO main: Python sdk harness started with pipeline_options:
{'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'environment_type': u'PROCESS',
'sdk_location': u'container', 'job_name': u'test_windowing_1567815593.36',
'environment_config': u'{"command":
"<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',>
'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint':
u'localhost:34695'}
19/09/07 00:20:08 INFO __init__: Creating insecure control channel for
localhost:45467.
19/09/07 00:20:08 INFO __init__: Control channel established.
19/09/07 00:20:08 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/07 00:20:08 INFO FnApiControlClientPoolService: Beam Fn Control client
connected with id 159-1
19/09/07 00:20:08 INFO create_state_handler: Creating insecure state channel
for localhost:39293.
19/09/07 00:20:08 INFO create_state_handler: State channel established.
19/09/07 00:20:08 INFO create_data_channel: Creating client data channel for
localhost:36383
19/09/07 00:20:08 INFO GrpcDataService: Beam Fn Data client connected.
19/09/07 00:20:08 INFO DefaultJobBundleFactory: Closing environment urn:
"beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">
19/09/07 00:20:08 INFO run: No more requests from control plane
19/09/07 00:20:08 INFO run: SDK Harness waiting for in-flight requests to
complete
19/09/07 00:20:08 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:08 INFO close: Closing all cached grpc data channels.
19/09/07 00:20:08 INFO close: Closing all cached gRPC state handlers.
19/09/07 00:20:08 INFO run: Done consuming work.
19/09/07 00:20:08 INFO main: Python sdk harness exiting.
19/09/07 00:20:08 INFO GrpcLoggingService: Logging client hanged up.
19/09/07 00:20:08 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:08 INFO Executor: Finished task 0.0 in stage 130.0 (TID 156).
15229 bytes result sent to driver
19/09/07 00:20:08 INFO TaskSetManager: Starting task 1.0 in stage 130.0 (TID
157, localhost, executor driver, partition 1, PROCESS_LOCAL, 7977 bytes)
19/09/07 00:20:08 INFO TaskSetManager: Finished task 0.0 in stage 130.0 (TID
156) in 3675 ms on localhost (executor driver) (1/2)
19/09/07 00:20:08 INFO Executor: Running task 1.0 in stage 130.0 (TID 157)
19/09/07 00:20:09 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST
19/09/07 00:20:09 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST -> 0
artifacts
19/09/07 00:20:11 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/07 00:20:11 INFO main: Logging handler created.
19/09/07 00:20:11 INFO main: semi_persistent_directory: /tmp
19/09/07 00:20:11 WARN _load_main_session: No session file found:
/tmp/staged/pickled_main_session. Functions defined in __main__ (interactive
session) may fail.
19/09/07 00:20:11 INFO start: Status HTTP server running at localhost:45685
19/09/07 00:20:11 WARN get_all_options: Discarding unparseable args:
[u'--app_name=test_windowing_1567815593.36_a82e963d-8c14-4319-afce-cc9b23ef0a26',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/07 00:20:11 INFO main: Python sdk harness started with pipeline_options:
{'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'environment_type': u'PROCESS',
'sdk_location': u'container', 'job_name': u'test_windowing_1567815593.36',
'environment_config': u'{"command":
"<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',>
'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint':
u'localhost:34695'}
19/09/07 00:20:11 INFO __init__: Creating insecure control channel for
localhost:44083.
19/09/07 00:20:11 INFO __init__: Control channel established.
19/09/07 00:20:11 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/07 00:20:11 INFO FnApiControlClientPoolService: Beam Fn Control client
connected with id 160-1
19/09/07 00:20:11 INFO create_state_handler: Creating insecure state channel
for localhost:40915.
19/09/07 00:20:11 INFO create_state_handler: State channel established.
19/09/07 00:20:11 INFO create_data_channel: Creating client data channel for
localhost:38211
19/09/07 00:20:11 INFO GrpcDataService: Beam Fn Data client connected.
19/09/07 00:20:11 INFO DefaultJobBundleFactory: Closing environment urn:
"beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">
19/09/07 00:20:11 INFO run: No more requests from control plane
19/09/07 00:20:11 INFO run: SDK Harness waiting for in-flight requests to
complete
19/09/07 00:20:11 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:11 INFO close: Closing all cached grpc data channels.
19/09/07 00:20:11 INFO close: Closing all cached gRPC state handlers.
19/09/07 00:20:11 INFO run: Done consuming work.
19/09/07 00:20:11 INFO main: Python sdk harness exiting.
19/09/07 00:20:11 INFO GrpcLoggingService: Logging client hanged up.
19/09/07 00:20:11 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:12 INFO Executor: Finished task 1.0 in stage 130.0 (TID 157).
13710 bytes result sent to driver
19/09/07 00:20:12 INFO TaskSetManager: Finished task 1.0 in stage 130.0 (TID
157) in 3125 ms on localhost (executor driver) (2/2)
19/09/07 00:20:12 INFO TaskSchedulerImpl: Removed TaskSet 130.0, whose tasks
have all completed, from pool
19/09/07 00:20:12 INFO DAGScheduler: ShuffleMapStage 130 (flatMapToPair at
GroupNonMergingWindowsFunctions.java:115) finished in 6.825 s
19/09/07 00:20:12 INFO DAGScheduler: looking for newly runnable stages
19/09/07 00:20:12 INFO DAGScheduler: running: Set()
19/09/07 00:20:12 INFO DAGScheduler: waiting: Set(ResultStage 131)
19/09/07 00:20:12 INFO DAGScheduler: failed: Set()
19/09/07 00:20:12 INFO DAGScheduler: Submitting ResultStage 131
(EmptyOutputSink_0 MapPartitionsRDD[887] at flatMap at
SparkBatchPortablePipelineTranslator.java:309), which has no missing parents
19/09/07 00:20:12 INFO MemoryStore: Block broadcast_129 stored as values in
memory (estimated size 26.4 KB, free 13.5 GB)
19/09/07 00:20:12 INFO MemoryStore: Block broadcast_129_piece0 stored as bytes
in memory (estimated size 12.4 KB, free 13.5 GB)
19/09/07 00:20:12 INFO BlockManagerInfo: Added broadcast_129_piece0 in memory
on localhost:43481 (size: 12.4 KB, free: 13.5 GB)
19/09/07 00:20:12 INFO SparkContext: Created broadcast 129 from broadcast at
DAGScheduler.scala:1161
19/09/07 00:20:12 INFO DAGScheduler: Submitting 1 missing tasks from
ResultStage 131 (EmptyOutputSink_0 MapPartitionsRDD[887] at flatMap at
SparkBatchPortablePipelineTranslator.java:309) (first 15 tasks are for
partitions Vector(0))
19/09/07 00:20:12 INFO TaskSchedulerImpl: Adding task set 131.0 with 1 tasks
19/09/07 00:20:12 INFO TaskSetManager: Starting task 0.0 in stage 131.0 (TID
158, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/09/07 00:20:12 INFO Executor: Running task 0.0 in stage 131.0 (TID 158)
19/09/07 00:20:12 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks
including 2 local blocks and 0 remote blocks
19/09/07 00:20:12 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in
0 ms
19/09/07 00:20:12 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST
19/09/07 00:20:12 INFO BeamFileSystemArtifactRetrievalService: GetManifest for
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST -> 0
artifacts
19/09/07 00:20:14 INFO GrpcLoggingService: Beam Fn Logging client connected.
19/09/07 00:20:14 INFO main: Logging handler created.
19/09/07 00:20:14 INFO main: semi_persistent_directory: /tmp
19/09/07 00:20:14 WARN _load_main_session: No session file found:
/tmp/staged/pickled_main_session. Functions defined in __main__ (interactive
session) may fail.
19/09/07 00:20:14 WARN get_all_options: Discarding unparseable args:
[u'--app_name=test_windowing_1567815593.36_a82e963d-8c14-4319-afce-cc9b23ef0a26',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks', u'--pipeline_type_check']
19/09/07 00:20:14 INFO main: Python sdk harness started with pipeline_options:
{'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'environment_type': u'PROCESS',
'sdk_location': u'container', 'job_name': u'test_windowing_1567815593.36',
'environment_config': u'{"command":
"<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',>
'region': u'us-central1', 'sdk_worker_parallelism': u'0', 'job_endpoint':
u'localhost:34695'}
19/09/07 00:20:14 INFO __init__: Creating insecure control channel for
localhost:38407.
19/09/07 00:20:14 INFO start: Status HTTP server running at localhost:44673
19/09/07 00:20:14 INFO __init__: Control channel established.
19/09/07 00:20:14 INFO __init__: Initializing SDKHarness with 12 workers.
19/09/07 00:20:14 INFO FnApiControlClientPoolService: Beam Fn Control client
connected with id 161-1
19/09/07 00:20:14 INFO create_state_handler: Creating insecure state channel
for localhost:41829.
19/09/07 00:20:14 INFO create_state_handler: State channel established.
19/09/07 00:20:14 INFO create_data_channel: Creating client data channel for
localhost:41447
19/09/07 00:20:14 INFO GrpcDataService: Beam Fn Data client connected.
19/09/07 00:20:15 INFO DefaultJobBundleFactory: Closing environment urn:
"beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">
19/09/07 00:20:15 INFO run: No more requests from control plane
19/09/07 00:20:15 INFO run: SDK Harness waiting for in-flight requests to
complete
19/09/07 00:20:15 INFO close: Closing all cached grpc data channels.
19/09/07 00:20:15 INFO close: Closing all cached gRPC state handlers.
19/09/07 00:20:15 INFO run: Done consuming work.
19/09/07 00:20:15 INFO main: Python sdk harness exiting.
19/09/07 00:20:15 INFO GrpcLoggingService: Logging client hanged up.
19/09/07 00:20:15 WARN BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint.
19/09/07 00:20:15 INFO Executor: Finished task 0.0 in stage 131.0 (TID 158).
11970 bytes result sent to driver
19/09/07 00:20:15 INFO TaskSetManager: Finished task 0.0 in stage 131.0 (TID
158) in 3115 ms on localhost (executor driver) (1/1)
19/09/07 00:20:15 INFO TaskSchedulerImpl: Removed TaskSet 131.0, whose tasks
have all completed, from pool
19/09/07 00:20:15 INFO DAGScheduler: ResultStage 131 (foreach at
BoundedDataset.java:124) finished in 3.142 s
19/09/07 00:20:15 INFO DAGScheduler: Job 47 finished: foreach at
BoundedDataset.java:124, took 16.780539 s
19/09/07 00:20:15 INFO SparkPipelineRunner: Job
test_windowing_1567815593.36_a82e963d-8c14-4319-afce-cc9b23ef0a26 finished.
19/09/07 00:20:15 WARN SparkPipelineResult$BatchMode: Collecting monitoring
infos is not implemented yet in Spark portable runner.
19/09/07 00:20:15 INFO BeamFileSystemArtifactRetrievalService: Manifest at
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/MANIFEST has 0
artifact locations
19/09/07 00:20:15 INFO BeamFileSystemArtifactStagingService: Removed dir
/tmp/sparktestK77jg1/job_479e9a95-774b-4828-96ac-71854babfab9/
INFO:root:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_side_and_main_outputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 177, in test_pardo_side_and_main_outputs
    assert_that(unnamed.odd, equal_to([1, 3]), label='unnamed.odd')
  File "apache_beam/pipeline.py", line 427, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 434, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 367, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 349, in _next
    self._state.condition.wait()
  File "/usr/lib/python2.7/threading.py", line 340, in wait
    waiter.acquire()
  File "apache_beam/runners/portability/portable_runner_test.py", line 71, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================
----------------------------------------------------------------------
Ran 38 tests in 656.669s
FAILED (errors=1, skipped=10)
# Thread: <_MainThread(MainThread, started 139896596600576)>
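The assertion at the top of the traceback checks the tagged 'odd' output of a multi-output ParDo. A condensed sketch of the pattern test_pardo_side_and_main_outputs exercises (the DoFn and output names here are illustrative, not the exact test body):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    class SplitOddEven(beam.DoFn):
        # Illustrative DoFn: evens flow to the main output, odds to a tag.
        def process(self, element):
            if element % 2:
                yield beam.pvalue.TaggedOutput('odd', element)
            else:
                yield element

    with beam.Pipeline() as p:
        results = (p
                   | beam.Create([1, 2, 3, 4])
                   | beam.ParDo(SplitOddEven()).with_outputs('odd', main='even'))
        assert_that(results.odd, equal_to([1, 3]), label='unnamed.odd')

Note that the traceback does not show the assertion itself failing: the pipeline hung in wait_until_finish while streaming job state, and the 60-second watchdog handler in portable_runner_test.py raised BaseException to abort the run.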
> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED
FAILURE: Build failed with an exception.
* Where:
Build file
'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'
line: 168
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 15m 53s
60 actionable tasks: 47 executed, 13 from cache
Publishing build scan...
https://gradle.com/s/tkumvdvqb3jeg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure