See
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/3069/display/redirect?page=changes>
Changes:
[github] Fix invalid formatting specifier in verify
[iemejia] [BEAM-9820] Upgrade Flink 1.9.x to 1.9.3
------------------------------------------
[...truncated 569.77 KB...]
  File "apache_beam/runners/common.py", line 818, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "apache_beam/runners/sdf_utils.py", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "apache_beam/io/restriction_trackers.py", line 101, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 6) has not been claimed.
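The ValueError above is the splittable-DoFn contract being enforced: an OffsetRestrictionTracker only passes check_done() once every position in its half-open range has been claimed via try_claim(). The sketch below is a simplified, self-contained re-implementation of that contract for illustration; it is not Beam's actual class, and the message format only approximates the real one:

```python
class OffsetRange(object):
    """Half-open range [start, stop), as used by Beam's restriction trackers."""
    def __init__(self, start, stop):
        self.start = start
        self.stop = stop


class SimpleOffsetTracker(object):
    """Simplified sketch of the OffsetRestrictionTracker contract."""
    def __init__(self, offset_range):
        self._range = offset_range
        self._last_claim_attempt = None

    def try_claim(self, position):
        # A claim at or past stop fails, signaling the DoFn to stop processing.
        self._last_claim_attempt = position
        return position < self._range.stop

    def check_done(self):
        # Done only if work up to the end of the range was claimed.
        if (self._last_claim_attempt is None
                or self._last_claim_attempt < self._range.stop - 1):
            start = (self._last_claim_attempt + 1
                     if self._last_claim_attempt is not None
                     else self._range.start)
            raise ValueError(
                'OffsetRestrictionTracker is not done since work in range '
                '[%d, %d) has not been claimed.' % (start, self._range.stop))


tracker = SimpleOffsetTracker(OffsetRange(0, 6))
try:
    # Claiming nothing, as in the failing test, leaves [0, 6) unclaimed:
    tracker.check_done()
except ValueError as e:
    print(e)  # prints the "not done" message for range [0, 6)
for i in range(6):
    tracker.try_claim(i)
tracker.check_done()  # passes once the whole range is claimed
```

A DoFn that returns from process() without claiming its full restriction (and without deferring the remainder) triggers exactly this error.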
INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
.sssWARNING:root:Make sure that locally built Python SDK docker image has
Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7ff660989f50> ====================
20/04/26 21:08:02 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking
job
test_windowed_pardo_state_timers_1587935281.2_2ee13fd5-c9b2-4056-89b9-1a11c1649198
20/04/26 21:08:02 INFO
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job
invocation
test_windowed_pardo_state_timers_1587935281.2_2ee13fd5-c9b2-4056-89b9-1a11c1649198
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
20/04/26 21:08:02 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/04/26 21:08:02 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn
Logging clients still connected during shutdown.
20/04/26 21:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
20/04/26 21:08:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
PipelineOptions.filesToStage was not specified. Defaulting to files from the
classpath
20/04/26 21:08:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will
stage 7 files. (Enable logging at DEBUG level to see which files will be
staged.)
20/04/26 21:08:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
Running job
test_windowed_pardo_state_timers_1587935281.2_2ee13fd5-c9b2-4056-89b9-1a11c1649198
on Spark master local
20/04/26 21:08:02 WARN
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions:
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not
consistent with equals. That might cause issues on some runners.
20/04/26 21:08:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job
test_windowed_pardo_state_timers_1587935281.2_2ee13fd5-c9b2-4056-89b9-1a11c1649198:
Pipeline translated successfully. Computing outputs
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging
client connected.
20/04/26 21:08:03 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:98:
Logging handler created.
20/04/26 21:08:03 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:79:
Status HTTP server running at localhost:38243
20/04/26 21:08:03 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:125:
semi_persistent_directory: /tmp
20/04/26 21:08:03 WARN
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:240:
No session file found: /tmp/staged/pickled_main_session. Functions defined in
__main__ (interactive session) may fail.
20/04/26 21:08:03 WARN apache_beam/options/pipeline_options.py:309: Discarding
unparseable args:
[u'--app_name=test_windowed_pardo_state_timers_1587935281.2_2ee13fd5-c9b2-4056-89b9-1a11c1649198',
u'--job_server_timeout=60', u'--pipeline_type_check',
u'--direct_runner_use_stacked_bundle', u'--options_id=29',
u'--enable_spark_metric_sinks']
20/04/26 21:08:03 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:138:
Python sdk harness started with pipeline_options: {'runner': u'None',
'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000',
'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location':
u'container', 'job_name': u'test_windowed_pardo_state_timers_1587935281.2',
'environment_config': u'{"command":
"<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}',
'expansion_port': u'0', 'spark_master_url': u'local',
'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43675', 'job_port':
u'0'}
20/04/26 21:08:03 INFO apache_beam/runners/worker/statecache.py:154: Creating
state cache with size 0
20/04/26 21:08:03 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating
insecure control channel for localhost:38815.
20/04/26 21:08:03 INFO apache_beam/runners/worker/sdk_worker.py:156: Control
channel established.
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 29-1
20/04/26 21:08:03 INFO apache_beam/runners/worker/sdk_worker.py:195:
Initializing SDKHarness with unbounded number of workers.
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 29-2
20/04/26 21:08:03 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating
insecure state channel for localhost:36963.
20/04/26 21:08:03 INFO apache_beam/runners/worker/sdk_worker.py:709: State
channel established.
20/04/26 21:08:03 INFO apache_beam/runners/worker/data_plane.py:634: Creating
client data channel for
20/04/26 21:08:03 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to
read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "DNS resolution failed"
debug_error_string =
"{"created":"@1587935283.185410782","description":"Failed to pick
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587935283.185405712","description":"Resolver
transient
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587935283.185403987","description":"DNS
resolution
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587935283.185394017","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587935283.185337919","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
> Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "DNS resolution failed"
debug_error_string =
"{"created":"@1587935283.185410782","description":"Failed to pick
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587935283.185405712","description":"Resolver
transient
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587935283.185403987","description":"DNS
resolution
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587935283.185394017","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587935283.185337919","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
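The debug_error_string in the gRPC failure above is a nested JSON tree whose referenced_errors chain bottoms out at the root cause ("Misformatted domain name", consistent with the earlier "Creating client data channel for" line that carries no endpoint at all). A small sketch of walking such a tree down to its leaf descriptions; the sample string below is a trimmed, hypothetical reconstruction in the same shape, not copied from this log:

```python
import json


def leaf_descriptions(err):
    """Yield the deepest 'description' values in a gRPC debug-error tree."""
    children = err.get('referenced_errors', [])
    if not children:
        yield err.get('description', '')
    else:
        for child in children:
            for desc in leaf_descriptions(child):
                yield desc


# Trimmed, hypothetical example mirroring the structure in the log:
debug_error_string = json.dumps({
    'description': 'Failed to pick subchannel',
    'referenced_errors': [{
        'description': 'Resolver transient failure',
        'referenced_errors': [{
            'description': 'DNS resolution failed',
            'referenced_errors': [{
                'description': 'C-ares status is not ARES_SUCCESS: '
                               'Misformatted domain name',
            }],
        }],
    }],
})

print(list(leaf_descriptions(json.loads(debug_error_string))))
# ['C-ares status is not ARES_SUCCESS: Misformatted domain name']
```

Drilling to the leaf like this cuts through the wrapping layers (subchannel, resolver) that all report the same underlying DNS failure.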
20/04/26 21:08:03 INFO apache_beam/runners/worker/data_plane.py:634: Creating
client data channel for localhost:35543
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 29-3
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 29-4
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 29-5
20/04/26 21:08:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job
test_windowed_pardo_state_timers_1587935281.2_2ee13fd5-c9b2-4056-89b9-1a11c1649198
finished.
20/04/26 21:08:03 INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemLegacyArtifactStagingService:
Removed dir /tmp/sparktest279fNb/job_e78cb2d3-f19d-48b5-9a5a-9672d3904c7b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.20/04/26 21:08:04 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/04/26 21:08:04 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn
Logging clients still connected during shutdown.
20/04/26 21:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
WARNING:root:Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7ff660989f50> ====================
20/04/26 21:08:05 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking
job test_windowing_1587935283.76_b1438534-a732-4a0d-87ad-f30882b7c468
20/04/26 21:08:05 INFO
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job
invocation test_windowing_1587935283.76_b1438534-a732-4a0d-87ad-f30882b7c468
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
20/04/26 21:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
PipelineOptions.filesToStage was not specified. Defaulting to files from the
classpath
20/04/26 21:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will
stage 7 files. (Enable logging at DEBUG level to see which files will be
staged.)
20/04/26 21:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
Running job test_windowing_1587935283.76_b1438534-a732-4a0d-87ad-f30882b7c468
on Spark master local
20/04/26 21:08:05 WARN
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions:
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not
consistent with equals. That might cause issues on some runners.
20/04/26 21:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job
test_windowing_1587935283.76_b1438534-a732-4a0d-87ad-f30882b7c468: Pipeline
translated successfully. Computing outputs
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging
client connected.
20/04/26 21:08:06 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:98:
Logging handler created.
20/04/26 21:08:06 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:79:
Status HTTP server running at localhost:43651
20/04/26 21:08:06 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:125:
semi_persistent_directory: /tmp
20/04/26 21:08:06 WARN
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:240:
No session file found: /tmp/staged/pickled_main_session. Functions defined in
__main__ (interactive session) may fail.
20/04/26 21:08:06 WARN apache_beam/options/pipeline_options.py:309: Discarding
unparseable args:
[u'--app_name=test_windowing_1587935283.76_b1438534-a732-4a0d-87ad-f30882b7c468',
u'--job_server_timeout=60', u'--pipeline_type_check',
u'--direct_runner_use_stacked_bundle', u'--options_id=30',
u'--enable_spark_metric_sinks']
20/04/26 21:08:06 INFO
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:138:
Python sdk harness started with pipeline_options: {'runner': u'None',
'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000',
'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location':
u'container', 'job_name': u'test_windowing_1587935283.76',
'environment_config': u'{"command":
"<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}',
'expansion_port': u'0', 'spark_master_url': u'local',
'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43675', 'job_port':
u'0'}
20/04/26 21:08:06 INFO apache_beam/runners/worker/statecache.py:154: Creating
state cache with size 0
20/04/26 21:08:06 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating
insecure control channel for localhost:44991.
20/04/26 21:08:06 INFO apache_beam/runners/worker/sdk_worker.py:156: Control
channel established.
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 30-1
20/04/26 21:08:06 INFO apache_beam/runners/worker/sdk_worker.py:195:
Initializing SDKHarness with unbounded number of workers.
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 30-2
20/04/26 21:08:06 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating
insecure state channel for localhost:36033.
20/04/26 21:08:06 INFO apache_beam/runners/worker/sdk_worker.py:709: State
channel established.
20/04/26 21:08:06 INFO apache_beam/runners/worker/data_plane.py:634: Creating
client data channel for
20/04/26 21:08:06 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to
read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "DNS resolution failed"
debug_error_string =
"{"created":"@1587935286.403728574","description":"Failed to pick
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587935286.403723892","description":"Resolver
transient
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587935286.403722033","description":"DNS
resolution
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587935286.403713743","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587935286.403688371","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
> Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "DNS resolution failed"
debug_error_string =
"{"created":"@1587935286.403728574","description":"Failed to pick
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587935286.403723892","description":"Resolver
transient
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587935286.403722033","description":"DNS
resolution
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587935286.403713743","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587935286.403688371","description":"C-ares
status is not ARES_SUCCESS: Misformatted domain
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
20/04/26 21:08:06 INFO apache_beam/runners/worker/data_plane.py:634: Creating
client data channel for localhost:43865
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 30-3
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 30-4
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "beam:env:process:v1"
payload:
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn
Logging clients still connected during shutdown.
20/04/26 21:08:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 30-5
20/04/26 21:08:06 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 30-6
20/04/26 21:08:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job
test_windowing_1587935283.76_b1438534-a732-4a0d-87ad-f30882b7c468 finished.
20/04/26 21:08:07 INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemLegacyArtifactStagingService:
Removed dir /tmp/sparktest279fNb/job_3eb4728d-0f19-4080-bd86-1513450cb870/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_multimap_multiside_input (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner/fn_runner_test.py", line 265, in test_multimap_multiside_input
    equal_to([('a', [1, 3], [1, 2, 3]), ('b', [2], [1, 2, 3])]))
  File "apache_beam/pipeline.py", line 529, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 571, in wait_until_finish
    (self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_multiside_input_1587935219.48_e82e1d99-a02e-45e8-866e-472fbbcae999 failed in state FAILED: java.lang.IllegalArgumentException: Multiple entries with same key: ref_PCollection_PCollection_21=(Broadcast(37),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder)) and ref_PCollection_PCollection_21=(Broadcast(36),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder))
----------------------------------------------------------------------
Ran 40 tests in 101.554s
FAILED (errors=1, skipped=11)
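The one real failure above is the IllegalArgumentException: two broadcast side inputs were registered under the same PCollection id (ref_PCollection_PCollection_21), and the Spark translator's map builder (apparently Guava's ImmutableMap, judging by the "Multiple entries with same key" message) rejects duplicate keys. A hypothetical Python sketch of the same failure mode; build_unique_map is illustrative only, not Beam code:

```python
def build_unique_map(entries):
    """Build a dict but, like a duplicate-rejecting map builder, fail on
    a second insert under the same key instead of silently overwriting."""
    result = {}
    for key, value in entries:
        if key in result:
            raise ValueError(
                'Multiple entries with same key: %s=%r and %s=%r'
                % (key, result[key], key, value))
        result[key] = value
    return result


# Two side-input broadcasts registered under one PCollection id,
# mirroring the shape of the failure in the log:
entries = [
    ('ref_PCollection_PCollection_21', 'Broadcast(37)'),
    ('ref_PCollection_PCollection_21', 'Broadcast(36)'),
]
try:
    build_unique_map(entries)
except ValueError as e:
    print(e)  # reports both conflicting values for the duplicated key
```

A plain dict would quietly keep the last value; the builder fails fast instead, which is why the pipeline dies at translation time rather than producing wrong side-input data.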
> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 248
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 4m 16s
62 actionable tasks: 48 executed, 14 from cache
Publishing build scan...
https://gradle.com/s/ypwxvfzra7276
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]