See 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/3086/display/redirect?page=changes>

Changes:

[david.moravek] [BEAM-9824] Do not ignore chained Reshuffles on flink batch 
runner.


------------------------------------------
[...truncated 584.71 KB...]
  File "apache_beam/runners/worker/bundle_processor.py", line 217, in 
process_encoded
    self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 332, in output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 763, in process
    delayed_application = self.dofn_runner.process_with_sized_restriction(o)
  File "apache_beam/runners/common.py", line 974, in 
process_with_sized_restriction
    watermark_estimator=watermark_estimator)
  File "apache_beam/runners/common.py", line 711, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 818, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "apache_beam/runners/sdf_utils.py", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "apache_beam/io/restriction_trackers.py", line 101, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 6) has 
not been claimed.
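
The ValueError above is the splittable-DoFn completeness check: a DoFn must claim every position of its restriction before the bundle finishes, or the tracker's check_done() raises. A minimal sketch of that contract (modeled loosely on Beam's OffsetRestrictionTracker, not Beam's actual code):

```python
# Sketch of an offset-range restriction tracker. Names (OffsetRange,
# OffsetTracker) are illustrative, not Beam's real classes.

class OffsetRange(object):
    def __init__(self, start, stop):
        self.start = start
        self.stop = stop


class OffsetTracker(object):
    def __init__(self, offset_range):
        self._range = offset_range
        self._last_claimed = None

    def try_claim(self, position):
        # A claim succeeds only while the position is still inside the range.
        if position >= self._range.stop:
            return False
        self._last_claimed = position
        return True

    def check_done(self):
        # "Done" means the claimed work covers the whole range; otherwise the
        # runner raises, as in the [0, 6) failure in the log above.
        if self._last_claimed is None or self._last_claimed < self._range.stop - 1:
            raise ValueError(
                'tracker is not done since work in range [%d, %d) has not '
                'been claimed.' % (self._range.start, self._range.stop))


tracker = OffsetTracker(OffsetRange(0, 6))
try:
    tracker.check_done()  # DoFn returned without claiming anything
except ValueError as e:
    print('not done: %s' % e)

pos = 0
while tracker.try_claim(pos):  # claim 0..5, then 6 is rejected
    pos += 1
tracker.check_done()  # passes once the full range is claimed
print('done')
```

The fix for this class of failure is in the DoFn: call try_claim for every offset it processes (and stop processing when try_claim returns False), rather than returning early with work unclaimed.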

INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
.sssWARNING:root:Make sure that locally built Python SDK docker image has 
Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f542bc32140> ====================
ERROR:grpc._channel:Exception iterating requests!
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 195, in consume_request_iterator
    request = next(request_iterator)
  File "apache_beam/runners/portability/artifact_service.py", line 316, in 
__next__
    raise self._queue.get()
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNIMPLEMENTED
        details = "Method not found: 
org.apache.beam.model.job_management.v1.ArtifactStagingService/ReverseArtifactRetrievalService"
        debug_error_string = 
"{"created":"@1588059651.105077966","description":"Error received from peer 
ipv4:127.0.0.1:38121","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Method
 not found: 
org.apache.beam.model.job_management.v1.ArtifactStagingService/ReverseArtifactRetrievalService","grpc_status":12}"
>
20/04/28 07:40:51 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job 
test_windowed_pardo_state_timers_1588059650.33_e04b8c76-fa7f-4698-b862-77b760f58dc4
20/04/28 07:40:51 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
test_windowed_pardo_state_timers_1588059650.33_e04b8c76-fa7f-4698-b862-77b760f58dc4
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/04/28 07:40:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/04/28 07:40:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 7 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/04/28 07:40:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
test_windowed_pardo_state_timers_1588059650.33_e04b8c76-fa7f-4698-b862-77b760f58dc4
 on Spark master local
20/04/28 07:40:51 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
20/04/28 07:40:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowed_pardo_state_timers_1588059650.33_e04b8c76-fa7f-4698-b862-77b760f58dc4:
 Pipeline translated successfully. Computing outputs
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging 
client connected.
20/04/28 07:40:52 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:98:
 Logging handler created.
20/04/28 07:40:52 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:79:
 Status HTTP server running at localhost:36311
20/04/28 07:40:52 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:125:
 semi_persistent_directory: /tmp
20/04/28 07:40:52 WARN 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:240:
 No session file found: /tmp/staged/pickled_main_session. Functions defined in 
__main__ (interactive session) may fail. 
20/04/28 07:40:52 WARN apache_beam/options/pipeline_options.py:309: Discarding 
unparseable args: 
[u'--app_name=test_windowed_pardo_state_timers_1588059650.33_e04b8c76-fa7f-4698-b862-77b760f58dc4',
 u'--job_server_timeout=60', u'--pipeline_type_check', 
u'--direct_runner_use_stacked_bundle', u'--options_id=29', 
u'--enable_spark_metric_sinks'] 
20/04/28 07:40:52 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:138:
 Python sdk harness started with pipeline_options: {'runner': u'None', 
'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000', 
'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': 
u'container', 'job_name': u'test_windowed_pardo_state_timers_1588059650.33', 
'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}',
 'expansion_port': u'0', 'spark_master_url': u'local', 
'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48125', 'job_port': 
u'0'}
20/04/28 07:40:52 INFO apache_beam/runners/worker/statecache.py:154: Creating 
state cache with size 0
20/04/28 07:40:52 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating 
insecure control channel for localhost:45857.
20/04/28 07:40:52 INFO apache_beam/runners/worker/sdk_worker.py:156: Control 
channel established.
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 29-1
20/04/28 07:40:52 INFO apache_beam/runners/worker/sdk_worker.py:195: 
Initializing SDKHarness with unbounded number of workers.
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 29-2
20/04/28 07:40:52 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating 
insecure state channel for localhost:38837.
20/04/28 07:40:52 INFO apache_beam/runners/worker/sdk_worker.py:709: State 
channel established.
20/04/28 07:40:52 INFO apache_beam/runners/worker/data_plane.py:634: Creating 
client data channel for 
20/04/28 07:40:52 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to 
read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "DNS resolution failed"
        debug_error_string = 
"{"created":"@1588059652.069473802","description":"Failed to pick 
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1588059652.069470438","description":"Resolver
 transient 
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1588059652.069469323","description":"DNS
 resolution 
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1588059652.069463281","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1588059652.069447714","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
> Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "DNS resolution failed"
        debug_error_string = 
"{"created":"@1588059652.069473802","description":"Failed to pick 
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1588059652.069470438","description":"Resolver
 transient 
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1588059652.069469323","description":"DNS
 resolution 
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1588059652.069463281","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1588059652.069447714","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>

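Note that the "Creating client data channel for " line above has an empty endpoint, and gRPC then fails name resolution with "Misformatted domain name" (StatusCode.UNAVAILABLE). A hedged sketch of the kind of host:port validation that would surface such a target before dialing (validate_endpoint is a hypothetical helper, not a Beam or gRPC API):

```python
# Validate a "host:port" gRPC target string before creating a channel.
# An empty or portless target, like the one logged above, is rejected
# immediately instead of surfacing later as a DNS resolution failure.

def validate_endpoint(target):
    """Return (host, port) or raise ValueError for an unusable target."""
    if not target or ':' not in target:
        raise ValueError('misformatted endpoint: %r' % target)
    host, _, port = target.rpartition(':')
    if not host or not port.isdigit():
        raise ValueError('misformatted endpoint: %r' % target)
    return host, int(port)


print(validate_endpoint('localhost:45109'))  # ('localhost', 45109)
try:
    validate_endpoint('')  # the empty endpoint logged above
except ValueError as e:
    print(e)
```

The pipeline still completes because a second data channel is created with a proper localhost endpoint immediately afterwards; the empty-endpoint channel is the one that dies with UNAVAILABLE.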
20/04/28 07:40:52 INFO apache_beam/runners/worker/data_plane.py:634: Creating 
client data channel for localhost:45109
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 29-3
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing 
environment urn: "beam:env:process:v1"
payload: 
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh";>
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"

20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn 
Logging clients still connected during shutdown.
20/04/28 07:40:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: 
Hanged up for unknown endpoint.
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 29-4
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 29-5
20/04/28 07:40:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowed_pardo_state_timers_1588059650.33_e04b8c76-fa7f-4698-b862-77b760f58dc4
 finished.
20/04/28 07:40:52 INFO 
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemLegacyArtifactStagingService:
 Removed dir /tmp/sparktest7kYV84/job_71329127-0a2a-412b-bf7a-80f40b56a34f/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.WARNING:root:Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7f542bc32140> ====================
20/04/28 07:40:53 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job test_windowing_1588059652.62_f9b7dd30-724a-4cb1-80f8-fa02f5a0d897
20/04/28 07:40:53 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation test_windowing_1588059652.62_f9b7dd30-724a-4cb1-80f8-fa02f5a0d897
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
20/04/28 07:40:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
20/04/28 07:40:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 7 files. (Enable logging at DEBUG level to see which files will be 
staged.)
20/04/28 07:40:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job test_windowing_1588059652.62_f9b7dd30-724a-4cb1-80f8-fa02f5a0d897 
on Spark master local
20/04/28 07:40:53 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.
20/04/28 07:40:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowing_1588059652.62_f9b7dd30-724a-4cb1-80f8-fa02f5a0d897: Pipeline 
translated successfully. Computing outputs
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging 
client connected.
20/04/28 07:40:54 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:98:
 Logging handler created.
20/04/28 07:40:54 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:79:
 Status HTTP server running at localhost:35997
20/04/28 07:40:54 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:125:
 semi_persistent_directory: /tmp
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing 
environment urn: "beam:env:process:v1"
payload: 
"\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh";>
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"

20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn 
Logging clients still connected during shutdown.
20/04/28 07:40:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: 
Hanged up for unknown endpoint.
20/04/28 07:40:54 WARN 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:240:
 No session file found: /tmp/staged/pickled_main_session. Functions defined in 
__main__ (interactive session) may fail. 
20/04/28 07:40:54 WARN apache_beam/options/pipeline_options.py:309: Discarding 
unparseable args: 
[u'--app_name=test_windowing_1588059652.62_f9b7dd30-724a-4cb1-80f8-fa02f5a0d897',
 u'--job_server_timeout=60', u'--pipeline_type_check', 
u'--direct_runner_use_stacked_bundle', u'--options_id=30', 
u'--enable_spark_metric_sinks'] 
20/04/28 07:40:54 INFO 
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:138:
 Python sdk harness started with pipeline_options: {'runner': u'None', 
'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000', 
'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': 
u'container', 'job_name': u'test_windowing_1588059652.62', 
'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}',
 'expansion_port': u'0', 'spark_master_url': u'local', 
'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48125', 'job_port': 
u'0'}
20/04/28 07:40:54 INFO apache_beam/runners/worker/statecache.py:154: Creating 
state cache with size 0
20/04/28 07:40:54 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating 
insecure control channel for localhost:45037.
20/04/28 07:40:54 INFO apache_beam/runners/worker/sdk_worker.py:156: Control 
channel established.
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 30-1
20/04/28 07:40:54 INFO apache_beam/runners/worker/sdk_worker.py:195: 
Initializing SDKHarness with unbounded number of workers.
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 30-2
20/04/28 07:40:54 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating 
insecure state channel for localhost:35531.
20/04/28 07:40:54 INFO apache_beam/runners/worker/sdk_worker.py:709: State 
channel established.
20/04/28 07:40:54 INFO apache_beam/runners/worker/data_plane.py:634: Creating 
client data channel for 
20/04/28 07:40:54 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to 
read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "DNS resolution failed"
        debug_error_string = 
"{"created":"@1588059654.746474955","description":"Failed to pick 
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1588059654.746471274","description":"Resolver
 transient 
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1588059654.746470099","description":"DNS
 resolution 
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1588059654.746464207","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1588059654.746449053","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
> Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "DNS resolution failed"
        debug_error_string = 
"{"created":"@1588059654.746474955","description":"Failed to pick 
subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1588059654.746471274","description":"Resolver
 transient 
failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1588059654.746470099","description":"DNS
 resolution 
failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1588059654.746464207","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1588059654.746449053","description":"C-ares
 status is not ARES_SUCCESS: Misformatted domain 
name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>

20/04/28 07:40:54 INFO apache_beam/runners/worker/data_plane.py:634: Creating 
client data channel for localhost:36879
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 30-3
20/04/28 07:40:54 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 30-4
20/04/28 07:40:55 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 30-5
20/04/28 07:40:55 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: 
getProcessBundleDescriptor request with id 30-6
20/04/28 07:40:55 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
test_windowing_1588059652.62_f9b7dd30-724a-4cb1-80f8-fa02f5a0d897 finished.
20/04/28 07:40:55 INFO 
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemLegacyArtifactStagingService:
 Removed dir /tmp/sparktest7kYV84/job_4359059e-2d3f-4fdb-a562-943e115cc28c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_multimap_multiside_input (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner/fn_runner_test.py", line 
265, in test_multimap_multiside_input
    equal_to([('a', [1, 3], [1, 2, 3]), ('b', [2], [1, 2, 3])]))
  File "apache_beam/pipeline.py", line 529, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 571, in 
wait_until_finish
    (self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline 
test_multimap_multiside_input_1588059596.67_dbbcfd85-5a77-46c6-b8db-e0a70a7f3c65
 failed in state FAILED: java.lang.IllegalArgumentException: Multiple entries 
with same key: 
ref_PCollection_PCollection_21=(Broadcast(37),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder))
 and 
ref_PCollection_PCollection_21=(Broadcast(36),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder))
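The IllegalArgumentException above is thrown while building a map of broadcast side inputs: the same PCollection id (ref_PCollection_PCollection_21) is registered twice with different Broadcast values, and the map builder rejects duplicate keys. A minimal Python sketch of that strict-merge behavior (build_unique_map is a hypothetical helper, not Beam's code):

```python
# Build a dict that, like Guava's ImmutableMap.Builder, refuses to silently
# overwrite a key with a conflicting value.

def build_unique_map(pairs):
    out = {}
    for key, value in pairs:
        if key in out and out[key] != value:
            raise ValueError(
                'Multiple entries with same key: %s=%r and %s=%r'
                % (key, out[key], key, value))
        out[key] = value
    return out


# The conflicting registrations from the failure above, abbreviated:
pairs = [
    ('ref_PCollection_PCollection_21', 'Broadcast(37)'),
    ('ref_PCollection_PCollection_21', 'Broadcast(36)'),
]
try:
    build_unique_map(pairs)
except ValueError as e:
    print(e)
```

Duplicate registrations with an identical value are tolerated in this sketch; only conflicting values raise, which matches the shape of the error message in the log.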

----------------------------------------------------------------------
Ran 40 tests in 88.627s

FAILED (errors=1, skipped=11)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 248

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 54s
62 actionable tasks: 48 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/ufdzglxrq47bw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
