See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/3032/display/redirect?page=changes>
Changes: [github] Merge pull request #11489 [BEAM-9577] Ensure required directories exist
------------------------------------------
[...truncated 574.95 KB...]
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/04/22 17:31:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
20/04/22 17:31:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f164128fed8> ====================
ERROR:grpc._channel:Exception iterating requests!
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 195, in consume_request_iterator
    request = next(request_iterator)
  File "apache_beam/runners/portability/artifact_service.py", line 316, in __next__
    raise self._queue.get()
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNIMPLEMENTED
    details = "Method not found: org.apache.beam.model.job_management.v1.ArtifactStagingService/ReverseArtifactRetrievalService"
    debug_error_string = "{"created":"@1587576711.990065824","description":"Error received from peer ipv4:127.0.0.1:33265","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Method not found: org.apache.beam.model.job_management.v1.ArtifactStagingService/ReverseArtifactRetrievalService","grpc_status":12}"
>
20/04/22 17:31:51 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowed_pardo_state_timers_1587576710.95_652f6de8-bc6b-48ff-8ee5-f51dc9b1959d
20/04/22 17:31:52 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowed_pardo_state_timers_1587576710.95_652f6de8-bc6b-48ff-8ee5-f51dc9b1959d
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
20/04/22 17:31:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/04/22 17:31:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 7 files. (Enable logging at DEBUG level to see which files will be staged.)
20/04/22 17:31:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowed_pardo_state_timers_1587576710.95_652f6de8-bc6b-48ff-8ee5-f51dc9b1959d on Spark master local
20/04/22 17:31:52 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
20/04/22 17:31:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1587576710.95_652f6de8-bc6b-48ff-8ee5-f51dc9b1959d: Pipeline translated successfully. Computing outputs
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
20/04/22 17:31:53 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:98: Logging handler created.
20/04/22 17:31:53 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:79: Status HTTP server running at localhost:46825
20/04/22 17:31:53 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:125: semi_persistent_directory: /tmp
20/04/22 17:31:53 WARN https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:240: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
20/04/22 17:31:53 WARN apache_beam/options/pipeline_options.py:309: Discarding unparseable args: [u'--app_name=test_windowed_pardo_state_timers_1587576710.95_652f6de8-bc6b-48ff-8ee5-f51dc9b1959d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--options_id=29', u'--enable_spark_metric_sinks']
20/04/22 17:31:53 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:138: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowed_pardo_state_timers_1587576710.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'spark_master_url': u'local', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57531', 'job_port': u'0'}
20/04/22 17:31:53 INFO apache_beam/runners/worker/statecache.py:154: Creating state cache with size 0
20/04/22 17:31:53 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating insecure control channel for localhost:34241.
20/04/22 17:31:53 INFO apache_beam/runners/worker/sdk_worker.py:156: Control channel established.
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 29-1
20/04/22 17:31:53 INFO apache_beam/runners/worker/sdk_worker.py:195: Initializing SDKHarness with unbounded number of workers.
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-2
20/04/22 17:31:53 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating insecure state channel for localhost:34789.
20/04/22 17:31:53 INFO apache_beam/runners/worker/sdk_worker.py:709: State channel established.
20/04/22 17:31:53 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for
20/04/22 17:31:53 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 413, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "DNS resolution failed"
    debug_error_string = "{"created":"@1587576713.041588086","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587576713.041582947","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587576713.041581434","description":"DNS resolution failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587576713.041572607","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587576713.041546498","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 413, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "DNS resolution failed"
    debug_error_string = "{"created":"@1587576713.041588086","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587576713.041582947","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587576713.041581434","description":"DNS resolution failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587576713.041572607","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587576713.041546498","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
20/04/22 17:31:53 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for localhost:45129
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-3
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-4
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-5
20/04/22 17:31:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1587576710.95_652f6de8-bc6b-48ff-8ee5-f51dc9b1959d finished.
20/04/22 17:31:53 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemLegacyArtifactStagingService: Removed dir /tmp/sparktestNQhTva/job_ce25857f-35df-42c6-8a12-b864b3aa9e97/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.20/04/22 17:31:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/04/22 17:31:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
20/04/22 17:31:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f164128fed8> ====================
ERROR:grpc._channel:Exception iterating requests!
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 195, in consume_request_iterator
    request = next(request_iterator)
  File "apache_beam/runners/portability/artifact_service.py", line 316, in __next__
    raise self._queue.get()
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNIMPLEMENTED
    details = "Method not found: org.apache.beam.model.job_management.v1.ArtifactStagingService/ReverseArtifactRetrievalService"
    debug_error_string = "{"created":"@1587576714.835187916","description":"Error received from peer ipv4:127.0.0.1:33265","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Method not found: org.apache.beam.model.job_management.v1.ArtifactStagingService/ReverseArtifactRetrievalService","grpc_status":12}"
>
20/04/22 17:31:54 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1587576713.61_0f38a23e-4a58-4e76-a84e-952166ad9add
20/04/22 17:31:54 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1587576713.61_0f38a23e-4a58-4e76-a84e-952166ad9add
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
20/04/22 17:31:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/04/22 17:31:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 7 files. (Enable logging at DEBUG level to see which files will be staged.)
20/04/22 17:31:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1587576713.61_0f38a23e-4a58-4e76-a84e-952166ad9add on Spark master local
20/04/22 17:31:55 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
20/04/22 17:31:55 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1587576713.61_0f38a23e-4a58-4e76-a84e-952166ad9add: Pipeline translated successfully. Computing outputs
20/04/22 17:31:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
20/04/22 17:31:55 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:98: Logging handler created.
20/04/22 17:31:55 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:79: Status HTTP server running at localhost:43033
20/04/22 17:31:55 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:125: semi_persistent_directory: /tmp
20/04/22 17:31:55 WARN https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:240: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
20/04/22 17:31:55 WARN apache_beam/options/pipeline_options.py:309: Discarding unparseable args: [u'--app_name=test_windowing_1587576713.61_0f38a23e-4a58-4e76-a84e-952166ad9add', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--options_id=30', u'--enable_spark_metric_sinks']
20/04/22 17:31:55 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:138: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1587576713.61', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'spark_master_url': u'local', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57531', 'job_port': u'0'}
20/04/22 17:31:55 INFO apache_beam/runners/worker/statecache.py:154: Creating state cache with size 0
20/04/22 17:31:55 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating insecure control channel for localhost:33479.
20/04/22 17:31:55 INFO apache_beam/runners/worker/sdk_worker.py:156: Control channel established.
20/04/22 17:31:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 30-1
20/04/22 17:31:55 INFO apache_beam/runners/worker/sdk_worker.py:195: Initializing SDKHarness with unbounded number of workers.
20/04/22 17:31:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-2
20/04/22 17:31:55 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating insecure state channel for localhost:43171.
20/04/22 17:31:55 INFO apache_beam/runners/worker/sdk_worker.py:709: State channel established.
20/04/22 17:31:55 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for
20/04/22 17:31:55 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for localhost:36749
20/04/22 17:31:55 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 413, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "DNS resolution failed"
    debug_error_string = "{"created":"@1587576715.876107427","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587576715.876102491","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587576715.876100838","description":"DNS resolution failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587576715.876091609","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587576715.876065511","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 413, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "DNS resolution failed"
    debug_error_string = "{"created":"@1587576715.876107427","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1587576715.876102491","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1587576715.876100838","description":"DNS resolution failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1587576715.876091609","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1587576715.876065511","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
20/04/22 17:31:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/04/22 17:31:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-3
20/04/22 17:31:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-4
20/04/22 17:31:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/04/22 17:31:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
20/04/22 17:31:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
20/04/22 17:31:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-5
20/04/22 17:31:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-6
20/04/22 17:31:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1587576713.61_0f38a23e-4a58-4e76-a84e-952166ad9add finished.
20/04/22 17:31:56 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemLegacyArtifactStagingService: Removed dir /tmp/sparktestNQhTva/job_bbebfa8d-7cda-4b79-9458-326737e17510/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_multimap_multiside_input (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner/fn_runner_test.py", line 265, in test_multimap_multiside_input
    equal_to([('a', [1, 3], [1, 2, 3]), ('b', [2], [1, 2, 3])]))
  File "apache_beam/pipeline.py", line 528, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 571, in wait_until_finish
    (self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_multiside_input_1587576648.13_8c337b7a-66f1-4d48-b8dd-a866a24498ce failed in state FAILED: java.lang.IllegalArgumentException: Multiple entries with same key: ref_PCollection_PCollection_21=(Broadcast(37),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder)) and ref_PCollection_PCollection_21=(Broadcast(36),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder))

----------------------------------------------------------------------
Ran 40 tests in 112.201s

FAILED (errors=1, skipped=11)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 248

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 39s
62 actionable tasks: 48 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/7a64odagzi3sq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
