See https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/3108/display/redirect
Changes:

------------------------------------------
[...truncated 618.66 KB...]
20/05/02 18:13:35 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:125: semi_persistent_directory: /tmp
20/05/02 18:13:35 WARN https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:240: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
20/05/02 18:13:35 WARN apache_beam/options/pipeline_options.py:309: Discarding unparseable args: [u'--app_name=test_windowed_pardo_state_timers_1588443211.31_fee1c76f-f227-4755-a6b6-643129fe0e65', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--options_id=29', u'--enable_spark_metric_sinks']
20/05/02 18:13:35 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:138: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowed_pardo_state_timers_1588443211.31', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'spark_master_url': u'local', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35345', 'job_port': u'0'}
20/05/02 18:13:35 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:79: Status HTTP server running at localhost:41405
20/05/02 18:13:35 INFO apache_beam/runners/worker/statecache.py:154: Creating state cache with size 0
20/05/02 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating insecure control channel for localhost:45487.
20/05/02 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:156: Control channel established.
20/05/02 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:195: Initializing SDKHarness with unbounded number of workers.
20/05/02 18:13:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 29-1
20/05/02 18:13:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-2
20/05/02 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating insecure state channel for localhost:37589.
20/05/02 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:709: State channel established.
20/05/02 18:13:36 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for
20/05/02 18:13:36 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 413, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "DNS resolution failed"
    debug_error_string = "{"created":"@1588443216.154426997","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1588443216.154422536","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1588443216.154420861","description":"DNS resolution failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1588443216.154410304","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1588443216.154387508","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
20/05/02 18:13:36 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for localhost:34587
20/05/02 18:13:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/05/02 18:13:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-3
20/05/02 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-4
20/05/02 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 29-5
20/05/02 18:13:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1588443211.31_fee1c76f-f227-4755-a6b6-643129fe0e65 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f7f793c37d0> ====================
20/05/02 18:13:40 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1588443217.68_634b72e6-ad29-496c-b687-eda0d72445d1
20/05/02 18:13:40 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1588443217.68_634b72e6-ad29-496c-b687-eda0d72445d1
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
20/05/02 18:13:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/05/02 18:13:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 7 files. (Enable logging at DEBUG level to see which files will be staged.)
20/05/02 18:13:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1588443217.68_634b72e6-ad29-496c-b687-eda0d72445d1 on Spark master local
20/05/02 18:13:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.22.0.dev"
20/05/02 18:13:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
20/05/02 18:13:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
20/05/02 18:13:41 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
20/05/02 18:13:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1588443217.68_634b72e6-ad29-496c-b687-eda0d72445d1: Pipeline translated successfully.
Computing outputs
20/05/02 18:13:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
20/05/02 18:13:45 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:98: Logging handler created.
20/05/02 18:13:45 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:125: semi_persistent_directory: /tmp
20/05/02 18:13:45 WARN https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:240: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
20/05/02 18:13:45 WARN apache_beam/options/pipeline_options.py:309: Discarding unparseable args: [u'--app_name=test_windowing_1588443217.68_634b72e6-ad29-496c-b687-eda0d72445d1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--options_id=30', u'--enable_spark_metric_sinks']
20/05/02 18:13:45 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:79: Status HTTP server running at localhost:37345
20/05/02 18:13:45 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:138: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'10000', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1588443217.68', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'spark_master_url': u'local', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35345', 'job_port': u'0'}
20/05/02 18:13:45 INFO apache_beam/runners/worker/statecache.py:154: Creating state cache with size 0
20/05/02 18:13:45 INFO apache_beam/runners/worker/sdk_worker.py:148: Creating insecure control channel for localhost:38477.
20/05/02 18:13:45 INFO apache_beam/runners/worker/sdk_worker.py:156: Control channel established.
20/05/02 18:13:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 30-1
20/05/02 18:13:45 INFO apache_beam/runners/worker/sdk_worker.py:195: Initializing SDKHarness with unbounded number of workers.
20/05/02 18:13:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-2
20/05/02 18:13:45 INFO apache_beam/runners/worker/sdk_worker.py:702: Creating insecure state channel for localhost:33131.
20/05/02 18:13:45 INFO apache_beam/runners/worker/sdk_worker.py:709: State channel established.
20/05/02 18:13:45 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for
20/05/02 18:13:45 ERROR apache_beam/runners/worker/data_plane.py:535: Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 413, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 689, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "DNS resolution failed"
    debug_error_string = "{"created":"@1588443225.147681928","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3981,"referenced_errors":[{"created":"@1588443225.147676883","description":"Resolver transient failure","file":"src/core/ext/filters/client_channel/resolving_lb_policy.cc","file_line":214,"referenced_errors":[{"created":"@1588443225.147675109","description":"DNS resolution failed","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/dns_resolver_ares.cc","file_line":357,"grpc_status":14,"referenced_errors":[{"created":"@1588443225.147664410","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244,"referenced_errors":[{"created":"@1588443225.147640297","description":"C-ares status is not ARES_SUCCESS: Misformatted domain name","file":"src/core/ext/filters/client_channel/resolver/dns/c_ares/grpc_ares_wrapper.cc","file_line":244}]}]}]}]}"
>
20/05/02 18:13:45 INFO apache_beam/runners/worker/data_plane.py:634: Creating client data channel for localhost:39591
20/05/02 18:13:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/05/02 18:13:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-3
20/05/02 18:13:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-4
20/05/02 18:13:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-5
20/05/02 18:13:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 30-6
20/05/02 18:13:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1588443217.68_634b72e6-ad29-496c-b687-eda0d72445d1 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_multimap_multiside_input (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner/fn_runner_test.py", line 265, in test_multimap_multiside_input
    equal_to([('a', [1, 3], [1, 2, 3]), ('b', [2], [1, 2, 3])]))
  File "apache_beam/pipeline.py", line 543, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 568, in wait_until_finish
    (self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_multimap_multiside_input_1588443000.75_7ca18e2e-2b1c-4eba-bd9e-50d005306297 failed in state FAILED: java.lang.IllegalArgumentException: Multiple entries with same key: ref_PCollection_PCollection_21=(Broadcast(37),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder)) and ref_PCollection_PCollection_21=(Broadcast(36),WindowedValue$FullWindowedValueCoder(KvCoder(ByteArrayCoder,VarLongCoder),GlobalWindow$Coder))
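(Aside for readers triaging these failures: the assertions in these ValidatesRunner tests go through Beam's `equal_to` matcher, which compares actual and expected PCollection contents as unordered multisets and reports leftovers as "missing elements". A minimal pure-Python sketch of that style of comparison; the function name and error wording are illustrative, not Beam's actual implementation in apache_beam/testing/util.py.)

```python
def check_equal_to(actual, expected):
    """Compare two collections as unordered multisets, equal_to-style.

    Illustrative sketch only; Beam's real matcher raises
    BeamAssertException from apache_beam.testing.util.
    """
    remaining = list(actual)
    missing = []
    for item in expected:
        # list membership/removal uses ==, so order does not matter and
        # unhashable elements (e.g. tuples containing lists) still work.
        if item in remaining:
            remaining.remove(item)
        else:
            missing.append(item)
    if missing or remaining:
        raise AssertionError(
            'Failed assert: %r == %r, missing elements %r'
            % (expected, actual, missing))

# Passes: same elements, different order.
check_equal_to([('b', 2), ('a', 1)], [('a', 1), ('b', 2)])
```

A mismatch like the one in the test_pardo_timers failure below then surfaces as an AssertionError naming the missing elements.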
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner/fn_runner_test.py", line 381, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 543, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 568, in wait_until_finish
    (self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_pardo_timers_1588443160.8_723f8848-1338-447e-882f-6b4c4bc39e26 failed in state FAILED: java.lang.RuntimeException: Error received from SDK harness for instruction 8:
Traceback (most recent call last):
  File "apache_beam/runners/worker/sdk_worker.py", line 245, in _execute
    response = task()
  File "apache_beam/runners/worker/sdk_worker.py", line 302, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "apache_beam/runners/worker/sdk_worker.py", line 471, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "apache_beam/runners/worker/sdk_worker.py", line 506, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "apache_beam/runners/worker/bundle_processor.py", line 913, in process_bundle
    element.data)
  File "apache_beam/runners/worker/bundle_processor.py", line 217, in process_encoded
    self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 332, in output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 671, in process
    delayed_application = self.dofn_runner.process(o)
  File "apache_beam/runners/common.py", line 963, in process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 961, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 727, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 815, in _invoke_process_per_window
    self.threadsafe_watermark_estimator)
  File "apache_beam/runners/common.py", line 1122, in process_outputs
    self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 671, in process
    delayed_application = self.dofn_runner.process(o)
  File "apache_beam/runners/common.py", line 963, in process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 961, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 554, in invoke_process
    windowed_value, self.process_method(windowed_value.value))
  File "apache_beam/runners/common.py", line 1122, in process_outputs
    self.main_receivers.receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 671, in process
    delayed_application = self.dofn_runner.process(o)
  File "apache_beam/runners/common.py", line 963, in process
    self._reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 1045, in _reraise_augmented
    raise_with_traceback(new_exn)
  File "apache_beam/runners/common.py", line 961, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "apache_beam/runners/common.py", line 727, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 814, in _invoke_process_per_window
    self.process_method(*args_for_process),
  File "apache_beam/transforms/core.py", line 1509, in <lambda>
    wrapper = lambda x, *args, **kwargs: [fn(x, *args, **kwargs)]
  File "apache_beam/testing/util.py", line 202, in _equal
    raise BeamAssertException(msg)
BeamAssertException: Failed assert: [('fired', 20), ('fired', 200), ('fired', 40), ('fired', 400)] == [('fired', Timestamp(20)), ('fired', Timestamp(200))], missing elements [('fired', 40), ('fired', 400)] [while running 'assert_that/Match']

----------------------------------------------------------------------
Ran 40 tests in 289.798s

FAILED (errors=2, skipped=11)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 215

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
62 actionable tasks: 48 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/glchza6okb6f6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
