[ https://issues.apache.org/jira/browse/FLINK-25940?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Huang Xingbo reassigned FLINK-25940:
------------------------------------

    Assignee: Huang Xingbo

> pyflink/datastream/tests/test_data_stream.py::StreamingModeDataStreamTests::test_keyed_process_function_with_state failed on AZP
> --------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-25940
>                 URL: https://issues.apache.org/jira/browse/FLINK-25940
>             Project: Flink
>          Issue Type: Bug
>          Components: API / Python
>    Affects Versions: 1.15.0
>            Reporter: Till Rohrmann
>            Assignee: Huang Xingbo
>            Priority: Critical
>              Labels: test-stability
>
> The test {{pyflink/datastream/tests/test_data_stream.py::StreamingModeDataStreamTests::test_keyed_process_function_with_state}} fails on AZP:
> {code}
> 2022-02-02T17:44:12.1898582Z Feb 02 17:44:12 =================================== FAILURES ===================================
> 2022-02-02T17:44:12.1899860Z Feb 02 17:44:12 _____ StreamingModeDataStreamTests.test_keyed_process_function_with_state ______
> 2022-02-02T17:44:12.1900493Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1901218Z Feb 02 17:44:12 self = <pyflink.datastream.tests.test_data_stream.StreamingModeDataStreamTests testMethod=test_keyed_process_function_with_state>
> 2022-02-02T17:44:12.1901948Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1902745Z Feb 02 17:44:12     def test_keyed_process_function_with_state(self):
> 2022-02-02T17:44:12.1903722Z Feb 02 17:44:12         self.env.get_config().set_auto_watermark_interval(2000)
> 2022-02-02T17:44:12.1904473Z Feb 02 17:44:12         self.env.set_stream_time_characteristic(TimeCharacteristic.EventTime)
> 2022-02-02T17:44:12.1906780Z Feb 02 17:44:12         data_stream = self.env.from_collection([(1, 'hi', '1603708211000'),
> 2022-02-02T17:44:12.1908034Z Feb 02 17:44:12                                                 (2, 'hello', '1603708224000'),
> 2022-02-02T17:44:12.1909166Z Feb 02 17:44:12                                                 (3, 'hi', '1603708226000'),
> 2022-02-02T17:44:12.1910122Z Feb 02 17:44:12                                                 (4, 'hello', '1603708289000'),
> 2022-02-02T17:44:12.1911099Z Feb 02 17:44:12                                                 (5, 'hi', '1603708291000'),
> 2022-02-02T17:44:12.1912451Z Feb 02 17:44:12                                                 (6, 'hello', '1603708293000')],
> 2022-02-02T17:44:12.1913456Z Feb 02 17:44:12                                                 type_info=Types.ROW([Types.INT(), Types.STRING(),
> 2022-02-02T17:44:12.1914338Z Feb 02 17:44:12                                                                      Types.STRING()]))
> 2022-02-02T17:44:12.1914811Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1915317Z Feb 02 17:44:12         class MyTimestampAssigner(TimestampAssigner):
> 2022-02-02T17:44:12.1915724Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1916782Z Feb 02 17:44:12             def extract_timestamp(self, value, record_timestamp) -> int:
> 2022-02-02T17:44:12.1917621Z Feb 02 17:44:12                 return int(value[2])
> 2022-02-02T17:44:12.1918262Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1918855Z Feb 02 17:44:12         class MyProcessFunction(KeyedProcessFunction):
> 2022-02-02T17:44:12.1919363Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1919744Z Feb 02 17:44:12             def __init__(self):
> 2022-02-02T17:44:12.1920143Z Feb 02 17:44:12                 self.value_state = None
> 2022-02-02T17:44:12.1920648Z Feb 02 17:44:12                 self.list_state = None
> 2022-02-02T17:44:12.1921298Z Feb 02 17:44:12                 self.map_state = None
> 2022-02-02T17:44:12.1921864Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1922479Z Feb 02 17:44:12             def open(self, runtime_context: RuntimeContext):
> 2022-02-02T17:44:12.1923907Z Feb 02 17:44:12                 value_state_descriptor = ValueStateDescriptor('value_state', Types.INT())
> 2022-02-02T17:44:12.1924922Z Feb 02 17:44:12                 self.value_state = runtime_context.get_state(value_state_descriptor)
> 2022-02-02T17:44:12.1925741Z Feb 02 17:44:12                 list_state_descriptor = ListStateDescriptor('list_state', Types.INT())
> 2022-02-02T17:44:12.1926482Z Feb 02 17:44:12                 self.list_state = runtime_context.get_list_state(list_state_descriptor)
> 2022-02-02T17:44:12.1927465Z Feb 02 17:44:12                 map_state_descriptor = MapStateDescriptor('map_state', Types.INT(), Types.STRING())
> 2022-02-02T17:44:12.1927998Z Feb 02 17:44:12                 state_ttl_config = StateTtlConfig \
> 2022-02-02T17:44:12.1928444Z Feb 02 17:44:12                     .new_builder(Time.seconds(1)) \
> 2022-02-02T17:44:12.1928943Z Feb 02 17:44:12                     .set_update_type(StateTtlConfig.UpdateType.OnReadAndWrite) \
> 2022-02-02T17:44:12.1929462Z Feb 02 17:44:12                     .set_state_visibility(
> 2022-02-02T17:44:12.1929939Z Feb 02 17:44:12                         StateTtlConfig.StateVisibility.ReturnExpiredIfNotCleanedUp) \
> 2022-02-02T17:44:12.1930601Z Feb 02 17:44:12                     .disable_cleanup_in_background() \
> 2022-02-02T17:44:12.1931032Z Feb 02 17:44:12                     .build()
> 2022-02-02T17:44:12.1931480Z Feb 02 17:44:12                 map_state_descriptor.enable_time_to_live(state_ttl_config)
> 2022-02-02T17:44:12.1932018Z Feb 02 17:44:12                 self.map_state = runtime_context.get_map_state(map_state_descriptor)
> 2022-02-02T17:44:12.1932610Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1933172Z Feb 02 17:44:12             def process_element(self, value, ctx):
> 2022-02-02T17:44:12.1933623Z Feb 02 17:44:12                 import time
> 2022-02-02T17:44:12.1934007Z Feb 02 17:44:12                 time.sleep(1)
> 2022-02-02T17:44:12.1934419Z Feb 02 17:44:12                 current_value = self.value_state.value()
> 2022-02-02T17:44:12.1934977Z Feb 02 17:44:12                 self.value_state.update(value[0])
> 2022-02-02T17:44:12.1935451Z Feb 02 17:44:12                 current_list = [_ for _ in self.list_state.get()]
> 2022-02-02T17:44:12.1935921Z Feb 02 17:44:12                 self.list_state.add(value[0])
> 2022-02-02T17:44:12.1936401Z Feb 02 17:44:12                 map_entries = {k: v for k, v in self.map_state.items()}
> 2022-02-02T17:44:12.1936862Z Feb 02 17:44:12                 keys = sorted(map_entries.keys())
> 2022-02-02T17:44:12.1937649Z Feb 02 17:44:12                 map_entries_string = [str(k) + ': ' + str(map_entries[k]) for k in keys]
> 2022-02-02T17:44:12.1938404Z Feb 02 17:44:12                 map_entries_string = '{' + ', '.join(map_entries_string) + '}'
> 2022-02-02T17:44:12.1938906Z Feb 02 17:44:12                 self.map_state.put(value[0], value[1])
> 2022-02-02T17:44:12.1939350Z Feb 02 17:44:12                 current_key = ctx.get_current_key()
> 2022-02-02T17:44:12.1939889Z Feb 02 17:44:12                 yield "current key: {}, current value state: {}, current list state: {}, " \
> 2022-02-02T17:44:12.1940521Z Feb 02 17:44:12                       "current map state: {}, current value: {}".format(str(current_key),
> 2022-02-02T17:44:12.1941111Z Feb 02 17:44:12                                                                         str(current_value),
> 2022-02-02T17:44:12.1941645Z Feb 02 17:44:12                                                                         str(current_list),
> 2022-02-02T17:44:12.1942254Z Feb 02 17:44:12                                                                         map_entries_string,
> 2022-02-02T17:44:12.1942796Z Feb 02 17:44:12                                                                         str(value))
> 2022-02-02T17:44:12.1943369Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1943761Z Feb 02 17:44:12             def on_timer(self, timestamp, ctx):
> 2022-02-02T17:44:12.1944178Z Feb 02 17:44:12                 pass
> 2022-02-02T17:44:12.1944503Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1944898Z Feb 02 17:44:12         watermark_strategy = WatermarkStrategy.for_monotonous_timestamps() \
> 2022-02-02T17:44:12.1945537Z Feb 02 17:44:12             .with_timestamp_assigner(MyTimestampAssigner())
> 2022-02-02T17:44:12.1946018Z Feb 02 17:44:12         data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
> 2022-02-02T17:44:12.1946525Z Feb 02 17:44:12             .key_by(lambda x: x[1], key_type=Types.STRING()) \
> 2022-02-02T17:44:12.1947019Z Feb 02 17:44:12             .process(MyProcessFunction(), output_type=Types.STRING()) \
> 2022-02-02T17:44:12.1947465Z Feb 02 17:44:12             .add_sink(self.test_sink)
> 2022-02-02T17:44:12.1948146Z Feb 02 17:44:12         self.env.execute('test time stamp assigner with keyed process function')
> 2022-02-02T17:44:12.1948637Z Feb 02 17:44:12         results = self.test_sink.get_results()
> 2022-02-02T17:44:12.1949166Z Feb 02 17:44:12         expected = ["current key: hi, current value state: None, current list state: [], "
> 2022-02-02T17:44:12.1949957Z Feb 02 17:44:12                     "current map state: {}, current value: Row(f0=1, f1='hi', "
> 2022-02-02T17:44:12.1950624Z Feb 02 17:44:12                     "f2='1603708211000')",
> 2022-02-02T17:44:12.1951234Z Feb 02 17:44:12                     "current key: hello, current value state: None, "
> 2022-02-02T17:44:12.1951822Z Feb 02 17:44:12                     "current list state: [], current map state: {}, current value: Row(f0=2,"
> 2022-02-02T17:44:12.1952596Z Feb 02 17:44:12                     " f1='hello', f2='1603708224000')",
> 2022-02-02T17:44:12.1953292Z Feb 02 17:44:12                     "current key: hi, current value state: 1, current list state: [1], "
> 2022-02-02T17:44:12.1954134Z Feb 02 17:44:12                     "current map state: {1: hi}, current value: Row(f0=3, f1='hi', "
> 2022-02-02T17:44:12.1954799Z Feb 02 17:44:12                     "f2='1603708226000')",
> 2022-02-02T17:44:12.1955331Z Feb 02 17:44:12                     "current key: hello, current value state: 2, current list state: [2], "
> 2022-02-02T17:44:12.1956145Z Feb 02 17:44:12                     "current map state: {2: hello}, current value: Row(f0=4, f1='hello', "
> 2022-02-02T17:44:12.1956826Z Feb 02 17:44:12                     "f2='1603708289000')",
> 2022-02-02T17:44:12.1957362Z Feb 02 17:44:12                     "current key: hi, current value state: 3, current list state: [1, 3], "
> 2022-02-02T17:44:12.1958156Z Feb 02 17:44:12                     "current map state: {1: hi, 3: hi}, current value: Row(f0=5, f1='hi', "
> 2022-02-02T17:44:12.1958845Z Feb 02 17:44:12                     "f2='1603708291000')",
> 2022-02-02T17:44:12.1959382Z Feb 02 17:44:12                     "current key: hello, current value state: 4, current list state: [2, 4],"
> 2022-02-02T17:44:12.1960011Z Feb 02 17:44:12                     " current map state: {2: hello, 4: hello}, current value: Row(f0=6, "
> 2022-02-02T17:44:12.1960715Z Feb 02 17:44:12                     "f1='hello', f2='1603708293000')"]
> 2022-02-02T17:44:12.1961159Z Feb 02 17:44:12 >       self.assert_equals_sorted(expected, results)
> 2022-02-02T17:44:12.1961533Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1961906Z Feb 02 17:44:12 pyflink/datastream/tests/test_data_stream.py:683: 
> 2022-02-02T17:44:12.1962464Z Feb 02 17:44:12 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
> 2022-02-02T17:44:12.1963186Z Feb 02 17:44:12 pyflink/datastream/tests/test_data_stream.py:62: in assert_equals_sorted
> 2022-02-02T17:44:12.1963670Z Feb 02 17:44:12     self.assertEqual(expected, actual)
> 2022-02-02T17:44:12.1964685Z Feb 02 17:44:12 E   AssertionError: Lists differ: ["cur[719 chars]te: {1: hi, 3: hi}, current value: Row(f0=5, f[172 chars]0')"] != ["cur[719 chars]te: {3: hi}, current value: Row(f0=5, f1='hi',[165 chars]0')"]
> 2022-02-02T17:44:12.1965369Z Feb 02 17:44:12 E   
> 2022-02-02T17:44:12.1965731Z Feb 02 17:44:12 E   First differing element 4:
> 2022-02-02T17:44:12.1966428Z Feb 02 17:44:12 E   "curr[80 chars]te: {1: hi, 3: hi}, current value: Row(f0=5, f[23 chars]00')"
> 2022-02-02T17:44:12.1967192Z Feb 02 17:44:12 E   "curr[80 chars]te: {3: hi}, current value: Row(f0=5, f1='hi',[16 chars]00')"
> 2022-02-02T17:44:12.1967860Z Feb 02 17:44:12 E   
> 2022-02-02T17:44:12.1968268Z Feb 02 17:44:12 E   Diff is 1211 characters long. Set self.maxDiff to None to see it.
> 2022-02-02T17:44:12.1968783Z Feb 02 17:44:12 =============================== warnings summary ===============================
> 2022-02-02T17:44:12.1969374Z Feb 02 17:44:12 pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_add_classpaths
> 2022-02-02T17:44:12.1970541Z Feb 02 17:44:12   /__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/future/standard_library/__init__.py:65: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
> 2022-02-02T17:44:12.1971219Z Feb 02 17:44:12     import imp
> 2022-02-02T17:44:12.1971530Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1972027Z Feb 02 17:44:12 pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_add_classpaths
> 2022-02-02T17:44:12.1973535Z Feb 02 17:44:12   /__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/apache_beam/typehints/typehints.py:693: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.10 it will stop working
> 2022-02-02T17:44:12.1974526Z Feb 02 17:44:12     if not isinstance(type_params, collections.Iterable):
> 2022-02-02T17:44:12.1974930Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1975407Z Feb 02 17:44:12 pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_add_classpaths
> 2022-02-02T17:44:12.1976680Z Feb 02 17:44:12   /__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/apache_beam/typehints/typehints.py:532: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.10 it will stop working
> 2022-02-02T17:44:12.1977509Z Feb 02 17:44:12     if not isinstance(type_params, (collections.Sequence, set)):
> 2022-02-02T17:44:12.1977939Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1978432Z Feb 02 17:44:12 pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_add_python_archive
> 2022-02-02T17:44:12.1979475Z Feb 02 17:44:12   /__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/_pytest/threadexception.py:75: PytestUnhandledThreadExceptionWarning: Exception in thread read_grpc_client_inputs
> 2022-02-02T17:44:12.1980064Z Feb 02 17:44:12   
> 2022-02-02T17:44:12.1980434Z Feb 02 17:44:12   Traceback (most recent call last):
> 2022-02-02T17:44:12.1981152Z Feb 02 17:44:12     File "/__w/1/s/flink-python/dev/.conda/envs/3.8/lib/python3.8/threading.py", line 932, in _bootstrap_inner
> 2022-02-02T17:44:12.1981642Z Feb 02 17:44:12       self.run()
> 2022-02-02T17:44:12.1982368Z Feb 02 17:44:12     File "/__w/1/s/flink-python/dev/.conda/envs/3.8/lib/python3.8/threading.py", line 870, in run
> 2022-02-02T17:44:12.1982884Z Feb 02 17:44:12       self._target(*self._args, **self._kwargs)
> 2022-02-02T17:44:12.1983865Z Feb 02 17:44:12     File "/__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py", line 598, in <lambda>
> 2022-02-02T17:44:12.1984471Z Feb 02 17:44:12       target=lambda: self._read_inputs(elements_iterator),
> 2022-02-02T17:44:12.1985299Z Feb 02 17:44:12     File "/__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py", line 581, in _read_inputs
> 2022-02-02T17:44:12.1985881Z Feb 02 17:44:12       for elements in elements_iterator:
> 2022-02-02T17:44:12.1986760Z Feb 02 17:44:12     File "/__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/grpc/_channel.py", line 426, in __next__
> 2022-02-02T17:44:12.1987262Z Feb 02 17:44:12       return self._next()
> 2022-02-02T17:44:12.1987946Z Feb 02 17:44:12     File "/__w/1/s/flink-python/.tox/py38/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _next
> 2022-02-02T17:44:12.1988425Z Feb 02 17:44:12       raise self
> 2022-02-02T17:44:12.1989023Z Feb 02 17:44:12   grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
> 2022-02-02T17:44:12.1990067Z Feb 02 17:44:12          status = StatusCode.CANCELLED
> 2022-02-02T17:44:12.1990653Z Feb 02 17:44:12          details = "Multiplexer hanging up"
> 2022-02-02T17:44:12.1991849Z Feb 02 17:44:12          debug_error_string = "{"created":"@1643823819.576493566","description":"Error received from peer ipv4:127.0.0.1:33091","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
> 2022-02-02T17:44:12.1993432Z Feb 02 17:44:12   >
> 2022-02-02T17:44:12.1993889Z Feb 02 17:44:12   
> 2022-02-02T17:44:12.1994521Z Feb 02 17:44:12     warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
> 2022-02-02T17:44:12.1995279Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1996037Z Feb 02 17:44:12 pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_add_python_file
> 2022-02-02T17:44:12.1997435Z Feb 02 17:44:12   /__w/1/s/flink-python/pyflink/table/table_environment.py:1997: DeprecationWarning: Deprecated in 1.12. Use from_data_stream(DataStream, *Expression) instead.
> 2022-02-02T17:44:12.1998269Z Feb 02 17:44:12     warnings.warn(
> 2022-02-02T17:44:12.1998594Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.1999075Z Feb 02 17:44:12 pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_execute
> 2022-02-02T17:44:12.2000008Z Feb 02 17:44:12   /__w/1/s/flink-python/pyflink/table/table_environment.py:538: DeprecationWarning: Deprecated in 1.10. Use create_table instead.
> 2022-02-02T17:44:12.2000823Z Feb 02 17:44:12     warnings.warn("Deprecated in 1.10. Use create_table instead.", DeprecationWarning)
> 2022-02-02T17:44:12.2001522Z Feb 02 17:44:12 
> 2022-02-02T17:44:12.2002614Z Feb 02 17:44:12 -- Docs: https://docs.pytest.org/en/stable/warnings.html
> 2022-02-02T17:44:12.2003603Z Feb 02 17:44:12 ============================= slowest 20 durations =============================
> 2022-02-02T17:44:12.2004618Z Feb 02 17:44:12 10.16s call     pyflink/datastream/tests/test_connectors.py::ConnectorTests::test_stream_file_sink
> 2022-02-02T17:44:12.2005726Z Feb 02 17:44:12 9.83s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_keyed_process_function_with_state
> 2022-02-02T17:44:12.2006511Z Feb 02 17:44:12 8.79s call     pyflink/datastream/tests/test_data_stream.py::StreamingModeDataStreamTests::test_keyed_process_function_with_state
> 2022-02-02T17:44:12.2007232Z Feb 02 17:44:12 6.78s call     pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_add_python_file
> 2022-02-02T17:44:12.2007961Z Feb 02 17:44:12 5.52s call     pyflink/datastream/tests/test_data_stream.py::StreamingModeDataStreamTests::test_execute_and_collect
> 2022-02-02T17:44:12.2009001Z Feb 02 17:44:12 5.44s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_execute_and_collect
> 2022-02-02T17:44:12.2010033Z Feb 02 17:44:12 5.26s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_basic_co_operations_with_output_type
> 2022-02-02T17:44:12.2011152Z Feb 02 17:44:12 5.25s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_basic_co_operations
> 2022-02-02T17:44:12.2012377Z Feb 02 17:44:12 4.53s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_keyed_co_process
> 2022-02-02T17:44:12.2013701Z Feb 02 17:44:12 4.35s call     pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_set_requirements_with_cached_directory
> 2022-02-02T17:44:12.2014884Z Feb 02 17:44:12 4.32s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_reduce_with_state
> 2022-02-02T17:44:12.2015900Z Feb 02 17:44:12 4.26s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_keyed_flat_map
> 2022-02-02T17:44:12.2016970Z Feb 02 17:44:12 4.21s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_keyed_co_map
> 2022-02-02T17:44:12.2018270Z Feb 02 17:44:12 4.06s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_keyed_filter
> 2022-02-02T17:44:12.2019463Z Feb 02 17:44:12 3.90s call     pyflink/datastream/tests/test_stream_execution_environment.py::StreamExecutionEnvironmentTests::test_set_requirements_without_cached_directory
> 2022-02-02T17:44:12.2020715Z Feb 02 17:44:12 3.90s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_aggregating_state
> 2022-02-02T17:44:12.2021749Z Feb 02 17:44:12 3.87s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_keyed_co_flat_map
> 2022-02-02T17:44:12.2022862Z Feb 02 17:44:12 3.84s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_multi_key_by
> 2022-02-02T17:44:12.2024078Z Feb 02 17:44:12 3.83s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_time_window
> 2022-02-02T17:44:12.2024912Z Feb 02 17:44:12 3.83s call     pyflink/datastream/tests/test_data_stream.py::BatchModeDataStreamTests::test_count_window
> 2022-02-02T17:44:12.2025750Z Feb 02 17:44:12 =========================== short test summary info ============================
> 2022-02-02T17:44:12.2026370Z Feb 02 17:44:12 FAILED pyflink/datastream/tests/test_data_stream.py::StreamingModeDataStreamTests::test_keyed_process_function_with_state
> 2022-02-02T17:44:12.2027008Z Feb 02 17:44:12 ======= 1 failed, 154 passed, 1 skipped, 6 warnings in 235.76s (0:03:55) =======
> 2022-02-02T17:44:12.5428501Z Feb 02 17:44:12 test module /__w/1/s/flink-python/pyflink/datastream failed
> 2022-02-02T17:44:12.5431151Z Feb 02 17:44:12 ERROR: InvocationError for command /bin/bash ./dev/integration_test.sh (exited with code 1)
> 2022-02-02T17:44:12.5432097Z Feb 02 17:44:12 py38 finish: run-test  after 999.77 seconds
> 2022-02-02T17:44:12.5436171Z Feb 02 17:44:12 py38 start: run-test-post
> 2022-02-02T17:44:12.5437071Z Feb 02 17:44:12 py38 finish: run-test-post  after 0.00 seconds
> 2022-02-02T17:44:12.5438162Z Feb 02 17:44:12 ___________________________________ summary ____________________________________
> 2022-02-02T17:44:12.5453873Z Feb 02 17:44:12 ERROR:   py38: commands failed
> 2022-02-02T17:44:12.5455066Z Feb 02 17:44:12 cleanup /__w/1/s/flink-python/.tox/.tmp/package/1/apache-flink-1.15.dev0.zip
> 2022-02-02T17:44:12.6013749Z Feb 02 17:44:12 ============tox checks... [FAILED]============
> {code}
> https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=30642&view=logs&j=9cada3cb-c1d3-5621-16da-0f718fb86602&t=c67e71ed-6451-5d26-8920-5a8cf9651901&l=24759
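>
> From the assertion diff, the instability looks timing-related: {{map_state}} is configured with a 1-second TTL ({{StateTtlConfig.new_builder(Time.seconds(1))}}, visibility {{ReturnExpiredIfNotCleanedUp}}), while {{process_element}} sleeps 1 second per record. Whether the older entry (e.g. {{1: hi}}) is still visible when the next record of the same key is processed therefore depends on whether the expired entry has been cleaned up yet, which explains {{{1: hi, 3: hi}}} vs. {{{3: hi}}}. This is only a hypothesis based on the log above, not a confirmed root cause or the committed fix. A minimal sketch of one way to make the assertion deterministic, assuming the test does not specifically need a sub-second TTL, would be to use a TTL that comfortably exceeds the total processing time:
> {code}
> # Hypothetical adjustment (not the committed fix): enlarge the TTL so that entries
> # written for earlier records cannot expire while later records of the same key
> # are still being processed (each record sleeps ~1s in process_element).
> from pyflink.common.time import Time
> from pyflink.common.typeinfo import Types
> from pyflink.datastream.state import MapStateDescriptor, StateTtlConfig
>
> map_state_descriptor = MapStateDescriptor('map_state', Types.INT(), Types.STRING())
> state_ttl_config = StateTtlConfig \
>     .new_builder(Time.seconds(600)) \
>     .set_update_type(StateTtlConfig.UpdateType.OnReadAndWrite) \
>     .set_state_visibility(StateTtlConfig.StateVisibility.ReturnExpiredIfNotCleanedUp) \
>     .disable_cleanup_in_background() \
>     .build()
> map_state_descriptor.enable_time_to_live(state_ttl_config)
> {code}
> Alternatively, the expected output could avoid asserting on map entries whose visibility depends on TTL cleanup timing.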



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
