See <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/5208/display/redirect?page=changes>
Changes:

[noreply] [BEAM-13736] Make lifting cache exact. (#16603)

[noreply] Merge pull request #16565 from [BEAM-13692][Playground] Implement

[noreply] Merge pull request #16502 from [BEAM-13650][Playground] Add link for


------------------------------------------
[...truncated 1.11 MB...]

apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), # pylint: disable=anomalous-backslash-in-string

target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py:42
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-5/py38-pyarrow-5/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

apache_beam/io/parquetio_test.py:419
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:419: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
    for batch in orig.to_batches(chunksize=20)

apache_beam/io/parquetio_test.py:380
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:380: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
    pa.Table.from_batches([batch]) for batch in self._records_as_arrow(

apache_beam/io/parquetio_test.py:370
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:370: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
    pa.Table.from_batches([batch]) for batch in self._records_as_arrow(

apache_beam/dataframe/io.py:629
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/dataframe/io.py>:629: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release.
  References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/pytest_py38-pyarrow-5.xml> -
============== 29 passed, 1 skipped, 20 warnings in 35.73 seconds ==============
py38-pyarrow-5 run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  py38-pyarrow-5: commands succeeded
  congratulations :)

> Task :sdks:python:test-suites:tox:py38:testPy38pyarrow-6
GLOB sdist-make: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/setup.py>
py38-pyarrow-6 create: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-6/py38-pyarrow-6>
py38-pyarrow-6 installdeps: pyarrow>=6,<7
py38-pyarrow-6 inst: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-6/.tmp/package/1/apache-beam-2.37.0.dev0.zip>
py38-pyarrow-6 installed: apache-beam @ file://<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-6/.tmp/package/1/apache-beam-2.37.0.dev0.zip,atomicwrites==1.4.0,attrs==21.4.0,certifi==2021.10.8,charset-normalizer==2.0.10,cloudpickle==2.0.0,crcmod==1.7,deprecation==2.1.0,dill==0.3.1.1,docker==5.0.3,docopt==0.6.2,execnet==1.9.0,fastavro==1.4.9,freezegun==1.1.0,greenlet==1.1.2,grpcio==1.43.0,hdfs==2.6.0,httplib2==0.19.1,idna==3.3,mock==2.0.0,more-itertools==8.12.0,numpy==1.21.5,oauth2client==4.1.3,orjson==3.6.6,packaging==21.3,pandas==1.3.5,parameterized==0.7.5,pbr==5.8.0,pluggy==0.13.1,proto-plus==1.19.8,protobuf==3.19.3,psycopg2-binary==2.9.3,py==1.11.0,pyarrow==6.0.1,pyasn1==0.4.8,pyasn1-modules==0.2.8,pydot==1.4.2,PyHamcrest==1.10.1,pymongo==3.12.3,pyparsing==2.4.7,pytest==4.6.11,pytest-forked==1.4.0,pytest-timeout==1.4.2,pytest-xdist==1.34.0,python-dateutil==2.8.2,pytz==2021.3,PyYAML==6.0,requests==2.27.1,requests-mock==1.9.3,rsa==4.8,six==1.16.0,SQLAlchemy==1.4.31,tenacity==5.1.5,testcontainers==3.4.2,typing_extensions==4.0.1,urllib3==1.26.8,wcwidth==0.2.5,websocket-client==1.2.3,wrapt==1.13.3>
py38-pyarrow-6 run-test-pre: PYTHONHASHSEED='3653091649'
py38-pyarrow-6 run-test-pre: commands[0] | python --version
Python 3.8.9
py38-pyarrow-6 run-test-pre: commands[1] | pip --version
pip 21.3.1 from <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/pip> (python 3.8)
py38-pyarrow-6 run-test-pre: commands[2] | pip check
No broken requirements found.
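The recurring FutureWarning in the summaries above comes from pyarrow renaming the batching keyword of Table.to_batches: the old spelling chunksize has been deprecated since pyarrow 0.15 in favour of max_chunksize, with the same batching behaviour. A minimal sketch of the rename, using a small throwaway table (the table contents and variable names are illustrative and not taken from parquetio_test.py):

    import pyarrow as pa

    table = pa.table({'n': list(range(100))})

    # Deprecated spelling, triggers the FutureWarning seen in this run:
    #   batches = table.to_batches(chunksize=20)

    # Current spelling; each RecordBatch holds at most 20 rows.
    batches = table.to_batches(max_chunksize=20)
    print([b.num_rows for b in batches])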
py38-pyarrow-6 run-test-pre: commands[3] | bash <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
py38-pyarrow-6 run-test: commands[0] | /bin/sh -c 'pip freeze | grep -E '"'"'(pyarrow|numpy)'"'"''
apache-beam @ file://<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-6/.tmp/package/1/apache-beam-2.37.0.dev0.zip>
numpy==1.21.5
pyarrow==6.0.1
py38-pyarrow-6 run-test: commands[1] | pytest -o junit_suite_name=py38-pyarrow-6 --junitxml=pytest_py38-pyarrow-6.xml -n 6 -m uses_pyarrow
============================= test session starts ==============================
platform linux -- Python 3.8.9, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
cachedir: target/.tox-py38-pyarrow-6/py38-pyarrow-6/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I
gw0 [29] / gw1 [29] / gw2 [29] / gw3 [29] / gw4 [29] / gw5 [29]
.............................                                           [100%]
=============================== warnings summary ===============================
target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py:42
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-6/py38-pyarrow-6/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), # pylint: disable=anomalous-backslash-in-string

apache_beam/io/parquetio_test.py:419
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:419: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
    for batch in orig.to_batches(chunksize=20)

apache_beam/io/parquetio_test.py:380
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:380: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
    pa.Table.from_batches([batch]) for batch in self._records_as_arrow(

apache_beam/io/parquetio_test.py:370
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:370: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
    pa.Table.from_batches([batch]) for batch in self._records_as_arrow(

apache_beam/dataframe/io.py:629
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/dataframe/io.py>:629: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/pytest_py38-pyarrow-6.xml> -
============== 29 passed, 1 skipped, 20 warnings in 34.72 seconds ==============
py38-pyarrow-6 run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  py38-pyarrow-6: commands succeeded
  congratulations :)

> Task :sdks:python:test-suites:tox:py38:preCommitPy38

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
F
=================================== FAILURES ===================================
_______________ StreamingWordCountIT.test_streaming_wordcount_it _______________
[gw0] linux -- Python 3.7.10 <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/build/gradleenv/-1734967052/bin/python3.7>

self = <apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT testMethod=test_streaming_wordcount_it>

    @pytest.mark.it_postcommit
    def test_streaming_wordcount_it(self):
      # Build expected dataset.
      expected_msg = [('%d: 1' % num).encode('utf-8')
                      for num in range(DEFAULT_INPUT_NUMBERS)]

      # Set extra options to the pipeline for test purpose
      state_verifier = PipelineStateMatcher(PipelineState.RUNNING)
      pubsub_msg_verifier = PubSubMessageMatcher(
          self.project, self.output_sub.name, expected_msg, timeout=400)
      extra_opts = {
          'input_subscription': self.input_sub.name,
          'output_topic': self.output_topic.name,
          'wait_until_finish_duration': WAIT_UNTIL_FINISH_DURATION,
          'on_success_matcher': all_of(state_verifier, pubsub_msg_verifier)
      }

      # Generate input data and inject to PubSub.
      self._inject_numbers(self.input_topic, DEFAULT_INPUT_NUMBERS)

      # Get pipeline options from command argument: --test-pipeline-options,
      # and start pipeline job by calling pipeline main function.
      streaming_wordcount.run(
          self.test_pipeline.get_full_options_as_args(**extra_opts),
>         save_main_session=False)

apache_beam/examples/streaming_wordcount_it_test.py:114:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/streaming_wordcount.py:103: in run
    output | beam.io.WriteToPubSub(known_args.output_topic)
apache_beam/pipeline.py:596: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <apache_beam.runners.dataflow.test_dataflow_runner.TestDataflowRunner object at 0x7f7e7c2c4d50>
pipeline = <apache_beam.pipeline.Pipeline object at 0x7f7eb7450d90>
options = <apache_beam.options.pipeline_options.PipelineOptions object at 0x7f7e7d6dabd0>

    def run_pipeline(self, pipeline, options):
      """Execute test pipeline and verify test matcher"""
      test_options = options.view_as(TestOptions)
      on_success_matcher = test_options.on_success_matcher
      wait_duration = test_options.wait_until_finish_duration
      is_streaming = options.view_as(StandardOptions).streaming

      # [BEAM-1889] Do not send this to remote workers also, there is no need to
      # send this option to remote executors.
      test_options.on_success_matcher = None

      self.result = super().run_pipeline(pipeline, options)
      if self.result.has_job:
        # TODO(markflyhigh)(BEAM-1890): Use print since Nose dosen't show logs
        # in some cases.
        print('Worker logs: %s' % self.build_console_url(options))

      try:
        self.wait_until_in_state(PipelineState.RUNNING)

        if is_streaming and not wait_duration:
          _LOGGER.warning('Waiting indefinitely for streaming job.')
        self.result.wait_until_finish(duration=wait_duration)

        if on_success_matcher:
          from hamcrest import assert_that as hc_assert_that
>         hc_assert_that(self.result, pickler.loads(on_success_matcher))
E         AssertionError:
E         Expected: (Test pipeline expected terminated in state: RUNNING and Expected 500 messages.)
E              but: Expected 500 messages. Got 503 messages. Diffs (item, count):
E         Expected but not in actual: dict_items([(b'3: 1', 1), (b'11: 1', 1), (b'31: 1', 1), (b'47: 1', 1), (b'81: 1', 1), (b'89: 1', 1), (b'94: 1', 1), (b'96: 1', 1), (b'130: 1', 1), (b'134: 1', 1), (b'174: 1', 1), (b'189: 1', 1), (b'190: 1', 1), (b'198: 1', 1), (b'209: 1', 1), (b'240: 1', 1), (b'266: 1', 1), (b'279: 1', 1), (b'313: 1', 1), (b'314: 1', 1), (b'392: 1', 1), (b'396: 1', 1), (b'397: 1', 1), (b'486: 1', 1)])
E         Unexpected: dict_items([(b'280: 1', 1), (b'119: 1', 1), (b'496: 1', 1), (b'121: 1', 1), (b'109: 1', 1), (b'84: 1', 1), (b'215: 1', 1), (b'276: 1', 1), (b'302: 1', 1), (b'115: 1', 1), (b'114: 1', 1), (b'255: 1', 1), (b'80: 1', 1), (b'117: 1', 1), (b'307: 1', 1), (b'10: 1', 1), (b'370: 1', 1), (b'44: 1', 1), (b'86: 1', 1), (b'350: 1', 1), (b'447: 1', 1), (b'327: 1', 1), (b'120: 1', 1), (b'487: 1', 1), (b'33: 1', 1), (b'470: 1', 1), (b'5: 1', 1)])

apache_beam/runners/dataflow/test_dataflow_runner.py:68: AssertionError
------------------------------ Captured log call -------------------------------
WARNING  root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20']
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:250 Timing out on waiting for job 2022-01-25_10_45_11-6997691572610778568 after 364 seconds
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py37.xml> -
========================== 1 failed in 725.19 seconds ==========================

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2 FAILED

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
.
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py36.xml> -
========================== 1 passed in 754.35 seconds ==========================

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 51m 2s

120 actionable tasks: 85 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5nb4ze5zqnxuk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
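The streaming_wordcount failure above is a count mismatch reported by PubSubMessageMatcher, which, judging by the "Diffs (item, count)" output, compares per-message counts rather than raw totals: 500 distinct messages were expected, 503 were pulled from the output subscription before the 400-second timeout, 24 expected items were never seen, and 27 items were seen once more than expected. Together with the "Timing out on waiting for job ... after 364 seconds" warning, this looks consistent with Pub/Sub redelivery plus the job being cut off early, rather than with wrong output values. A rough, self-contained sketch of how such a diff can be computed; the received list below is fabricated purely for illustration, and only the expected list mirrors the test's own construction:

    from collections import Counter

    DEFAULT_INPUT_NUMBERS = 500
    expected = [('%d: 1' % n).encode('utf-8') for n in range(DEFAULT_INPUT_NUMBERS)]

    # Stand-in for messages pulled from the output subscription: a few
    # expected items missing, a few others delivered twice.
    received = [m for m in expected if m not in (b'3: 1', b'11: 1', b'31: 1')]
    received += [b'280: 1', b'119: 1', b'496: 1']

    missing = Counter(expected) - Counter(received)     # "Expected but not in actual"
    unexpected = Counter(received) - Counter(expected)  # "Unexpected"
    print(len(received), sorted(missing.items()), sorted(unexpected.items()))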
