See <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/6402/display/redirect?page=changes>
Changes:

[noreply] Add test configurations for deterministic outputs on Dataflow (#24325)

[noreply] Updates ExpansionService to support dynamically discovering and

------------------------------------------
[...truncated 2.45 MB...]
================= 42 passed, 5 skipped, 37 warnings in 14.61s ==================
py38-pytorch-19 run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  py38-pytorch-19: commands succeeded
  congratulations :)

> Task :sdks:python:test-suites:tox:py38:preCommitPy38

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
[gw0] PASSED apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it

=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py37.xml> -
================== 1 passed, 3 warnings in 803.04s (0:13:23) ===================

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_streaming_V2
[gw0] FAILED
apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
=================================== FAILURES ===================================
_______________ StreamingWordCountIT.test_streaming_wordcount_it _______________
[gw0] linux -- Python 3.10.2 <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/build/gradleenv/2050596098/bin/python3.10>

self = <apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT testMethod=test_streaming_wordcount_it>

    @pytest.mark.it_postcommit
    def test_streaming_wordcount_it(self):
      # Build expected dataset.
      expected_msg = [('%d: 1' % num).encode('utf-8')
                      for num in range(DEFAULT_INPUT_NUMBERS)]

      # Set extra options to the pipeline for test purpose
      state_verifier = PipelineStateMatcher(PipelineState.RUNNING)
      pubsub_msg_verifier = PubSubMessageMatcher(
          self.project, self.output_sub.name, expected_msg, timeout=400)
      extra_opts = {
          'input_subscription': self.input_sub.name,
          'output_topic': self.output_topic.name,
          'wait_until_finish_duration': WAIT_UNTIL_FINISH_DURATION,
          'on_success_matcher': all_of(state_verifier, pubsub_msg_verifier)
      }

      # Generate input data and inject to PubSub.
      self._inject_numbers(self.input_topic, DEFAULT_INPUT_NUMBERS)

      # Get pipeline options from command argument: --test-pipeline-options,
      # and start pipeline job by calling pipeline main function.
>     streaming_wordcount.run(
          self.test_pipeline.get_full_options_as_args(**extra_opts),
          save_main_session=False)

apache_beam/examples/streaming_wordcount_it_test.py:118:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/streaming_wordcount.py:61: in run
    with beam.Pipeline(options=pipeline_options) as p:
apache_beam/pipeline.py:600: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:577: in run
    return self.runner.run_pipeline(self, self._options)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <apache_beam.runners.dataflow.test_dataflow_runner.TestDataflowRunner object at 0x7f9a7054f2e0>
pipeline = <apache_beam.pipeline.Pipeline object at 0x7f9a7054e440>
options = <apache_beam.options.pipeline_options.PipelineOptions object at 0x7f9a7054e650>

    def run_pipeline(self, pipeline, options):
      """Execute test pipeline and verify test matcher"""
      test_options = options.view_as(TestOptions)
      on_success_matcher = test_options.on_success_matcher
      wait_duration = test_options.wait_until_finish_duration
      is_streaming = options.view_as(StandardOptions).streaming

      # [BEAM-1889] Do not send this to remote workers also, there is no need to
      # send this option to remote executors.
      test_options.on_success_matcher = None

      self.result = super().run_pipeline(pipeline, options)
      if self.result.has_job:
        # TODO(markflyhigh)(https://github.com/apache/beam/issues/18254): Use
        # print since Nose dosen't show logs in some cases.
        print('Worker logs: %s' % self.build_console_url(options))
        _LOGGER.info('Console log: ')
        _LOGGER.info(self.build_console_url(options))

      try:
        self.wait_until_in_state(PipelineState.RUNNING)

        if is_streaming and not wait_duration:
          _LOGGER.warning('Waiting indefinitely for streaming job.')
        self.result.wait_until_finish(duration=wait_duration)

        if on_success_matcher:
          from hamcrest import assert_that as hc_assert_that
>         hc_assert_that(self.result, pickler.loads(on_success_matcher))
E         AssertionError:
E         Expected: (Test pipeline expected terminated in state: RUNNING and Expected 500 messages.)
E              but: Expected 500 messages. Got 514 messages. Diffs (item, count):
E         Expected but not in actual: dict_items([(b'218: 1', 1), (b'230: 1', 1), (b'260: 1', 1), (b'264: 1', 1), (b'269: 1', 1), (b'281: 1', 1), (b'294: 1', 1), (b'400: 1', 1), (b'441: 1', 1)])
E         Unexpected: dict_items([(b'132: 1', 1), (b'476: 1', 1), (b'233: 1', 1), (b'486: 1', 1), (b'434: 1', 1), (b'277: 1', 1), (b'212: 1', 1), (b'472: 1', 1), (b'467: 1', 1), (b'251: 1', 1), (b'475: 1', 1), (b'48: 1', 1), (b'448: 1', 1), (b'271: 1', 1), (b'101: 1', 1), (b'279: 1', 1), (b'485: 1', 1), (b'420: 1', 1), (b'198: 1', 1), (b'216: 1', 1), (b'249: 1', 1), (b'478: 1', 1), (b'422: 1', 1)])
E         Unexpected (with all details): [(b'132: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 24, 651000, tzinfo=datetime.timezone.utc), ''), (b'476: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'233: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'486: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'132: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 330000, tzinfo=datetime.timezone.utc), ''), (b'434: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'277: 1', {}, {},
DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'212: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'472: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'467: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'472: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'251: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'475: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'48: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'448: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'271: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'212: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'467: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'101: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'279: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'485: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'420: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'198: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'476: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, 
tzinfo=datetime.timezone.utc), ''), (b'233: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'434: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'277: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'216: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'249: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'478: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'422: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'486: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 23, 969000, tzinfo=datetime.timezone.utc), ''), (b'251: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'475: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'48: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'448: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'271: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'198: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'101: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'279: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'485: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'420: 1', {}, {}, 
DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'216: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'249: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'478: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), ''), (b'422: 1', {}, {}, DatetimeWithNanoseconds(2022, 11, 24, 2, 16, 21, 322000, tzinfo=datetime.timezone.utc), '')]

apache_beam/runners/dataflow/test_dataflow_runner.py:70: AssertionError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:780 Executing command: ['<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/build/gradleenv/2050596098/bin/python3.10',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpejv1lyas/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp310', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:484 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations.
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.10_sdk:2.44.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20221122
INFO     root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20221122" for Docker environment
INFO     apache_beam.internal.gcp.auth:auth.py:130 Setting socket default timeout to 60 seconds.
INFO     apache_beam.internal.gcp.auth:auth.py:132 socket default timeout is 60.0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/seaborn-0.12.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/seaborn-0.12.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/matplotlib-3.6.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/matplotlib-3.6.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/dataflow-worker.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/dataflow-worker.jar in 5 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1124020658-132746-vfw193y3.1669255618.133029/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job
 clientRequestId: '20221124020658134574-2410'
 createTime: '2022-11-24T02:07:06.876277Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-11-23_18_07_06-10773682750582623979'
 location: 'us-central1'
 name: 'beamapp-jenkins-1124020658-132746-vfw193y3'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-11-24T02:07:06.876277Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-11-23_18_07_06-10773682750582623979]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-11-23_18_07_06-10773682750582623979
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:915 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-23_18_07_06-10773682750582623979?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: 
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-23_18_07_06-10773682750582623979?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-23_18_07_06-10773682750582623979 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:07.604Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:07.715Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-11-23_18_07_06-10773682750582623979. The number of workers will be between 1 and 100.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:07.750Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-11-23_18_07_06-10773682750582623979.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:11.866Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.344Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.384Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.464Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.513Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.552Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.583Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.660Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.736Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.795Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.839Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.875Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.907Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.941Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.966Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:14.988Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.012Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.047Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.091Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.124Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into encode
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.155Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write into WriteToPubSub/ToProtobuf
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.206Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.243Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.280Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.313Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:15.347Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:16.442Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:16.486Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:16.535Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:07:42.202Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors.
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:08:00.075Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:08:37.570Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-24T02:08:51.442Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
WARNING  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:254 Timing out on waiting for job 2022-11-23_18_07_06-10773682750582623979 after 360 seconds

=============================== warnings summary ===============================
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    from imp import load_source

apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/build/gradleenv/2050596098/lib/python3.10/site-packages/httplib2/__init__.py>:147: DeprecationWarning: ssl.PROTOCOL_TLS is deprecated
    context = ssl.SSLContext(DEFAULT_TLS_VERSION)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py310.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it - AssertionError: Expected: (Test pipeline expected terminated in state: RUNNING and Expected 500 messages.) but: Expected 500 messages. Got 514 messages.
================== 1 failed, 5 warnings in 779.99s (0:12:59) ===================

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_streaming_V2 FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 81

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py310:preCommitIT_streaming_V2'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 21s

129 actionable tasks: 119 executed, 8 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lflgkourfvzbq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
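Triage note: the assertion diff above shows several "Unexpected" payloads arriving twice with different publish timestamps (e.g. b'132: 1' at 02:16:21 and 02:16:24), which is consistent with Pub/Sub's at-least-once delivery semantics inflating the count from 500 to 514 before the matcher timed out. The following is a minimal, hedged sketch in plain Python (not Beam or PubSubMessageMatcher code; the 500-message count mirrors DEFAULT_INPUT_NUMBERS in the test, and the duplicated payload is purely illustrative) of the multiset diff the matcher reports:

```python
from collections import Counter

# Expected dataset, built the same way the test builds expected_msg.
expected = Counter(('%d: 1' % n).encode('utf-8') for n in range(500))

# Hypothetical received multiset: everything expected, plus one payload
# redelivered a second time (at-least-once delivery permits this).
received = Counter(expected)
received[b'132: 1'] += 1

# Counter subtraction keeps only positive counts, giving the two diff
# sections seen in the failure output.
missing = expected - received      # "Expected but not in actual"
unexpected = received - expected   # "Unexpected"

assert sum(received.values()) == 501            # Got 501, expected 500
assert missing == Counter()                     # nothing actually lost
assert unexpected == Counter({b'132: 1': 1})    # one duplicate payload
```

Under this reading the extra messages are duplicates rather than wrong output, so deduplicating on payload (or message id) before comparing counts would make the check robust to redelivery.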
