See <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/148/display/redirect?page=changes>
Changes:

[chamikaramj] Update PythonMap transform to accept extra packages
[chamikaramj] Update the test
[chamikaramj] Address reviewer comments
[chamikaramj] Copy environment capabilities when creating the WorkerPool for Java
[noreply] Remove ValueProvider from BigtableIO ReadChangeStream (#25409)
[noreply] Annotate Cloud Bigtable implementation details as Internal (#25403)
[noreply] Add dependencies in some examples (#25425)
[noreply] Add batching args to ModelHandlers docs (#25398)
[noreply] Data sampling proto (#25421)
[noreply] Support ONNX runtime in RunInference API (#24911)
[noreply] Fix UpdateSchemaDestination breaking DynamicDestination in Bigquery
[noreply] Fix whitespace (#25432)

------------------------------------------
[...truncated 84.43 KB...]
  steps: []
  tempFiles: []
  type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:900 Created job with id: [2023-02-10_14_38_09-15265121118596866871]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:901 Submitted job: 2023-02-10_14_38_09-15265121118596866871
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:902 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-10_14_38_09-15265121118596866871?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-10_14_38_09-15265121118596866871?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-02-10_14_38_09-15265121118596866871 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:12.226Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:12.371Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2023-02-10_14_38_09-15265121118596866871. The number of workers will be between 1 and 100.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:12.414Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2023-02-10_14_38_09-15265121118596866871.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:18.395Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.547Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.581Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.661Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.695Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.734Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.764Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.837Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.904Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.952Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:21.990Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.013Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.045Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.081Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.116Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.150Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.183Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.216Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.245Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.279Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into encode
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.301Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write into WriteToPubSub/ToProtobuf
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.342Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.378Z: JOB_MESSAGE_BASIC: Using cloud KMS key to protect persistent state.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.497Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.529Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.559Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:22.593Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:23.688Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:23.713Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:23.769Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:38:44.676Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:39:08.564Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:39:39.837Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-10T22:39:50.336Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
WARNING apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:249 Timing out on waiting for job 2023-02-10_14_38_09-15265121118596866871 after 364 seconds
=============================== warnings summary ===============================
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py310.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it - AssertionError:
Expected: (Test pipeline expected terminated in state: RUNNING and Expected 500 messages.)
     but: Expected 500 messages. Got 516 messages.
Diffs (item, count):
Expected but not in actual: dict_items([])
Unexpected: dict_items([(b'104: 1', 1), (b'125: 1', 1), (b'373: 1', 1), (b'349: 1', 1), (b'91: 1', 1), (b'260: 1', 1), (b'423: 1', 1), (b'292: 1', 1), (b'447: 1', 1), (b'183: 1', 1), (b'413: 1', 1), (b'208: 1', 1), (b'307: 1', 1), (b'326: 1', 1), (b'172: 1', 1), (b'83: 1', 1)])
Unexpected (with all details): [(b'104: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'125: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'373: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'349: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'91: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'260: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'423: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'292: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'447: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'183: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'413: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'208: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'307: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'326: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'172: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'83: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 55, 534000, tzinfo=datetime.timezone.utc), ''), (b'104: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'125: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'373: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'349: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'91: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'260: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'423: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'292: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'447: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'183: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'413: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'208: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'307: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'326: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'172: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), ''), (b'83: 1', {}, {}, DatetimeWithNanoseconds(2023, 2, 10, 22, 47, 56, 408000, tzinfo=datetime.timezone.utc), '')]
================== 1 failed, 3 warnings in 849.22s (0:14:09) ===================

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_streaming FAILED

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_batch_V2
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --region=us-central1
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20 --experiments=use_runner_v2 --experiments=beam_fn_api
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options:
>>> apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
>>> --capture=no --numprocesses=2 --timeout=1800
>>> collect markers:
============================= test session starts ==============================
platform linux -- Python 3.10.2, pytest-7.2.1, pluggy-1.0.0
rootdir: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python>, configfile: pytest.ini
plugins: hypothesis-6.68.0, xdist-2.5.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.10.0
timeout: 1800.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I
[gw0] Python 3.10.2 (main, Jan 15 2022, 18:02:07) [GCC 9.3.0]
[gw1] Python 3.10.2 (main, Jan 15 2022, 18:02:07) [GCC 9.3.0]
gw0 [1] / gw1 [1]
scheduling tests via LoadScheduling

apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming
[gw0] PASSED apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py37.xml> -
================== 1 passed, 3 warnings in 799.41s (0:13:19) ===================

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_batch_V2
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --region=us-central1
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20 --experiments=use_runner_v2 --experiments=beam_fn_api
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options:
>>> apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
>>> --capture=no --numprocesses=2 --timeout=1800
>>> collect markers:
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-7.2.1, pluggy-1.0.0
rootdir: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python>, configfile: pytest.ini
plugins: hypothesis-6.68.0, xdist-2.5.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.10.0
timeout: 1800.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [1] / gw1 [1]
scheduling tests via LoadScheduling

apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_batch_V2
[gw0] PASSED apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
=============================== warnings summary ===============================
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py310.xml> -
================== 1 passed, 3 warnings in 622.98s (0:10:22) ===================

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_streaming_V2
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --region=us-central1
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20 --streaming --experiments=use_runner_v2
>>> --enable_streaming_engine
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options:
>>> apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
>>> --capture=no --numprocesses=2 --timeout=1800
>>> collect markers:
============================= test session starts ==============================
platform linux -- Python 3.10.2, pytest-7.2.1, pluggy-1.0.0
rootdir: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python>, configfile: pytest.ini
plugins: hypothesis-6.68.0, xdist-2.5.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.10.0
timeout: 1800.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I
[gw1] Python 3.10.2 (main, Jan 15 2022, 18:02:07) [GCC 9.3.0]
[gw0] Python 3.10.2 (main, Jan 15 2022, 18:02:07) [GCC 9.3.0]
gw0 [1] / gw1 [1]
scheduling tests via LoadScheduling

apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_batch_V2
[gw0] PASSED apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py37.xml> -
================== 1 passed, 3 warnings in 621.58s (0:10:21) ===================

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --region=us-central1
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20 --streaming --experiments=use_runner_v2
>>> --enable_streaming_engine
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options:
>>> apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
>>> --capture=no --numprocesses=2 --timeout=1800
>>> collect markers:
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-7.2.1, pluggy-1.0.0
rootdir: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python>, configfile: pytest.ini
plugins: hypothesis-6.68.0, xdist-2.5.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.10.0
timeout: 1800.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10) -- [GCC 9.3.0]
gw0 [1] / gw1 [1]
scheduling tests via LoadScheduling

apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it

> Task :sdks:python:test-suites:dataflow:py310:preCommitIT_streaming_V2
[gw1] PASSED apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
=============================== warnings summary ===============================
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/build/gradleenv/2050596098/lib/python3.10/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py310.xml> -
================== 1 passed, 3 warnings in 870.09s (0:14:30) ===================

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
[gw0] PASSED apache_beam/examples/streaming_wordcount_it_test.py::StreamingWordCountIT::test_streaming_wordcount_it
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/pytest_preCommitIT-df-py37.xml> -
================== 1 passed, 3 warnings in 874.94s (0:14:34) ===================

> Task :sdks:python:test-suites:dataflow:preCommitIT_V2

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PreCommit_Python_Integration_Cron/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 75

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py310:preCommitIT_streaming'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 52s
24 actionable tasks: 18 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rohtvp57rdjm6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
