See <https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/1705/display/redirect>

Changes:


------------------------------------------
[...truncated 174.01 KB...]
Using cached PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (738 kB)
Using cached regex-2023.10.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (773 kB)
Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Using cached scikit_learn-1.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (10.9 MB)
Using cached SQLAlchemy-1.4.50-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Using cached tenacity-8.2.3-py3-none-any.whl (24 kB)
Using cached typing_extensions-4.9.0rc1-py3-none-any.whl (32 kB)
Using cached zstandard-0.22.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.4 MB)
Using cached botocore-1.33.4-py3-none-any.whl (11.8 MB)
Using cached certifi-2023.11.17-py3-none-any.whl (162 kB)
Using cached cffi-1.16.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (443 kB)
Using cached charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (142 kB)
Using cached dnspython-2.4.2-py3-none-any.whl (300 kB)
Using cached docker-7.0.0b2-py3-none-any.whl (147 kB)
Using cached exceptiongroup-1.2.0-py3-none-any.whl (16 kB)
Using cached execnet-2.0.2-py3-none-any.whl (37 kB)
Using cached google_cloud_resource_manager-1.10.4-py2.py3-none-any.whl (320 kB)
Using cached google_resumable_media-2.6.0-py2.py3-none-any.whl (80 kB)
Using cached googleapis_common_protos-1.61.0-py2.py3-none-any.whl (230 kB)
Using cached greenlet-3.0.1-cp39-cp39-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl (610 kB)
Using cached grpc_google_iam_v1-0.13.0rc1-py2.py3-none-any.whl (25 kB)
Using cached grpcio_status-1.60.0rc1-py3-none-any.whl (14 kB)
Using cached idna-3.6-py3-none-any.whl (61 kB)
Using cached jsonschema_specifications-2023.11.1-py3-none-any.whl (17 kB)
Using cached msal-1.25.0-py2.py3-none-any.whl (97 kB)
Using cached pyparsing-3.1.1-py3-none-any.whl (103 kB)
Using cached referencing-0.31.1-py3-none-any.whl (25 kB)
Using cached rpds_py-0.13.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.2 MB)
Using cached s3transfer-0.8.2-py3-none-any.whl (82 kB)
Using cached scipy-1.11.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (36.6 MB)
Using cached shapely-2.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.5 MB)
Using cached threadpoolctl-3.2.0-py3-none-any.whl (15 kB)
Using cached tzlocal-5.2-py3-none-any.whl (17 kB)
Using cached urllib3-1.26.18-py2.py3-none-any.whl (143 kB)
Using cached PyMySQL-1.1.0-py3-none-any.whl (44 kB)
Using cached wrapt-1.16.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (80 kB)
Using cached portalocker-2.8.2-py3-none-any.whl (17 kB)
Using cached pyasn1-0.5.1-py2.py3-none-any.whl (84 kB)
Using cached PyJWT-2.8.0-py3-none-any.whl (22 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (pyproject.toml): started
  Building wheel for apache-beam (pyproject.toml): still running...
  Building wheel for apache-beam (pyproject.toml): still running...
  Building wheel for apache-beam (pyproject.toml): still running...
  Building wheel for apache-beam (pyproject.toml): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.53.0.dev0-cp39-cp39-linux_x86_64.whl size=15027146 sha256=6701da893c5f659bdecbf141ebd06ba6215b8f2163232da451cd983b501b8c31
  Stored in directory: /home/jenkins/.cache/pip/wheels/b7/13/80/cb857c428d80896b94d03fd609443f79db9c6f16d6cd883a32
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyjsparser, docopt, crcmod, zstandard, wrapt, urllib3, tzlocal, typing-extensions, threadpoolctl, tenacity, sqlparse, six, rpds-py, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow-hotfix, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, numpy, mock, joblib, jmespath, iniconfig, idna, grpcio, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, attrs, sqlalchemy, shapely, scipy, rsa, requests, referencing, python-dateutil, pytest, pymongo, pydot, pyasn1-modules, pyarrow, proto-plus, js2py, isodate, hypothesis, httplib2, googleapis-common-protos, google-resumable-media, cffi, scikit-learn, requests-mock, pytest-xdist, pytest-timeout, pandas, oauth2client, jsonschema-specifications, hdfs, grpcio-status, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, jsonschema, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, msal, google-cloud-core, boto3, apache-beam, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-storage, google-cloud-spanner, google-cloud-resource-manager, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, google-cloud-aiplatform, azure-identity
Successfully installed PyJWT-2.8.0 apache-beam-2.53.0.dev0 attrs-23.1.0 azure-core-1.29.5 azure-identity-1.15.0 azure-storage-blob-12.19.0 boto3-1.33.4 botocore-1.33.4 certifi-2023.11.17 cffi-1.16.0 charset-normalizer-3.3.2 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.7 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.4.2 docker-7.0.0b2 docopt-0.6.2 exceptiongroup-1.2.0 execnet-2.0.2 fastavro-1.9.0 fasteners-0.19 freezegun-1.2.2 google-api-core-2.14.0 google-apitools-0.5.31 google-auth-2.23.4 google-auth-httplib2-0.1.1 google-cloud-aiplatform-1.36.4 google-cloud-bigquery-3.13.0 google-cloud-bigquery-storage-2.23.0 google-cloud-bigtable-2.21.0 google-cloud-core-2.3.3 google-cloud-datastore-2.18.0 google-cloud-dlp-3.13.0 google-cloud-language-2.11.1 google-cloud-pubsub-2.18.4 google-cloud-pubsublite-1.8.3 google-cloud-recommendations-ai-0.10.5 google-cloud-resource-manager-1.10.4 google-cloud-spanner-3.40.1 google-cloud-storage-2.13.0 google-cloud-videointelligence-2.11.4 google-cloud-vision-3.4.5 google-crc32c-1.5.0 google-resumable-media-2.6.0 googleapis-common-protos-1.61.0 greenlet-3.0.1 grpc-google-iam-v1-0.13.0rc1 grpcio-1.60.0rc1 grpcio-status-1.60.0rc1 hdfs-2.7.3 httplib2-0.22.0 hypothesis-6.91.0 idna-3.6 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.3.2 js2py-0.74 jsonschema-4.20.0 jsonschema-specifications-2023.11.1 mock-5.1.0 msal-1.25.0 msal-extensions-1.0.0 numpy-1.24.4 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.10 overrides-6.5.0 pandas-1.5.3 parameterized-0.9.0 portalocker-2.8.2 proto-plus-1.23.0rc1 protobuf-4.25.1 psycopg2-binary-2.9.9 pyarrow-11.0.0 pyarrow-hotfix-0.6 pyasn1-0.5.1 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.1.0 pyjsparser-2.7.1 pymongo-4.6.1 pymysql-1.1.0 pyparsing-3.1.1 pytest-7.4.3 pytest-timeout-2.2.0 pytest-xdist-3.5.0 python-dateutil-2.8.2 pytz-2023.3.post1 pyyaml-6.0.1 referencing-0.31.1 regex-2023.10.3 requests-2.31.0 requests-mock-1.11.0 rpds-py-0.13.2 rsa-4.9 s3transfer-0.8.2 scikit-learn-1.3.2 scipy-1.11.4 shapely-2.0.2 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.50 sqlparse-0.4.4 tenacity-8.2.3 testcontainers-3.7.1 threadpoolctl-3.2.0 typing-extensions-4.9.0rc1 tzlocal-5.2 urllib3-1.26.18 wrapt-1.16.0 zstandard-0.22.0

> Task :sdks:python:test-suites:dataflow:py39:initializeForDataflowJob

> Task :sdks:python:test-suites:dataflow:py39:postCommitSickbay
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/sdks/python/build/apache-beam.tar.gz> --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=it_postcommit_sickbay
============================= test session starts ==============================
platform linux -- Python 3.9.10, pytest-7.4.3, pluggy-1.3.0
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/sdks/python>
configfile: pytest.ini
plugins: timeout-2.2.0, requests-mock-1.11.0, hypothesis-6.91.0, xdist-3.5.0
timeout: 4500.0s
timeout method: signal
timeout func_only: False
created: 8/8 workers
8 workers [1 item]

scheduling tests via LoadFileScheduling

apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write
+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

~~~~~~~~~~~~~~~~~~~~~ Stack of Thread-10 (140036871812864) ~~~~~~~~~~~~~~~~~~~~~
  File "/usr/lib/python3.9/threading.py", line 930, in _bootstrap
    self._bootstrap_inner()
  File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.9/threading.py", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 171, in poll_for_job_completion
    time.sleep(sleep_secs)

~~~~~~~~~~~~~~~~~~~~~ Stack of <unknown> (140040109123328) ~~~~~~~~~~~~~~~~~~~~~
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 361, in _perform_spawn
    reply.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 296, in run
    self._result = func(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 1049, in _thread_receiver
    msg = Message.from_io(io)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 507, in from_io
    header = io.read(9)  # type 1, channel 4, payload 4
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/execnet/gateway_base.py>", line 474, in read
    data = self._read(numbytes - len(buf))

+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

[gw0] FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write

=================================== FAILURES ===================================
___________________ BigtableIOWriteTest.test_bigtable_write ____________________
[gw0] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

self = <apache_beam.examples.cookbook.bigtableio_it_test.BigtableIOWriteTest testMethod=test_bigtable_write>

    @pytest.mark.it_postcommit_sickbay
    def test_bigtable_write(self):
      number = self.number
      pipeline_args = self.test_pipeline.options_list
      pipeline_options = PipelineOptions(pipeline_args)
    
      with beam.Pipeline(options=pipeline_options) as pipeline:
        config_data = {
            'project_id': self.project,
            'instance_id': self.instance_id,
            'table_id': self.table_id
        }
>       _ = (
            pipeline
            | 'Generate Direct Rows' >> GenerateTestRows(number, **config_data))

apache_beam/examples/cookbook/bigtableio_it_test.py:191: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:608: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:558: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:585: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20231130064518039261-6913'
 createTime: '2023-11-30T06:45:19.147332Z'
...023-11-30T06:45:19.147332Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f5d7f16c6d0>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/";
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:758: Failed
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20231116
INFO     root:environments.py:320 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20231116" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f5d92696ee0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f5d92692700> ====================
INFO     apache_beam.internal.gcp.auth:auth.py:134 Setting socket default timeout to 60 seconds.
INFO     apache_beam.internal.gcp.auth:auth.py:136 socket default timeout is 60.0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:673 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130064518-038061-mnd2bjfv.1701326718.038229/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:683 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130064518-038061-mnd2bjfv.1701326718.038229/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:673 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130064518-038061-mnd2bjfv.1701326718.038229/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:683 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1130064518-038061-mnd2bjfv.1701326718.038229/pipeline.pb in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:338 Unknown pipeline options received: --sleep_secs=20,--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test. Ignore if flags are used for internal purposes.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:338 Unknown pipeline options received: --sleep_secs=20,--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test. Ignore if flags are used for internal purposes.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:854 Create job: <Job
 clientRequestId: '20231130064518039261-6913'
 createTime: '2023-11-30T06:45:19.147332Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-11-29_22_45_18-7646815693168644958'
 location: 'us-central1'
 name: 'beamapp-jenkins-1130064518-038061-mnd2bjfv'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-11-30T06:45:19.147332Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:856 Created job with id: [2023-11-29_22_45_18-7646815693168644958]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:857 Submitted job: 2023-11-29_22_45_18-7646815693168644958
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:858 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-29_22_45_18-7646815693168644958?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-29_22_45_18-7646815693168644958?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-29_22_45_18-7646815693168644958 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:45:22.389Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:45:24.792Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:45:24.861Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:45:25.132Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:45:34.114Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:45:54.287Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:48:50.606Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:48:51.086Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:48:51.161Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:48:52.454Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T06:48:52.501Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T08:00:09.457Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-11-29_22_45_18-7646815693168644958.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T08:00:09.535Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-30T08:00:09.957Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-29_22_45_18-7646815693168644958 is in state JOB_STATE_CANCELLING
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write - Failed: Timeout >4500.0s
================== 1 failed, 17 skipped in 4670.69s (1:17:50) ==================

> Task :sdks:python:test-suites:dataflow:py39:postCommitSickbay FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Sickbay_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 181

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitSickbay'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 1h 23m 17s
9 actionable tasks: 8 executed, 1 from cache

Publishing build scan...
https://ge.apache.org/s/aye5wavbdg6pu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
