See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/2511/display/redirect>

Changes:


------------------------------------------
[...truncated 11.78 MB...]
[gw0] PASSED apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_imagenet_image_segmentation
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_imagenet_image_segmentation
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification_large_model
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_classification_large_model
apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_with_weights_classification
[gw0] SKIPPED apache_beam/ml/inference/tensorflow_inference_it_test.py::TensorflowInference::test_tf_mnist_with_weights_classification
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner

=================================== FAILURES ===================================
___________________ BigtableIOWriteTest.test_bigtable_write ____________________
[gw5] linux -- Python 3.9.10 https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9

self = <apache_beam.examples.cookbook.bigtableio_it_test.BigtableIOWriteTest testMethod=test_bigtable_write>

    @pytest.mark.it_postcommit
    def test_bigtable_write(self):
      number = self.number
      pipeline_args = self.test_pipeline.options_list
      pipeline_options = PipelineOptions(pipeline_args)
    
      with beam.Pipeline(options=pipeline_options) as pipeline:
        config_data = {
            'project_id': self.project,
            'instance_id': self.instance_id,
            'table_id': self.table_id
        }
>       _ = (
            pipeline
            | 'Generate Direct Rows' >> GenerateTestRows(number, **config_data))

apache_beam/examples/cookbook/bigtableio_it_test.py:189: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:608: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:558: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:585: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20231104002359650808-6630'
 createTime: '2023-11-04T00:24:05.795796Z'
...023-11-04T00:24:05.795796Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f6ac627f160>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/"
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:758: Failed
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp23_siy17/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/build/apache_beam-2.53.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl" to staging location.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20231023
INFO     root:environments.py:320 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20231023" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f6b21ea1b80> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f6b21eac3a0> ====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/seaborn-0.13.0-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/seaborn-0.13.0-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/beautifulsoup4-4.12.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/beautifulsoup4-4.12.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/tfx_bsl-1.14.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/tfx_bsl-1.14.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 1 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/matplotlib-3.8.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/matplotlib-3.8.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 1 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/apache_beam-2.53.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/apache_beam-2.53.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1104002359-649489-0k4kdurm.1699057439.649689/pipeline.pb in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job
 clientRequestId: '20231104002359650808-6630'
 createTime: '2023-11-04T00:24:05.795796Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-11-03_17_24_05-16059708242659907448'
 location: 'us-central1'
 name: 'beamapp-jenkins-1104002359-649489-0k4kdurm'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-11-04T00:24:05.795796Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-11-03_17_24_05-16059708242659907448]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-11-03_17_24_05-16059708242659907448
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-03_17_24_05-16059708242659907448?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: 
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-03_17_24_05-16059708242659907448?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-03_17_24_05-16059708242659907448 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:24:08.770Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:24:10.957Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:24:11.011Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:24:11.429Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:24:20.158Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:24:25.706Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:27:14.247Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:27:14.651Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at core.py:3774>)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:27:14.691Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:27:15.317Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T00:27:15.361Z: JOB_MESSAGE_BASIC: Executing operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T01:38:51.887Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-11-03_17_24_05-16059708242659907448.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T01:38:51.909Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct Rows/Create/Map(decode)+Generate Direct Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-11-04T01:38:52.058Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-11-03_17_24_05-16059708242659907448 is in state JOB_STATE_CANCELLING
WARNING  apache_beam.utils.retry:retry.py:290 Retry with exponential backoff: waiting for 2.9163904418561986 seconds before retrying list_messages because we caught exception: AttributeError: 'NoneType' object has no attribute 'readline'
 Traceback for above exception (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/utils/retry.py", line 275, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 1000, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 550, in List
    return self._RunMethod(config, request, global_params=global_params)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/base_api.py", line 728, in _RunMethod
    http_response = http_wrapper.MakeRequest(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/http_wrapper.py", line 359, in MakeRequest
    retry_func(ExceptionRetryArgs(http, http_request, e, retry,
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/http_wrapper.py", line 304, in HandleExceptionsAndRebuildHttpConnections
    raise retry_args.exc
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/http_wrapper.py", line 348, in MakeRequest
    return _MakeRequestNoRetry(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/http_wrapper.py", line 397, in _MakeRequestNoRetry
    info, content = http.request(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/google_auth_httplib2.py", line 218, in request
    response, content = self.http.request(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/httplib2/__init__.py", line 1724, in request
    (response, content) = self._request(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/httplib2/__init__.py", line 1444, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/httplib2/__init__.py", line 1425, in _conn_request
    content = response.read()
  File "/usr/lib/python3.9/http/client.py", line 470, in read
    return self._readall_chunked()
  File "/usr/lib/python3.9/http/client.py", line 577, in _readall_chunked
    chunk_left = self._get_chunk_left()
  File "/usr/lib/python3.9/http/client.py", line 565, in _get_chunk_left
    self._read_and_discard_trailer()
  File "/usr/lib/python3.9/http/client.py", line 538, in _read_and_discard_trailer
    line = self.fp.readline(_MAXLINE + 1)
=============================== warnings summary ===============================
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py:47: FutureWarning: The default value of numeric_only in DataFrame.mean is deprecated. In a future version, it will default to False. In addition, specifying 'numeric_only=None' is deprecated. Select only valid columns or specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:170: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:706: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml -
=========================== short test summary info ============================
FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write - Failed: Timeout >4500.0s
====== 1 failed, 87 passed, 50 skipped, 9 warnings in 7148.69s (1:59:08) =======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 2h 5m
219 actionable tasks: 157 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/kmb6vuljdt4b6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
