See <https://ci-beam.apache.org/job/beam_PostCommit_Python39/2393/display/redirect>

Changes:


------------------------------------------
[...truncated 16.11 MB...]
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_datatable_single_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_multi_batch
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch_large_model
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_numpy_single_batch_large_model
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_multi_batch
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_single_batch
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_pandas_single_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_multi_batch
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_multi_batch
apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_single_batch
[gw2] SKIPPED apache_beam/ml/inference/xgboost_inference_it_test.py::XGBoostInference::test_iris_classification_scipy_single_batch
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction_rows
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
[gw1] PASSED apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
[gw1] PASSED apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
[gw1] PASSED apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw7] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner

=================================== FAILURES ===================================
_____________ PubSubIntegrationTest.test_streaming_with_attributes _____________
[gw2] linux -- Python 3.9.10 <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

self = <apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest testMethod=test_streaming_with_attributes>

    @pytest.mark.it_postcommit
    def test_streaming_with_attributes(self):
>     self._test_streaming(with_attributes=True)

apache_beam/io/gcp/pubsub_integration_test.py:221: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/io/gcp/pubsub_integration_test.py:209: in _test_streaming
    pubsub_it_pipeline.run_pipeline(
apache_beam/io/gcp/pubsub_it_pipeline.py:93: in run_pipeline
    result = p.run()
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:74: in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <apache_beam.runners.dataflow.test_dataflow_runner.TestDataflowRunner object at 0x7fb43853fa60>
expected_state = 'CANCELLED', timeout = 600

    def wait_until_in_state(self, expected_state, timeout=WAIT_IN_STATE_TIMEOUT):
      """Wait until Dataflow pipeline enters a certain state."""
      consoleUrl = (
          "Console URL: https://console.cloud.google.com/dataflow/";
          f"<regionId>/{self.result.job_id()}?project=<projectId>")
      if not self.result.has_job:
        _LOGGER.error(consoleUrl)
        raise IOError('Failed to get the Dataflow job id.')
    
      start_time = time.time()
      while time.time() - start_time <= timeout:
        job_state = self.result.state
        if self.result.is_in_terminal_state() or job_state == expected_state:
          return job_state
        time.sleep(5)
      _LOGGER.error(consoleUrl)
>     raise RuntimeError(
          'Timeout after %d seconds while waiting for job %s '
          'enters expected state %s. Current state is %s.' %
          (timeout, self.result.job_id(), expected_state, self.result.state))
E     RuntimeError: Timeout after 600 seconds while waiting for job 2023-10-05_06_10_58-15239512624316186459 enters expected state CANCELLED. Current state is CANCELLING.

apache_beam/runners/dataflow/test_dataflow_runner.py:103: RuntimeError
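
Note that the failure is in the cancel path rather than in the pipeline itself: the job reached CANCELLING but never hit CANCELLED inside the 600-second window. A minimal sketch of how the same polling loop could tolerate a slow cancel, assuming only the `result` interface visible in the traceback (`state`, `is_in_terminal_state()`, `job_id()`); the helper name and the grace-period behavior are hypothetical, not Beam's API:

import time

# Hypothetical helper; `result` is assumed to expose .state,
# .is_in_terminal_state() and .job_id() as in the traceback above.
def wait_for_cancel(result, timeout=600, cancelling_grace=300):
    deadline = time.time() + timeout
    while time.time() < deadline:
        state = result.state
        if state == 'CANCELLED' or result.is_in_terminal_state():
            return state
        if state == 'CANCELLING' and cancelling_grace:
            # Cancellation was accepted but is slow to finalize; extend the
            # deadline once instead of failing mid-teardown.
            deadline += cancelling_grace
            cancelling_grace = 0
        time.sleep(5)
    raise RuntimeError(
        'Job %s did not reach CANCELLED within the deadline; last state %s' %
        (result.job_id(), result.state))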
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmppvde4t5m/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20230927
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20230927" for 
Docker environment
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/seaborn-0.13.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/seaborn-0.13.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005131054-175355-906htv3r.1696511454.175551/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231005131054176559-7366'
 createTime: '2023-10-05T13:10:58.997684Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-05_06_10_58-15239512624316186459'
 location: 'us-central1'
 name: 'beamapp-jenkins-1005131054-175355-906htv3r'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-05T13:10:58.997684Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-05_06_10_58-15239512624316186459]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-05_06_10_58-15239512624316186459
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-05_06_10_58-15239512624316186459?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-05_06_10_58-15239512624316186459?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-05_06_10_58-15239512624316186459 is in state JOB_STATE_RUNNING
WARNING  
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:202 
2023-10-05T13:10:59.981Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for 
Dataflow Streaming Engine. Workers will scale between 1 and 100 unless 
maxNumWorkers is specified.
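(That autoscaling warning is informational; a test that wanted a hard cap would set the standard Dataflow pipeline option `max_num_workers`. A one-line sketch, with the value made up for illustration:

from apache_beam.options.pipeline_options import PipelineOptions

# Caps Streaming Engine autoscaling per the warning above; 10 is illustrative.
options = PipelineOptions(['--max_num_workers=10'])
)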
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:11:22.147Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:11:25.400Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:11:25.434Z: JOB_MESSAGE_BASIC: Using cloud KMS key to protect persistent state.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:11:25.532Z: JOB_MESSAGE_BASIC: The pubsub read for: 
projects/apache-beam-testing/subscriptions/psit_subscription_input362eaf0b-4af7-4686-b620-2786b16c9834
 is configured to compute input data watermarks based on custom timestamp 
attribute timestamp. Cloud Dataflow has created an additional tracking 
subscription to do this, which will be cleaned up automatically. For details, 
see: 
https://cloud.google.com/dataflow/docs/concepts/streaming-with-cloud-pubsub#high_watermark_accuracy
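(The tracking-subscription behavior described in that message is triggered by reading with a custom timestamp attribute, roughly as in this sketch; the subscription path is a placeholder and running it would additionally need streaming pipeline options:

import apache_beam as beam

with beam.Pipeline() as p:
    # Watermarks are derived from the 'timestamp' attribute, which is what
    # makes Dataflow create the extra tracking subscription noted above.
    messages = (
        p | beam.io.ReadFromPubSub(
            subscription='projects/my-project/subscriptions/my-sub',
            with_attributes=True,
            timestamp_attribute='timestamp'))
)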
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:11:25.993Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:11:44.371Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:13:37.856Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:18:17.775Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-05T13:18:34.535Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
WARNING  
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:216 Timing out 
on waiting for job 2023-10-05_06_10_58-15239512624316186459 after 481 seconds
ERROR    
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:102 
Console URL: 
https://console.cloud.google.com/dataflow/<regionId>/2023-10-05_06_10_58-15239512624316186459?project=<projectId>
=============================== warnings summary ===============================
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py>:28:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source
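
(The `imp` call flagged nine times above has a standard-library replacement; a minimal `importlib`-based equivalent of `load_source`, sketched here rather than taken from hdfs's actual fix:

import importlib.util
import sys

def load_source(name, path):
    # importlib equivalent of the deprecated imp.load_source.
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module  # imp.load_source also registered the module
    spec.loader.exec_module(module)
    return module
)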

apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
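
(The replacement this warning asks for is direct; a sketch with illustrative project/dataset/table names, which also covers the utils.py:100 occurrence below:

from google.cloud import bigquery

# Instead of client.dataset(...), build references directly; names are made up.
dataset_ref = bigquery.DatasetReference('my_project', 'my_dataset')
table_ref = dataset_ref.table('my_table')
)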

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47:
 FutureWarning: The default value of numeric_only in DataFrame.mean is 
deprecated. In a future version, it will default to False. In addition, 
specifying 'numeric_only=None' is deprecated. Select only valid columns or 
specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()
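
(The fix the FutureWarning suggests is simply passing `numeric_only` explicitly; sketched against a stand-in for the flight_delays.py call, with the surrounding function invented for illustration:

import pandas as pd

def mean_delays(airline_df: pd.DataFrame, at_top_airports) -> pd.Series:
    # Being explicit silences the warning and pins the future behavior.
    return airline_df[at_top_airports].mean(numeric_only=True)
)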

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
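
(This deprecation, here and in the ReadNewTypesTests warning below, maps directly onto ReadFromBigQuery; a minimal sketch with a placeholder query:

import apache_beam as beam

with beam.Pipeline() as p:
    rows = (
        p | 'Read' >> beam.io.ReadFromBigQuery(
            query='SELECT 1 AS x',  # placeholder; the tests use self.query
            use_standard_sql=True))
)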

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes - RuntimeError: Timeout after 600 seconds while waiting for job 2023-10-05_06_10_58-15239512624316186459 enters expected state CANCELLED. Current state is CANCELLING.
====== 1 failed, 87 passed, 49 skipped, 18 warnings in 7769.08s (2:09:29) ======

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 326

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:postCommitPy39IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 52

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
==============================================================================

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 2h 16m 8s
213 actionable tasks: 151 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/ndqiu5vqgcfjg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
