See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/2391/display/redirect>

Changes:


------------------------------------------
[...truncated 16.21 MB...]
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20230927
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20230927" for 
Docker environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f00cadb8e50> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f00cadb9670> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/seaborn-0.13.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/seaborn-0.13.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 2 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011642-932093-3mbqqd2a.1696468602.932503/pipeline.pb
 in 0 seconds.
_ ReadUsingStorageApiTests.test_iobase_source_with_column_selection_and_row_restriction _
[gw7] linux -- Python 3.9.10 
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>

self = <apache_beam.io.gcp.bigquery_read_it_test.ReadUsingStorageApiTests 
testMethod=test_iobase_source_with_column_selection_and_row_restriction>

    @pytest.mark.it_postcommit
    def test_iobase_source_with_column_selection_and_row_restriction(self):
      EXPECTED_TABLE_DATA = [{'string': 'привет'}]
      with beam.Pipeline(argv=self.args) as p:
        result = (
            p | 'Read with BigQuery Storage API' >> beam.io.ReadFromBigQuery(
                method=beam.io.ReadFromBigQuery.Method.DIRECT_READ,
                table=self.temp_table_reference,
                row_restriction='number > 2',
                selected_fields=['string']))
>       assert_that(result, equal_to(EXPECTED_TABLE_DATA))

apache_beam/io/gcp/bigquery_read_it_test.py:524: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:596: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:546: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:53: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/dataflow/dataflow_runner.py:479: in run_pipeline
    self.dataflow_client.create_job(self.job), self)
apache_beam/utils/retry.py:275: in wrapper
    return fun(*args, **kwargs)
apache_beam/runners/dataflow/internal/apiclient.py:720: in create_job
    return self.submit_job_description(job)
apache_beam/utils/retry.py:275: in wrapper
    return fun(*args, **kwargs)
apache_beam/runners/dataflow/internal/apiclient.py:820: in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py:722: in Create
    return self._RunMethod(config, request, global_params=global_params)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/base_api.py:731: in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/base_api.py:737: in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
<apache_beam.runners.dataflow.internal.clients.dataflow.dataflow_v1b3_client.DataflowV1b3.ProjectsLocationsJobsService
 object at 0x7f1a49313b50>
method_config = <ApiMethodInfo
 relative_path: 'v1b3/projects/{projectId}/locations/{location}/jobs'
 method_id: 'dataflow.projects.lo...DataflowProjectsLocationsJobsCreateRequest'
 response_type_name: 'Job'
 request_field: 'job'
 supports_download: False>
http_response = Response(info={'vary': 'Origin, X-Origin, Referer', 
'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 0...', 
request_url='https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json')
request = <DataflowProjectsLocationsJobsCreateRequest
 job: <Job
 clientRequestId: '20231005011712503728-6450'
 environment: <En...empFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
 location: 'us-central1'
 projectId: 'apache-beam-testing'>

    def __ProcessHttpResponse(self, method_config, http_response, request):
        """Process the given http response."""
        if http_response.status_code not in (http_client.OK,
                                             http_client.CREATED,
                                             http_client.NO_CONTENT):
>           raise exceptions.HttpError.FromResponse(
                http_response, method_config=method_config, request=request)
E           apitools.base.py.exceptions.HttpBadRequestError: HttpError 
accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Thu, 05 Oct 2023 01:17:18 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '400', 'content-length': '491', 
'-content-encoding': 'gzip'}>, content <{
E             "error": {
E               "code": 400,
E               "message": "(71e88d8c4cfc99b): The workflow could not 
be created. Causes: (e28e3c6af1076cd5): Dataflow quota error for 
jobs-per-project quota. Project apache-beam-testing is running 300 jobs. Please 
check the quota usage via GCP Console. If it exceeds the limit, please wait for 
a workflow to finish or contact Google Cloud Support to request an increase in 
quota. If it does not, contact Google Cloud Support.",
E               "status": "FAILED_PRECONDITION"
E             }
E           }
E           >

../../build/gradleenv/-1734967050/lib/python3.9/site-packages/apitools/base/py/base_api.py:603: HttpBadRequestError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 
Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/bin/python3.9>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'/tmp/tmpdlh1x26m/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', 
'--implementation', 'cp', '--abi', 'cp39', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
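
[Editor's note: the pre-building workflow mentioned above is driven by
pipeline options; a minimal sketch, assuming the
--prebuild_sdk_container_engine option described on the linked page, with
a hypothetical placeholder registry URL:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Pre-build the SDK container once via Cloud Build instead of
    # installing the staged dependencies on every worker start.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--requirements_file=requirements.txt',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',
    ])
]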
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20230927
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.9_sdk:beam-master-20230927" for 
Docker environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f1a7c21f0d0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f1a7c21f8b0> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/seaborn-0.13.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/seaborn-0.13.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/scikit_learn-1.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 2 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/apache_beam-2.52.0.dev0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1005011712-502580-vi5a50ku.1696468632.502771/pipeline.pb
 in 0 seconds.
=============================== warnings summary ===============================
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:28
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py>:28:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source
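
[Editor's note: for reference, a minimal importlib-based replacement for
the deprecated imp.load_source call flagged above; a sketch with
illustrative names, not hdfs or Beam code:

    import importlib.util

    def load_source(name, path):
        # Equivalent of imp.load_source(name, path) using importlib.
        spec = importlib.util.spec_from_file_location(name, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module
]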

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
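
[Editor's note: both PendingDeprecationWarnings above point at the same
migration, building references directly instead of calling Client.dataset.
A minimal sketch with google.cloud.bigquery; project and table names are
illustrative placeholders:

    from google.cloud import bigquery

    # Replaces client.dataset(...) and client.dataset(...).table(...).
    dataset_ref = bigquery.DatasetReference('my_project', 'unique_dataset_name')
    table_ref = dataset_ref.table('table_id')
]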

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47:
 FutureWarning: The default value of numeric_only in DataFrame.mean is 
deprecated. In a future version, it will default to False. In addition, 
specifying 'numeric_only=None' is deprecated. Select only valid columns or 
specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()
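
[Editor's note: this FutureWarning is silenced by passing the argument
explicitly; assuming the intent is to average only numeric columns, the
flagged line becomes:

    # Explicit numeric_only avoids the pandas FutureWarning.
    return airline_df[at_top_airports].mean(numeric_only=True)
]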

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
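
[Editor's note: both BeamDeprecationWarnings above call for the same
migration; a minimal sketch of the ReadFromBigQuery equivalent of the
deprecated BigQuerySource read, with a placeholder query string:

    import apache_beam as beam

    # Replaces beam.io.Read(beam.io.BigQuerySource(...)).
    with beam.Pipeline() as p:
        rows = (
            p | 'Read' >> beam.io.ReadFromBigQuery(
                query='SELECT 1 AS x', use_standard_sql=True))
]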

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/pytest_postCommitIT-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification
 - apitools.base.py.exceptions.HttpBadRequestError: HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Thu, 05 Oct 2023 01:16:48 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '400', 'content-length': '492', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 400,
    "message": "(57d29d880b6db36c): The workflow could not be created. Causes: 
(9c7236c822b86429): Dataflow quota error for jobs-per-project quota. Project 
apache-beam-testing is running 301 jobs. Please check the quota usage via GCP 
Console. If it exceeds the limit, please wait for a workflow to finish or 
contact Google Cloud Support to request an increase in quota. If it does not, 
contact Google Cloud Support.",
    "status": "FAILED_PRECONDITION"
  }
}
>
FAILED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction
 - apitools.base.py.exceptions.HttpBadRequestError: HttpError accessing
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Thu, 05 Oct 2023 01:17:18 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '400', 'content-length': '491', 
'-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 400,
    "message": "(71e88d8c4cfc99b): The workflow could not be created. Causes: 
(e28e3c6af1076cd5): Dataflow quota error for jobs-per-project quota. Project 
apache-beam-testing is running 300 jobs. Please check the quota usage via GCP 
Console. If it exceeds the limit, please wait for a workflow to finish or 
contact Google Cloud Support to request an increase in quota. If it does not, 
contact Google Cloud Support.",
    "status": "FAILED_PRECONDITION"
  }
}
>
====== 2 failed, 86 passed, 49 skipped, 18 warnings in 6622.32s (1:50:22) ======
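
[Editor's note: both failures are the same jobs-per-project quota error
(HTTP 400, FAILED_PRECONDITION) rather than a test regression. If a
submitter wanted to ride out temporary quota exhaustion, one option is to
retry job submission with backoff; a hypothetical sketch, not part of
Beam:

    import time

    from apitools.base.py import exceptions

    def run_with_quota_retry(make_pipeline, attempts=5, base_delay=60):
        # Re-submit only on the jobs-per-project quota error seen above;
        # other 400s should still fail fast. make_pipeline is a callable
        # returning a fresh beam.Pipeline.
        for attempt in range(attempts):
            try:
                return make_pipeline().run()
            except exceptions.HttpBadRequestError as e:
                if ('jobs-per-project quota' not in str(e)
                        or attempt == attempts - 1):
                    raise
                time.sleep(base_delay * (2 ** attempt))  # exponential backoff
]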

> Task :sdks:python:test-suites:dataflow:py39:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 326

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py39:postCommitPy39IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 52

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python39/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 57m 3s
219 actionable tasks: 153 executed, 60 from cache, 6 up-to-date

Publishing build scan...
https://ge.apache.org/s/l7axm2kfyxie2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
