See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/553/display/redirect>

Changes:


------------------------------------------
[...truncated 744.59 KB...]
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:752: Failed

During handling of the above exception, another exception occurred:

self = <apache_beam.io.external.xlang_bigqueryio_it_test.BigQueryXlangStorageWriteIT testMethod=test_streaming_with_at_least_once>

    def test_streaming_with_at_least_once(self):
      table = 'streaming'
>     self.run_streaming(table_name=table, use_at_least_once=True)

apache_beam/io/external/xlang_bigqueryio_it_test.py:284: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/io/external/xlang_bigqueryio_it_test.py:263: in run_streaming
    with beam.Pipeline(argv=args) as p:
apache_beam/pipeline.py:600: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:577: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:74: in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <apache_beam.runners.dataflow.test_dataflow_runner.TestDataflowRunner object at 0x7f9d3928c790>
expected_state = 'CANCELLED', timeout = 600

    def wait_until_in_state(self, expected_state, timeout=WAIT_IN_STATE_TIMEOUT):
      """Wait until Dataflow pipeline enters a certain state."""
      consoleUrl = (
          "Console URL: https://console.cloud.google.com/dataflow/"
          f"<regionId>/{self.result.job_id()}?project=<projectId>")
      if not self.result.has_job:
        _LOGGER.error(consoleUrl)
        raise IOError('Failed to get the Dataflow job id.')
    
      start_time = time.time()
      while time.time() - start_time <= timeout:
        job_state = self.result.state
        if self.result.is_in_terminal_state() or job_state == expected_state:
          return job_state
        time.sleep(5)
      _LOGGER.error(consoleUrl)
>     raise RuntimeError(
          'Timeout after %d seconds while waiting for job %s '
          'enters expected state %s. Current state is %s.' %
          (timeout, self.result.job_id(), expected_state, self.result.state))
E     RuntimeError: Timeout after 600 seconds while waiting for job 2023-08-13_01_28_22-16027838719605767462 enters expected state CANCELLED. Current state is CANCELLING.

apache_beam/runners/dataflow/test_dataflow_runner.py:103: RuntimeError
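The failure above is a race between cancellation and a fixed deadline: the test requested cancellation, but the job was still CANCELLING when the 600-second wait expired. The polling pattern used by `wait_until_in_state` can be sketched in isolation; the helper below (`wait_for_state`, a hypothetical name, not part of Beam's API) polls a state callback until the expected state arrives or a timeout fires, and the timeout error reports the last state observed:

```python
import itertools
import time


def wait_for_state(get_state, expected, timeout, poll_interval=0.01, sleep=time.sleep):
    """Poll get_state() until it returns `expected` or `timeout` seconds elapse.

    Hypothetical sketch of the polling pattern in wait_until_in_state;
    returns the state on success, raises RuntimeError on timeout.
    """
    deadline = time.monotonic() + timeout
    state = None
    while time.monotonic() <= deadline:
        state = get_state()
        if state == expected:
            return state
        sleep(poll_interval)
    raise RuntimeError(
        'Timed out after %s seconds waiting for state %s; last observed state was %s'
        % (timeout, expected, state))


# Simulate a job that reports CANCELLING twice before reaching CANCELLED.
states = itertools.chain(['CANCELLING', 'CANCELLING'], itertools.repeat('CANCELLED'))
print(wait_for_state(lambda: next(states), 'CANCELLED', timeout=1.0))  # CANCELLED
```

With a pattern like this, a flake such as the one above could be mitigated by extending the deadline, or by treating an in-progress state like CANCELLING as acceptable terminal progress, since Dataflow may take several minutes to move from CANCELLING to CANCELLED.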
------------------------------ Captured log call -------------------------------
INFO     apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:810 
Dataset apache-beam-testing:python_xlang_storage_write_1691915289_ddee63 does 
not exist so we will create it as temporary with location=None
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:112
 Created dataset python_xlang_storage_write_1691915289_ddee63 in project 
apache-beam-testing
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:115
 expansion port: 36647
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.51.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>"
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:397 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:404 Defaulting to 
the temp_location as staging_location: 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/beam-sdks-java-io-google-cloud-platform-expansion-service-2.51.0-SNAPSHOT-L3j7pwaRqSuOeYUcMn75AJoB-73-1e0rCAr0noNd220.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/beam-sdks-java-io-google-cloud-platform-expansion-service-2.51.0-SNAPSHOT-L3j7pwaRqSuOeYUcMn75AJoB-73-1e0rCAr0noNd220.jar
 in 5 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/apache_beam-2.51.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/apache_beam-2.51.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0813082815-124931-1pmc5z6q.1691915295.125386/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20230813082815126168-4304'
 createTime: '2023-08-13T08:28:23.071820Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-08-13_01_28_22-16027838719605767462'
 location: 'us-central1'
 name: 'beamapp-jenkins-0813082815-124931-1pmc5z6q'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-08-13T08:28:23.071820Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-08-13_01_28_22-16027838719605767462]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-08-13_01_28_22-16027838719605767462
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-13_01_28_22-16027838719605767462?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-13_01_28_22-16027838719605767462?project=apache-beam-testing
WARNING  
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:65 
Waiting indefinitely for streaming job.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-08-13_01_28_22-16027838719605767462 is in state JOB_STATE_RUNNING
WARNING  
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:206 
2023-08-13T08:28:24.628Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for 
Dataflow Streaming Engine. Workers will scale between 1 and 100 unless 
maxNumWorkers is specified.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:28:29.509Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:28:31.689Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:28:32.004Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:28:50.850Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:33:32.094Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:33:32.156Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-08-13T08:35:37.482Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-08-13_01_28_22-16027838719605767462 is in state JOB_STATE_CANCELLING
ERROR    
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:102 
Console URL: 
https://console.cloud.google.com/dataflow/<regionId>/2023-08-13_01_28_22-16027838719605767462?project=<projectId>
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:120
 Deleting dataset python_xlang_storage_write_1691915289_ddee63 in project 
apache-beam-testing
=============================== warnings summary ===============================
apache_beam/runners/portability/stager.py:63
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py>:63:
 DeprecationWarning: pkg_resources is deprecated as an API. See 
https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2871:
 20 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2871:
 16 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.cloud')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2350
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2350:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2871
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.logging')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py:2871
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.iam')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/2050596099/lib/python3.11/site-packages/hdfs/config.py:28
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/hdfs/config.py>:28:
 DeprecationWarning: the imp module is deprecated in favour of importlib and 
slated for removal in Python 3.12; see the module's documentation for 
alternative uses
    from imp import load_source

../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/rpc/__init__.py:20
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/rpc/__init__.py>:20:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.rpc')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17:
 DeprecationWarning: The distutils package is deprecated and slated for removal 
in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util

apache_beam/typehints/pandas_type_compatibility_test.py:67
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),

apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/examples/snippets/snippets_test.py>:767:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location = 'gs://mylocation'

apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2101:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2107:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/snippets/snippets_test.py: 2 warnings
apache_beam/io/external/xlang_bigqueryio_it_test.py: 10 warnings
apache_beam/io/gcp/bigtableio_it_test.py: 6 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/transforms/external.py>:676:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    self._expansion_service, pipeline.options)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once - RuntimeError: Timeout after 600 seconds while waiting for job 2023-08-13_01_28_22-16027838719605767462 enters expected state CANCELLED. Current state is CANCELLING.
= 1 failed, 13 passed, 14 skipped, 6943 deselected, 84 warnings in 10053.52s (2:47:33) =

> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava FAILED

> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 4142293.
Skipping invalid pid: 4142294.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 4141177

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 54m 20s
119 actionable tasks: 83 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/3td7oqrqnrxis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
