See <https://ci-beam.apache.org/job/beam_PostCommit_Python311/853/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #29169 [YAML] fix javascript UDF output in mapping


------------------------------------------
[...truncated 11.93 MB...]
+++++++++++++++++++++++++++++++++++ Timeout ++++++++++++++++++++++++++++++++++++

[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_column_selection_and_row_restriction_rows
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
[gw4] FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write
apache_beam/examples/inference/tfx_bsl/tfx_bsl_inference_it_test.py::TFXRunInferenceTests::test_tfx_run_inference_mobilenetv2
[gw4] SKIPPED apache_beam/examples/inference/tfx_bsl/tfx_bsl_inference_it_test.py::TFXRunInferenceTests::test_tfx_run_inference_mobilenetv2
[gw7] PASSED apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes
apache_beam/ml/gcp/videointelligenceml_test_it.py::VideoIntelligenceMlTestIT::test_label_detection_with_video_context
[gw7] SKIPPED apache_beam/ml/gcp/videointelligenceml_test_it.py::VideoIntelligenceMlTestIT::test_label_detection_with_video_context
apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py::ExerciseMetricsPipelineTest::test_metrics_it
[gw7] SKIPPED apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py::ExerciseMetricsPipelineTest::test_metrics_it
[gw5] PASSED apache_beam/io/gcp/healthcare/dicomio_integration_test.py::DICOMIoIntegrationTest::test_dicom_store_instance_from_gcs
apache_beam/testing/test_stream_it_test.py::TestStreamIntegrationTests::test_basic_execution
[gw5] SKIPPED apache_beam/testing/test_stream_it_test.py::TestStreamIntegrationTests::test_basic_execution
apache_beam/testing/test_stream_it_test.py::TestStreamIntegrationTests::test_multiple_outputs
[gw5] SKIPPED apache_beam/testing/test_stream_it_test.py::TestStreamIntegrationTests::test_multiple_outputs
apache_beam/testing/test_stream_it_test.py::TestStreamIntegrationTests::test_multiple_outputs_with_watermark_advancement
[gw5] SKIPPED apache_beam/testing/test_stream_it_test.py::TestStreamIntegrationTests::test_multiple_outputs_with_watermark_advancement
[gw1] PASSED apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_native_datetime
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
[gw1] PASSED apache_beam/io/gcp/bigquery_test.py::BigQueryFileLoadsIntegrationTests::test_avro_file_load
apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
[gw1] PASSED apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification
apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification_large_model
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_query_and_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
[gw1] PASSED apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification_large_model
apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression
[gw1] SKIPPED apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_row_restriction
apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadUsingStorageApiTests::test_iobase_source_with_very_selective_filters
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_iobase_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadAllBQTests::test_read_queries
apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner
[gw6] PASSED apache_beam/io/gcp/bigquery_read_it_test.py::ReadInteractiveRunnerTests::test_read_in_interactive_runner

=================================== FAILURES ===================================
___________________ BigtableIOWriteTest.test_bigtable_write ____________________
[gw4] linux -- Python 3.11.2 <https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/bin/python3.11>

self = <apache_beam.examples.cookbook.bigtableio_it_test.BigtableIOWriteTest testMethod=test_bigtable_write>

    @pytest.mark.it_postcommit
    def test_bigtable_write(self):
      number = self.number
      pipeline_args = self.test_pipeline.options_list
      pipeline_options = PipelineOptions(pipeline_args)
    
>     with beam.Pipeline(options=pipeline_options) as pipeline:

apache_beam/examples/cookbook/bigtableio_it_test.py:183: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/pipeline.py:608: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:561: in run
    self._options).run(False)
apache_beam/pipeline.py:585: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20231101170410081699-6489'
 createTime: '2023-11-01T17:04:52.921233Z'
...023-11-01T17:04:52.921233Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f69945f7b90>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/";
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:758: Failed
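
Note: the "Failed: Timeout >4500.0s" above is raised by the pytest test harness, not by Dataflow; the job itself was still running and is only cancelled further down in this log once the harness gives up. As an illustration only, and assuming the pytest-timeout plugin that produces this message, a per-test limit of this kind is typically declared like so (hypothetical test name, not Beam's actual suite configuration):

    # Hypothetical example: a pytest-timeout limit matching the 4500 s
    # budget seen in this failure.
    import pytest

    @pytest.mark.timeout(4500)  # seconds; requires the pytest-timeout plugin
    def test_long_running_dataflow_job():
        ...  # submit the pipeline and wait_until_finish() here

The same plugin also accepts a suite-wide --timeout command-line option.
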
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 
Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/bin/python3.11',>
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'/tmp/tmpjgzqc1si/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', 
'--implementation', 'cp', '--abi', 'cp311', '--platform', 
'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
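
For reference, the pre-building workflow suggested in the message above is driven by pipeline options. The sketch below is illustrative only, with placeholder <...> values, and assumes the SetupOptions flags described in the linked guide:

    # Illustrative sketch: bake the extra dependencies into a custom SDK
    # container once (here via Cloud Build) instead of pip-installing them
    # on every Dataflow worker at startup. All <...> values are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=<ProjectId>',
        '--region=us-central1',
        '--temp_location=gs://<bucket>/temp',
        '--requirements_file=requirements.txt',
        '--prebuild_sdk_container_engine=cloud_build',  # or 'local_docker'
        '--docker_registry_push_url=<region>-docker.pkg.dev/<ProjectId>/<repo>',
    ])

The resulting options object would then be passed to beam.Pipeline(options=options), as in the failing test above.
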
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/beam_python3.11_sdk:beam-master-20231023
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.11_sdk:beam-master-20231023" for 
Docker environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f69c81b87c0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f69c81b9080> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/seaborn-0.13.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/seaborn-0.13.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/transformers-4.34.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/transformers-4.34.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/inflection-0.5.1-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/inflection-0.5.1-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/beautifulsoup4-4.12.2-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/beautifulsoup4-4.12.2-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/torch-2.1.0-cp38-cp38-manylinux1_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/torch-2.1.0-cp38-cp38-manylinux1_x86_64.whl
 in 36 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/torchvision-0.16.0-cp38-cp38-manylinux1_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/torchvision-0.16.0-cp38-cp38-manylinux1_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/Pillow-10.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/Pillow-10.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/matplotlib-3.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/matplotlib-3.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/matplotlib-3.8.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/matplotlib-3.8.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/scikit_learn-1.2.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/scikit_learn-1.2.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1101170410-080500-rm23eerb.1698858250.080657/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231101170410081699-6489'
 createTime: '2023-11-01T17:04:52.921233Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-11-01_10_04_52-11058546774289840040'
 location: 'us-central1'
 name: 'beamapp-jenkins-1101170410-080500-rm23eerb'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-11-01T17:04:52.921233Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-11-01_10_04_52-11058546774289840040]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-11-01_10_04_52-11058546774289840040
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-01_10_04_52-11058546774289840040?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-11-01_10_04_52-11058546774289840040?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-11-01_10_04_52-11058546774289840040 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:04:56.394Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:04:58.867Z: JOB_MESSAGE_BASIC: Executing operation Generate 
Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:04:58.931Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:04:59.434Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:05:08.085Z: JOB_MESSAGE_BASIC: Executing operation Generate 
Direct Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at 
core.py:3774>)+Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate
 Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate 
Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:05:14.477Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
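
The metric-descriptor message above is informational, but if the custom.googleapis.com/* quota ever needs to be reclaimed, the linked API can be scripted. A rough, illustrative sketch with the google-cloud-monitoring client, shown in dry-run form (delete only descriptors that are verifiably unused):

    # Illustrative only: list custom metric descriptors in the project used
    # by this job and print which ones a cleanup would remove.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # project from this job

    for descriptor in client.list_metric_descriptors(name=project_name):
        if descriptor.type.startswith('custom.googleapis.com/'):
            print('candidate for deletion:', descriptor.name)
            # client.delete_metric_descriptor(name=descriptor.name)
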
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:08:20.248Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:08:20.666Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct 
Rows/Create/Impulse+Generate Direct Rows/Create/FlatMap(<lambda at 
core.py:3774>)+Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate
 Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate 
Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:08:20.714Z: JOB_MESSAGE_BASIC: Executing operation Generate 
Direct Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:08:21.631Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T17:08:21.692Z: JOB_MESSAGE_BASIC: Executing operation Generate 
Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate 
Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate
 Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate
 Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct 
Rows/Create/Map(decode)+Generate Direct 
Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T18:18:59.908Z: JOB_MESSAGE_BASIC: Cancel request is committed for 
workflow job: 2023-11-01_10_04_52-11058546774289840040.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T18:18:59.933Z: JOB_MESSAGE_BASIC: Finished operation Generate Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate 
Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate
 Direct 
Rows/Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate
 Direct Rows/Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate Direct 
Rows/Create/Map(decode)+Generate Direct 
Rows/WriteToBigTable/ParDo(_BigTableWriteFn)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-11-01T18:19:00.129Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-11-01_10_04_52-11058546774289840040 is in state JOB_STATE_CANCELLING
=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17:
 DeprecationWarning: The distutils package is deprecated and slated for removal 
in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util

apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47:
 FutureWarning: The default value of numeric_only in DataFrame.mean is 
deprecated. In a future version, it will default to False. In addition, 
specifying 'numeric_only=None' is deprecated. Select only valid columns or 
specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/pytest_postCommitIT-df-py311.xml> -
=========================== short test summary info ============================
FAILED apache_beam/examples/cookbook/bigtableio_it_test.py::BigtableIOWriteTest::test_bigtable_write - Failed: Timeout >4500.0s
====== 1 failed, 86 passed, 51 skipped, 17 warnings in 8067.12s (2:14:27) ======

> Task :sdks:python:test-suites:dataflow:py311:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py311:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.4/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 2h 22m 52s
214 actionable tasks: 153 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/55jnm4qkqeoly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

