See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/828/display/redirect>
Changes:
------------------------------------------
[...truncated 464.90 KB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T02:55:33.950Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T02:55:41.210Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T02:58:30.940Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T02:58:42.205Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T02:58:42.371Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:01:08.857Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_19_55_28-11897525564990056298 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table [test-table-1697856912-0dbf12]
PASSED apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation
-------------------------------- live log call ---------------------------------
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table [test-table-1697857279-a9ffa5]
INFO apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl" to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations.
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f1271045120> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f12710459e0> ====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030126-980478-1nnf7i0m.1697857286.980632/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030126-980478-1nnf7i0m.1697857286.980632/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar in 8 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030126-980478-1nnf7i0m.1697857286.980632/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030126-980478-1nnf7i0m.1697857286.980632/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030126-980478-1nnf7i0m.1697857286.980632/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030126-980478-1nnf7i0m.1697857286.980632/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job clientRequestId: '20231021030126981360-1588' createTime: '2023-10-21T03:01:37.019970Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-10-20_20_01_36-13772743150189430663' location: 'us-central1' name: 'beamapp-jenkins-1021030126-980478-1nnf7i0m' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-10-21T03:01:37.019970Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-10-20_20_01_36-13772743150189430663]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-10-20_20_01_36-13772743150189430663
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_01_36-13772743150189430663?project=apache-beam-testing Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_01_36-13772743150189430663?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_01_36-13772743150189430663?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_01_36-13772743150189430663 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:01:39.969Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:01:43.058Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:01:43.197Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:01:50.143Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter.
If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:04:51.815Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:05:02.309Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:05:02.489Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:07:08.339Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_01_36-13772743150189430663 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table [test-table-1697857279-a9ffa5]
PASSED apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation
-------------------------------- live log call ---------------------------------
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table [test-table-1697857637-d5dda5]
INFO apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl" to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations.
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f1271045120> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f12710459e0> ====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030726-585671-z3mbqqd2.1697857646.585821/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030726-585671-z3mbqqd2.1697857646.585821/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar in 6 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030726-585671-z3mbqqd2.1697857646.585821/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030726-585671-z3mbqqd2.1697857646.585821/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 1 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030726-585671-z3mbqqd2.1697857646.585821/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021030726-585671-z3mbqqd2.1697857646.585821/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job clientRequestId: '20231021030726586569-1331' createTime: '2023-10-21T03:07:34.612182Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-10-20_20_07_34-9905846582743998394' location: 'us-central1' name: 'beamapp-jenkins-1021030726-585671-z3mbqqd2' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-10-21T03:07:34.612182Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-10-20_20_07_34-9905846582743998394]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-10-20_20_07_34-9905846582743998394
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_07_34-9905846582743998394?project=apache-beam-testing Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_07_34-9905846582743998394?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_07_34-9905846582743998394?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_07_34-9905846582743998394 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:07:37.811Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:07:40.246Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:07:40.305Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:08:05.999Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter.
If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:10:41.587Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:12:10.853Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:12:11.888Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:12:18.226Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors.
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:15:11.894Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:15:12.415Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:17:17.275Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_07_34-9905846582743998394 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table [test-table-1697857637-d5dda5]
PASSED apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation
-------------------------------- live log call ---------------------------------
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table [test-table-1697858254-44b2af]
INFO apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl" to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations.
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f1271045120> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f12710459e0> ====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021031741-200480-n1r5vqwm.1697858261.200628/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021031741-200480-n1r5vqwm.1697858261.200628/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar in 6 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021031741-200480-n1r5vqwm.1697858261.200628/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021031741-200480-n1r5vqwm.1697858261.200628/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021031741-200480-n1r5vqwm.1697858261.200628/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021031741-200480-n1r5vqwm.1697858261.200628/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job clientRequestId: '20231021031741201335-3036' createTime: '2023-10-21T03:17:49.391940Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-10-20_20_17_48-993476348333204943' location: 'us-central1' name: 'beamapp-jenkins-1021031741-200480-n1r5vqwm' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-10-21T03:17:49.391940Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-10-20_20_17_48-993476348333204943]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-10-20_20_17_48-993476348333204943
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_17_48-993476348333204943?project=apache-beam-testing Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_17_48-993476348333204943?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_17_48-993476348333204943?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_17_48-993476348333204943 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:17:52.752Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:17:54.845Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:17:54.911Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:18:22.829Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter.
If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:20:56.325Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:21:06.941Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:21:07.125Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:25:45.779Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:25:48.588Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:25:49.366Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter.
If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_17_48-993476348333204943 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table [test-table-1697858254-44b2af]
PASSED apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
-------------------------------- live log call ---------------------------------
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table [test-table-1697858789-ef85db]
INFO apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl" to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations.
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function pack_combiners at 0x7f1271045120> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 ==================== <function sort_stages at 0x7f12710459e0> ====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021032636-205598-k089e49b.1697858796.205750/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021032636-205598-k089e49b.1697858796.205750/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-y4Obo1uagHUEUYL7F_Uj_5F4OkFLAGNwMWZ5NKmzneQ.jar in 6 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021032636-205598-k089e49b.1697858796.205750/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021032636-205598-k089e49b.1697858796.205750/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021032636-205598-k089e49b.1697858796.205750/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1021032636-205598-k089e49b.1697858796.205750/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job clientRequestId: '20231021032636206502-7422' createTime: '2023-10-21T03:26:44.236690Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-10-20_20_26_43-4637293118424554160' location: 'us-central1' name: 'beamapp-jenkins-1021032636-205598-k089e49b' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-10-21T03:26:44.236690Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-10-20_20_26_43-4637293118424554160]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-10-20_20_26_43-4637293118424554160
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_26_43-4637293118424554160?project=apache-beam-testing Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_26_43-4637293118424554160?project=apache-beam-testing [32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: [32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-20_20_26_43-4637293118424554160?project=apache-beam-testing [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_26_43-4637293118424554160 is in state JOB_STATE_RUNNING [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:26:48.462Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:26:50.632Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:26:50.680Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b... 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:26:51.220Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:26:59.829Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:27:13.074Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:44.779Z: JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to receive work requests. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:45.074Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3759>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:45.117Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:46.628Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:46.678Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:50.543Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:29:50.912Z: JOB_MESSAGE_BASIC: Stopping worker pool... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-10-21T03:32:12.282Z: JOB_MESSAGE_BASIC: Worker pool stopped. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 2023-10-20_20_26_43-4637293118424554160 is in state JOB_STATE_DONE [32mINFO [0m apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table [test-table-1697858789-ef85db] [32mPASSED[0m [1m------------------------------ live log teardown -------------------------------[0m [32mINFO [0m apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:198 Deleting instance [bt-write-xlang-1697856908-28220c] [33m=============================== warnings summary ===============================[0m ../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17 <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. 
Use setuptools or check PEP 632 for potential alternatives from distutils import util -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html - generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> - [33m=== [32m13 passed[0m, [33m[1m19 skipped[0m, [33m[1m7191 deselected[0m, [33m[1m1 warning[0m[33m in 5031.24s (1:23:51)[0m[33m ====[0m > Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup Stopping expansion service pid: 3134974. Skipping invalid pid: 3134975. > Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup Killing process at 3134322 FAILURE: Build failed with an exception. * What went wrong: Execution failed for task ':sdks:python:test-suites:dataflow:py38:installGcpTest'. > Process 'command 'sh'' finished with non-zero exit value 1 * Try: > Run with --stacktrace option to get the stack trace. > Run with --info or --debug option to get more log output. > Get more help at https://help.gradle.org. Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0. You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins. For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation. BUILD FAILED in 1h 34m 120 actionable tasks: 85 executed, 33 from cache, 2 up-to-date Publishing build scan... https://ge.apache.org/s/vepn3eimji4hg Build step 'Invoke Gradle script' changed build result to FAILURE Build step 'Invoke Gradle script' marked build as failure --------------------------------------------------------------------- To unsubscribe, e-mail: [email protected] For additional commands, e-mail: [email protected]
