See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/695/display/redirect>

Changes:


------------------------------------------
[...truncated 333.63 KB...]
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-09-17_14_16_51-11821572761830337918]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-09-17_14_16_51-11821572761830337918
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_16_51-11821572761830337918?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_16_51-11821572761830337918?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_16_51-11821572761830337918?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-09-17_14_16_51-11821572761830337918 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:16:55.202Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:16:57.253Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3736>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:16:57.311Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:17:08.768Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
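
For reference, the cleanup suggested in the message above can be scripted against the Cloud Monitoring API. A minimal sketch using the google-cloud-monitoring client, assuming the goal is to list and delete unused custom.googleapis.com descriptors in the apache-beam-testing project (the filter is a placeholder to adapt, and deletion is irreversible):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name="projects/apache-beam-testing",
        # Match only the Dataflow-created custom metrics named in the log message.
        filter='metric.type = starts_with("custom.googleapis.com/")',
    )
    for descriptor in client.list_metric_descriptors(request=request):
        # Review before deleting: this permanently removes the descriptor.
        client.delete_metric_descriptor(name=descriptor.name)
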
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:20:15.455Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:20:20.429Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3736>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:20:20.587Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:22:25.772Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-09-17_14_16_51-11821572761830337918 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1694985394-8fac5c]
PASSED
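
The operation names in the log above show the Python WriteToBigTable transform expanding through a schema-aware external transform into the Java BigtableIO.Write. As a rough sketch of the kind of pipeline these cross-language tests exercise (the shape is assumed, not taken from the test source, and the project/instance/table IDs are placeholders):

    import apache_beam as beam
    from apache_beam.io.gcp.bigtableio import WriteToBigTable
    from google.cloud.bigtable.row import DirectRow

    def to_direct_row(i):
        # Each element becomes one Bigtable DirectRow mutation.
        row = DirectRow(row_key=f'key-{i}'.encode('utf-8'))
        row.set_cell('cf1', b'col', f'value-{i}'.encode('utf-8'))
        return row

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(range(10))
             | beam.Map(to_direct_row)
             | WriteToBigTable(project_id='my-project',
                               instance_id='my-instance',
                               table_id='my-table'))
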
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation
 
-------------------------------- live log call 
---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1694985752-4adc52]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:399 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
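
The pre-building workflow referenced here is driven by pipeline options. A hedged example of passing the relevant flags from Python, per the linked guide (the registry URL is a placeholder and option names should be checked against the Beam release in use):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://my-bucket/tmp',
        # Build the SDK container once up front instead of installing the
        # extra dependencies on every worker at startup.
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',
    ])
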
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f35610708b0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f35610710d0> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212239-590859-q2cb7r68.1694985759.591010/beam-sdks-java-io-google-cloud-platform-expansion-service-2.51.0-SNAPSHOT-B11e7Uv-3rQYUhNWUgGOe40xu1BVYbGsM5CR04QMDAA.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212239-590859-q2cb7r68.1694985759.591010/beam-sdks-java-io-google-cloud-platform-expansion-service-2.51.0-SNAPSHOT-B11e7Uv-3rQYUhNWUgGOe40xu1BVYbGsM5CR04QMDAA.jar
 in 5 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212239-590859-q2cb7r68.1694985759.591010/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212239-590859-q2cb7r68.1694985759.591010/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212239-590859-q2cb7r68.1694985759.591010/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212239-590859-q2cb7r68.1694985759.591010/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20230917212239591802-4538'
 createTime: '2023-09-17T21:22:46.724645Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-09-17_14_22_46-16985846457091155370'
 location: 'us-central1'
 name: 'beamapp-jenkins-0917212239-590859-q2cb7r68'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-09-17T21:22:46.724645Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-09-17_14_22_46-16985846457091155370]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-09-17_14_22_46-16985846457091155370
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_22_46-16985846457091155370?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_22_46-16985846457091155370?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_22_46-16985846457091155370?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-09-17_14_22_46-16985846457091155370 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:22:49.842Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:22:51.413Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3736>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:22:51.470Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:23:05.054Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:26:29.383Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:26:34.128Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3736>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:26:34.338Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:28:39.242Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-09-17_14_22_46-16985846457091155370 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1694985752-4adc52]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
 
-------------------------------- live log call 
---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1694986132-3fe8c8]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:399 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f35610708b0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f35610710d0> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212900-387437-s8f6pddc.1694986140.387588/beam-sdks-java-io-google-cloud-platform-expansion-service-2.51.0-SNAPSHOT-B11e7Uv-3rQYUhNWUgGOe40xu1BVYbGsM5CR04QMDAA.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212900-387437-s8f6pddc.1694986140.387588/beam-sdks-java-io-google-cloud-platform-expansion-service-2.51.0-SNAPSHOT-B11e7Uv-3rQYUhNWUgGOe40xu1BVYbGsM5CR04QMDAA.jar
 in 5 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212900-387437-s8f6pddc.1694986140.387588/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212900-387437-s8f6pddc.1694986140.387588/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212900-387437-s8f6pddc.1694986140.387588/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0917212900-387437-s8f6pddc.1694986140.387588/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20230917212900388407-7802'
 createTime: '2023-09-17T21:29:07.676851Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-09-17_14_29_06-13272129286542065041'
 location: 'us-central1'
 name: 'beamapp-jenkins-0917212900-387437-s8f6pddc'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-09-17T21:29:07.676851Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-09-17_14_29_06-13272129286542065041]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-09-17_14_29_06-13272129286542065041
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_29_06-13272129286542065041?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_29_06-13272129286542065041?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-09-17_14_29_06-13272129286542065041?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-09-17_14_29_06-13272129286542065041 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:29:10.754Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:29:13.120Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:29:13.176Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:29:13.807Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:29:22.331Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3736>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:29:43.622Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:09.855Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:10.184Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3736>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:10.249Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:11.717Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:11.803Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:16.160Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:32:16.339Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 
2023-09-17T21:34:21.037Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 
2023-09-17_14_29_06-13272129286542065041 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1694986132-3fe8c8]
PASSED
------------------------------ live log teardown 
-------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:198 Deleting 
instance [bt-write-xlang-1694984646-9cb97a]


=============================== warnings summary 
===============================
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/hdfs/config.py:28
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/hdfs/config.py>:28:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source
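
The importlib-based replacement this warning points to looks roughly like the following (a sketch, not the hdfs package's actual code; the module name and path are placeholders):

    import importlib.util

    spec = importlib.util.spec_from_file_location("hdfs_config", "/path/to/config.py")
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # rough equivalent of imp.load_source(...)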

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py:18
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py>:18:
 DeprecationWarning: pkg_resources is deprecated as an API. See 
https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources
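
For the common lookup cases, the standard-library replacement suggested by the setuptools documentation is importlib.metadata (available since Python 3.8); a minimal sketch:

    from importlib.metadata import version

    # Rough replacement for pkg_resources.get_distribution("apache-beam").version
    print(version("apache-beam"))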

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871:
 19 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
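
For context, the PEP 420 alternative recommended by this warning drops the explicit declaration entirely; a sketch of the difference, using the 'google' namespace from the warning as the example:

    # Legacy pkg_resources-style namespace package:
    #   google/__init__.py contains
    #       __import__('pkg_resources').declare_namespace(__name__)
    #
    # PEP 420 implicit namespace package:
    #   ship the google/ directory with no __init__.py at all; Python then
    #   merges the namespace across installed distributions automatically.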

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871:
 16 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.cloud')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2350:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.logging')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.iam')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py:20
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py>:20:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.rpc')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/examples/snippets/snippets_test.py>:767:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location = 'gs://mylocation'
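
One way to avoid this warning in user code is to configure the options before constructing the pipeline rather than mutating <pipeline>.options afterwards; a minimal sketch (the bucket name is a placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions([])
    options.view_as(GoogleCloudOptions).temp_location = 'gs://mylocation'

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])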

apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2102:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2108:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/snippets/snippets_test.py: 2 warnings
apache_beam/io/external/xlang_bigqueryio_it_test.py: 10 warnings
apache_beam/io/gcp/bigtableio_it_test.py: 6 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/transforms/external.py>:717:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    self._expansion_service, pipeline.options)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml>
 -
== 14 passed, 17 skipped, 7173 deselected, 79 warnings in 5149.56s (1:25:49) ===

> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguageCleanup
Stopping expansion service pid: 1902764.
Skipping invalid pid: 1902765.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 1896744

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py311:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 36m 6s
114 actionable tasks: 78 executed, 32 from cache, 4 up-to-date

Publishing build scan...
Publishing build scan failed due to network error 
'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
Publishing build scan failed due to network error 
'java.net.SocketTimeoutException: Read timed out' (1 retry remaining)...

A network error occurred.

If you require assistance with this problem, please report it to your Gradle 
Enterprise administrator and include the following information via copy/paste.

----------
Gradle version: 7.6.2
Plugin version: 3.13.2
Request URL: https://ge.apache.org/scans/publish/gradle/3.13.2/upload
Request ID: e8a14224-b938-4f4a-9261-04511cd9071e
Exception: java.net.SocketTimeoutException: Read timed out
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

