See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/811/display/redirect>

Changes:


------------------------------------------
[...truncated 325.25 KB...]
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:56:37.208Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:56:39.187Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:56:39.249Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:56:48.521Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:59:46.647Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:59:50.844Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T20:59:51.034Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:06:49.611Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_13_56_33-14978668993266418114 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697489769-55cb6f]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_cells_with_timerange_mutation
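Note: each test run above creates and then deletes a throwaway Bigtable table with a name like test-table-1697489769-55cb6f, i.e. a fixed prefix, the Unix epoch seconds, and a short hex suffix. The exact helper inside bigtableio_it_test.py is not shown in this log, so the following stdlib-only sketch of that naming scheme is an illustration, not the actual test code:

```python
import secrets
import time


def temp_table_id(prefix: str = "test-table") -> str:
    """Build a throwaway table id shaped like the ones in this log:
    <prefix>-<unix epoch seconds>-<6 random hex chars>."""
    return f"{prefix}-{int(time.time())}-{secrets.token_hex(3)}"


# A fresh id per test run; the timestamp plus random suffix keeps
# concurrent CI jobs from colliding on the same table name.
table_id = temp_table_id()
```

Deleting the table in teardown (as the "Deleting table [...]" lines show) keeps repeated CI runs from accumulating stale tables in the shared test project.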
 
-------------------------------- live log call 
---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1697490419-f4b8f6]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7fbee76491c0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7fbee7649a80> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016210705-004021-1nnf7i0m.1697490425.004199/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016210705-004021-1nnf7i0m.1697490425.004199/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar
 in 7 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016210705-004021-1nnf7i0m.1697490425.004199/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016210705-004021-1nnf7i0m.1697490425.004199/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016210705-004021-1nnf7i0m.1697490425.004199/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016210705-004021-1nnf7i0m.1697490425.004199/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231016210705004901-1588'
 createTime: '2023-10-16T21:07:16.595532Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-16_14_07_13-3930331599292015782'
 location: 'us-central1'
 name: 'beamapp-jenkins-1016210705-004021-1nnf7i0m'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-16T21:07:16.595532Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-16_14_07_13-3930331599292015782]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-16_14_07_13-3930331599292015782
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_07_13-3930331599292015782?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_07_13-3930331599292015782?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_07_13-3930331599292015782?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_07_13-3930331599292015782 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:07:19.521Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:07:21.776Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:07:21.834Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:07:41.683Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:12:06.618Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:12:28.974Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:12:29.156Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:14:33.904Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_07_13-3930331599292015782 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697490419-f4b8f6]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_column_family_mutation
 
-------------------------------- live log call 
---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1697490881-c231f6]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7fbee76491c0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7fbee7649a80> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016211449-733624-z3mbqqd2.1697490889.733766/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016211449-733624-z3mbqqd2.1697490889.733766/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar
 in 6 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016211449-733624-z3mbqqd2.1697490889.733766/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016211449-733624-z3mbqqd2.1697490889.733766/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016211449-733624-z3mbqqd2.1697490889.733766/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016211449-733624-z3mbqqd2.1697490889.733766/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231016211449734448-1331'
 createTime: '2023-10-16T21:14:58.772949Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-16_14_14_57-9554414908278725061'
 location: 'us-central1'
 name: 'beamapp-jenkins-1016211449-733624-z3mbqqd2'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-16T21:14:58.772949Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-16_14_14_57-9554414908278725061]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-16_14_14_57-9554414908278725061
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_14_57-9554414908278725061?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_14_57-9554414908278725061?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_14_57-9554414908278725061?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_14_57-9554414908278725061 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:15:01.796Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:15:03.652Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:15:03.710Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:15:23.796Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:18:19.462Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:18:41.928Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:18:42.115Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:20:47.780Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_14_57-9554414908278725061 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697490881-c231f6]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation
 
-------------------------------- live log call 
---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1697491254-9d6f54]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7fbee76491c0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7fbee7649a80> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212059-929238-n1r5vqwm.1697491259.929398/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212059-929238-n1r5vqwm.1697491259.929398/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar
 in 6 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212059-929238-n1r5vqwm.1697491259.929398/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212059-929238-n1r5vqwm.1697491259.929398/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212059-929238-n1r5vqwm.1697491259.929398/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212059-929238-n1r5vqwm.1697491259.929398/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231016212059930112-3036'
 createTime: '2023-10-16T21:21:08.464761Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-16_14_21_07-10625663602599016003'
 location: 'us-central1'
 name: 'beamapp-jenkins-1016212059-929238-n1r5vqwm'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-16T21:21:08.464761Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-16_14_21_07-10625663602599016003]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-16_14_21_07-10625663602599016003
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_21_07-10625663602599016003?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_21_07-10625663602599016003?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_21_07-10625663602599016003?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_21_07-10625663602599016003 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:21:11.508Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:21:13.674Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:21:15.888Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:21:27.922Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:24:33.119Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:24:36.716Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:24:36.948Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:26:58.716Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_21_07-10625663602599016003 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697491254-9d6f54]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
 
-------------------------------- live log call 
---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1697491628-5d7620]
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.11_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7fbee76491c0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7fbee7649a80> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212724-174776-k089e49b.1697491644.174990/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212724-174776-k089e49b.1697491644.174990/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-wjNog2Id_Hf6BQcHUAQthOsLwdTT95wrBwjKMdWXNnE.jar
 in 5 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212724-174776-k089e49b.1697491644.174990/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212724-174776-k089e49b.1697491644.174990/apache_beam-2.52.0.dev0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212724-174776-k089e49b.1697491644.174990/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1016212724-174776-k089e49b.1697491644.174990/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231016212724175983-7422'
 createTime: '2023-10-16T21:27:34.125370Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-16_14_27_30-13963192753125221370'
 location: 'us-central1'
 name: 'beamapp-jenkins-1016212724-174776-k089e49b'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-16T21:27:34.125370Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-16_14_27_30-13963192753125221370]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-16_14_27_30-13963192753125221370
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_27_30-13963192753125221370?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_27_30-13963192753125221370?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-16_14_27_30-13963192753125221370?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_27_30-13963192753125221370 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:27:37.287Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:27:39.880Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:27:39.933Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:27:40.480Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:27:49.085Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:28:13.292Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:31:42.417Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:31:43.793Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:31:43.840Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:31:44.307Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:31:44.352Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:32:07.129Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:32:07.291Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-16T21:34:12.343Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-16_14_27_30-13963192753125221370 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697491628-5d7620]
PASSED
------------------------------ live log teardown -------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:198 Deleting 
instance [bt-write-xlang-1697489763-6cae40]


=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
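
The DeprecationWarning above advises migrating off distutils, which is removed in Python 3.12. As one illustration of that migration, here is a minimal, hypothetical stand-in for `distutils.util.strtobool` (a helper PEP 632 leaves without a direct stdlib replacement); this sketch is not part of the Beam or google-api-core codebases, and it returns a bool rather than the 0/1 int the distutils original returned:

```python
def strtobool(value: str) -> bool:
    """Map truthy/falsy strings to a bool, accepting the same
    values distutils.util.strtobool did."""
    normalized = value.strip().lower()
    if normalized in ("y", "yes", "t", "true", "on", "1"):
        return True
    if normalized in ("n", "no", "f", "false", "off", "0"):
        return False
    raise ValueError(f"invalid truth value {value!r}")
```

For most other distutils usage (compilers, commands, sysconfig data), the setuptools project documents equivalents in its porting guide.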

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=== 13 passed, 19 skipped, 7190 deselected, 1 warning in 5129.69s (1:25:29) ====

> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 723592.
Skipping invalid pid: 723593.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 720363

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build.gradle>' line: 97

* What went wrong:
Execution failed for task ':sdks:python:bdistPy38linux'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.
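
As a concrete form of Gradle's hints above, the failing task could be re-run locally with extra diagnostics. The task name is taken from the "What went wrong" section and the flags from Gradle's own suggestions; this is a sketch of a diagnostic re-run, not a verified reproduction of the failure:

```shell
# Re-run only the failing task, capturing the full stack trace
# and verbose log output that the summary above omits.
./gradlew :sdks:python:bdistPy38linux --stacktrace --info

# Separately, surface the individual deprecation warnings behind
# the "incompatible with Gradle 9.0" notice.
./gradlew :sdks:python:bdistPy38linux --warning-mode all
```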

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 1h 36m 3s
120 actionable tasks: 85 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/ka7gfidyiaotq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

