See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/510/display/redirect?page=changes>
Changes:

[noreply] Bump github.com/fsouza/fake-gcs-server from 1.47.0 to 1.47.2 in /sdks

------------------------------------------
[...truncated 598.26 KB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:27:59.118Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:28:21.547Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:28:44.969Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:30:44.764Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:30:58.201Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:31:01.705Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3736>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:31:01.748Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:31:01.795Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:31:01.833Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:31:01.855Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:07.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:07.237Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:07.268Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:155 Job 2023-08-02_08_27_53-7211226018045026513 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:188 Deleting table [test-table-1690990059-eebe9d]
PASSED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
-------------------------------- live log call ---------------------------------
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:184 Created table [test-table-1690990394-10c903]
INFO apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.50.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"> to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:388 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO root:environments.py:295 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO root:environments.py:302 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker environment
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function pack_combiners at 0x7f51954f5dc0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function sort_stages at 0x7f51957505e0> ====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:404 Defaulting to the temp_location as staging_location: gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/beam-sdks-java-io-google-cloud-platform-expansion-service-2.50.0-SNAPSHOT-SCrrg1403lU3QUDO9pZlmlRV3_fhZsdDd_PpPqdbz4c.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/beam-sdks-java-io-google-cloud-platform-expansion-service-2.50.0-SNAPSHOT-SCrrg1403lU3QUDO9pZlmlRV3_fhZsdDd_PpPqdbz4c.jar in 5 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/apache_beam-2.50.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/apache_beam-2.50.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0802153320-714507-vkmiwwtn.1690990400.714967/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job clientRequestId: '20230802153320715862-1138' createTime: '2023-08-02T15:33:28.725986Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-08-02_08_33_28-10629615339438754731' location: 'us-central1' name: 'beamapp-jenkins-0802153320-714507-vkmiwwtn' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-08-02T15:33:28.725986Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-08-02_08_33_28-10629615339438754731]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-08-02_08_33_28-10629615339438754731
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-02_08_33_28-10629615339438754731?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-02_08_33_28-10629615339438754731?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-02_08_33_28-10629615339438754731?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:155 Job 2023-08-02_08_33_28-10629615339438754731 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:29.476Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2023-08-02_08_33_28-10629615339438754731. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:29.521Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2023-08-02_08_33_28-10629615339438754731.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:31.845Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.147Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.169Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.211Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.235Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.258Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.280Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.308Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.326Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:3736>) into Create/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:3736>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.359Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.377Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.397Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.437Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.459Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.477Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.498Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow) into Create/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.522Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous) into WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.539Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter) into WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.569Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.590Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.610Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.630Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.730Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.772Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.808Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:33.826Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:34.111Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:42.938Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:33:42.977Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3736>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:34:04.342Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:34:38.837Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:36:34.063Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:36:50.582Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:36:50.946Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3736>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:36:50.987Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:36:51.896Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:36:51.935Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:37:19.141Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:37:19.185Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:37:19.229Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:37:19.270Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:37:19.289Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:39:40.416Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:39:40.454Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 2023-08-02T15:39:40.475Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:155 Job 2023-08-02_08_33_28-10629615339438754731 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:188 Deleting table [test-table-1690990394-10c903]
PASSED
------------------------------ live log teardown -------------------------------
INFO apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:196 Deleting instance [bt-write-xlang-1690988962-7ef895]

=============================== warnings summary ===============================
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py:18
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py>:18: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: 20 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: 16 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`.
  See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2350: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py:20
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)
apache_beam/typehints/pandas_type_compatibility_test.py:67
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/examples/snippets/snippets_test.py>:767: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location = 'gs://mylocation'
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2101: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/examples/snippets/snippets_test.py::SnippetsTest::test_model_bigqueryio_xlang
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2107: BeamDeprecationWarning: options is deprecated since First stable release.
  References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/snippets/snippets_test.py: 2 warnings
apache_beam/io/external/xlang_bigqueryio_it_test.py: 10 warnings
apache_beam/io/gcp/bigtableio_it_test.py: 6 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/transforms/external.py>:676: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self._expansion_service, pipeline.options)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
== 14 passed, 14 skipped, 6937 deselected, 83 warnings in 5375.10s (1:29:35) ===

> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguageCleanup
Stopping expansion service pid: 804629.
Skipping invalid pid: 804630.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 798157

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build.gradle'> line: 96

* What went wrong:
Execution failed for task ':sdks:python:bdistPy311linux'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 41m 33s

115 actionable tasks: 79 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/co7f4kdothqm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
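For context on the cross-language Bigtable write exercised by the passing bigtableio_it_test runs above, here is a minimal sketch of a pipeline with the same shape (Create followed by WriteToBigTable, which hands elements to the Java BigtableIO.Write via the expansion service). It assumes the WriteToBigTable constructor in apache_beam.io.gcp.bigtableio takes project_id/instance_id/table_id; the IDs below are placeholders, and the integration test creates its own instance and tables rather than using fixed names.

    import apache_beam as beam
    from apache_beam.io.gcp.bigtableio import WriteToBigTable
    from google.cloud.bigtable.row import DirectRow

    def to_direct_row(i):
        # DirectRow carries the cell mutations; the cross-language transform
        # converts these via the ParDo(_DirectRowMutationsToBeamRow) stage
        # visible in the fused graph above.
        row = DirectRow(row_key=('key-%d' % i).encode())
        row.set_cell('cf', b'col', ('value-%d' % i).encode())
        return row

    with beam.Pipeline() as p:
        (p
         | beam.Create(range(100))
         | beam.Map(to_direct_row)
         | WriteToBigTable(project_id='my-project',      # placeholder
                           instance_id='my-instance',    # placeholder
                           table_id='my-table'))         # placeholder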
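Separately, the FutureWarning in the warnings summary points at pandas.Int64Index usages in pandas_type_compatibility_test.py. A minimal sketch of the replacement the warning itself suggests (plain pandas, not the Beam test code):

    import pandas as pd

    # Deprecated spelling (emits the FutureWarning seen above):
    # idx = pd.Int64Index(range(123, 223), name='an_index')

    # Replacement suggested by the warning: pandas.Index with an explicit dtype.
    idx = pd.Index(range(123, 223), dtype='int64', name='an_index')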
