See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/147/display/redirect?page=changes>

Changes:

[noreply] [Tour of Beam] [Frontend] Markdown selection fix (#26496)


------------------------------------------
[...truncated 381.91 KB...]
INFO     apache_beam.runners.portability.stager:stager.py:330 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>"
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:453 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
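[Editor's note: the pre-building workflow suggested above is enabled via pipeline
options. A minimal sketch of the invocation follows; the script name, project,
and registry URL are placeholders, not values from this job:]

    python my_pipeline.py \
      --runner=DataflowRunner \
      --prebuild_sdk_container_engine=cloud_build \
      --docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo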
INFO     root:environments.py:376 Default Python SDK image for 
environment is apache/beam_python3.8_sdk:2.48.0.dev
INFO     root:environments.py:295 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422
INFO     root:environments.py:302 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230422" for 
Docker environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 
==================== <function pack_combiners at 0x7ff5cff45550> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 
==================== <function sort_stages at 0x7ff5cff45d30> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:466 Defaulting to 
the temp_location as staging_location: 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/icedtea-sound-z_dvIuvU8ZJnx9HLpkjsVJ0kTR-os7D9h5_Nx5_eLfw.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/icedtea-sound-z_dvIuvU8ZJnx9HLpkjsVJ0kTR-os7D9h5_Nx5_eLfw.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/jaccess-kuSyBwlgDsM1YE_9LwCkEOL6tkTzJnNBCQLIiRHF-fA.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/jaccess-kuSyBwlgDsM1YE_9LwCkEOL6tkTzJnNBCQLIiRHF-fA.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/localedata-Z2uTZhqFCBVf49D9vip6L4CY0Wv7maMX7YRS3EyoZMs.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/localedata-Z2uTZhqFCBVf49D9vip6L4CY0Wv7maMX7YRS3EyoZMs.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/nashorn-ZL-RSHx2intMJnY-DlHLkl4aTpBA1nxpeb5V08uwIuY.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/nashorn-ZL-RSHx2intMJnY-DlHLkl4aTpBA1nxpeb5V08uwIuY.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/cldrdata-92KZgYL5H6sA_g9Uh_eHsKuDAAuiqVYK6dogst4-BgQ.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/cldrdata-92KZgYL5H6sA_g9Uh_eHsKuDAAuiqVYK6dogst4-BgQ.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/dnsns-xElh83ynN6pQ_xgB81ysi9eP9tAQ8HKBfTijyCpm-e0.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/dnsns-xElh83ynN6pQ_xgB81ysi9eP9tAQ8HKBfTijyCpm-e0.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/beam-sdks-java-io-google-cloud-platform-expansion-service-2.48.0-SNAPSHOT-4PytdLn_tNDQuaXiRWLDSRYWkLD-mJQzT2iu24xbXw0.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/beam-sdks-java-io-google-cloud-platform-expansion-service-2.48.0-SNAPSHOT-4PytdLn_tNDQuaXiRWLDSRYWkLD-mJQzT2iu24xbXw0.jar
 in 5 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/dataflow_python_sdk.tar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/dataflow_python_sdk.tar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0503144519-393940-myffu39d.1683125119.409342/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:909 Create job: 
<Job
 clientRequestId: '20230503144519410817-2387'
 createTime: '2023-05-03T14:45:28.388993Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-03_07_45_27-1352901158459474624'
 location: 'us-central1'
 name: 'beamapp-jenkins-0503144519-393940-myffu39d'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-03T14:45:28.388993Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job 
with id: [2023-05-03_07_45_27-1352901158459474624]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 
2023-05-03_07_45_27-1352901158459474624
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-03_07_45_27-1352901158459474624?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-03_07_45_27-1352901158459474624?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-03_07_45_27-1352901158459474624?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-05-03_07_45_27-1352901158459474624 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:29.192Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2023-05-03_07_45_27-1352901158459474624. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:29.309Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2023-05-03_07_45_27-1352901158459474624.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:32.542Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-c.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.674Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.709Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.771Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.809Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey:
 GroupByKey not followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.859Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.888Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.922Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.950Z: JOB_MESSAGE_DETAILED: Created new flatten 
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10
 to unzip producers of 
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion35
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:33.980Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.012Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2178>) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.045Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2174>) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.079Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32
 for input 
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.112Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous), through flatten 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.145Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10
 for input 
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.178Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous), through flatten 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.209Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.241Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.273Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/FlatMap(<lambda at core.py:3574>) into Create/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.305Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/Map(decode) into Create/FlatMap(<lambda at core.py:3574>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.337Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2158>) into Create/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.369Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
 into WriteToBigQuery/Map(<lambda at bigquery.py:2158>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.412Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.467Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.500Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.532Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.565Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.596Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.620Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.652Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.672Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.698Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.723Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.755Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.787Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.820Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
 writes/ParMultiDo(StorageApiFinalizeWrites) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.878Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.907Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.940Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:34.972Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:35.125Z: JOB_MESSAGE_DEBUG: Executing wait step start28
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:35.191Z: JOB_MESSAGE_BASIC: Executing operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:35.245Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:35.284Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-c...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:35.928Z: JOB_MESSAGE_BASIC: Finished operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:35.990Z: JOB_MESSAGE_DEBUG: Value 
"WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:36.055Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at 
bigquery.py:2158>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to 
message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 
Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-05-03T14:45:48.902Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:46:14.797Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:49:09.540Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:50:51.261Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:03.668Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at bigquery.py:2158>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda at bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda at bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda at bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda at bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:03.763Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:04.928Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:05.001Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:06.098Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:06.175Z: JOB_MESSAGE_DEBUG: Executing success step success26
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:06.279Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:06.353Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:51:06.378Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:53:15.912Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:53:15.957Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-05-03T14:53:15.985Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-05-03_07_45_27-1352901158459474624 is in state JOB_STATE_DONE
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to perform query SELECT * FROM python_xlang_storage_write16831251069df143.python_storage_write_nested_records_and_lists to BQ
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of query is: [([1, 2, 3], {'nested_int': 1, 'nested_str': 'a'}, [{'nested_numeric': Decimal('1.23'), 'nested_bytes': b'a'}, {'nested_numeric': Decimal('3.21'), 'nested_bytes': b'aa'}])]
INFO     apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:118 Deleting dataset python_xlang_storage_write16831251069df143 in project apache-beam-testing
PASSED

=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source
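The deprecated `imp.load_source` call flagged above has a direct stdlib replacement built on `importlib`. A minimal sketch (the `load_source` wrapper below is a hypothetical helper, not code from the hdfs package):

```python
import importlib.util


def load_source(module_name, file_path):
    """Load a module from a file path, replacing the removed imp.load_source.

    spec_from_file_location builds a module spec for the source file, and
    exec_module runs the file's code in the freshly created module object.
    """
    spec = importlib.util.spec_from_file_location(module_name, file_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```

Unlike `imp.load_source`, this does not register the module in `sys.modules` automatically; callers that relied on that side effect would need to add it themselves.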

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:121
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:121: DeprecationWarning: pkg_resources is deprecated as an API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870: 18 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870: 13 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2349: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py:20
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)
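The fix these warnings point to, PEP 420 implicit namespace packages, needs no `__init__.py` and no `declare_namespace()` call at all: any directory on `sys.path` containing a package directory without `__init__.py` contributes a "portion" to that package. A minimal runnable sketch (the `demo_ns` package and `dist_a`/`dist_b` names are hypothetical stand-ins for the `google.*` distributions above):

```python
import os
import sys
import tempfile

# Build two independent "distributions", each shipping one module inside
# the same demo_ns package. Neither directory contains an __init__.py.
root = tempfile.mkdtemp()
for dist, module in [("dist_a", "cloud"), ("dist_b", "rpc")]:
    pkg_dir = os.path.join(root, dist, "demo_ns")
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, module + ".py"), "w") as f:
        f.write("ORIGIN = %r\n" % dist)
    sys.path.insert(0, os.path.join(root, dist))

# PEP 420 merges both portions into a single demo_ns namespace package,
# so both submodules import even though they live in different trees.
from demo_ns import cloud, rpc
```

This is exactly the behavior `pkg_resources.declare_namespace` used to provide, but handled natively by the import system since Python 3.3.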

apache_beam/typehints/pandas_type_compatibility_test.py:67
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
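The replacement spelling the FutureWarning suggests is `pd.Index` with an explicit dtype; a sketch using the same range as the test above (the variable name `an_index` is just illustrative):

```python
import pandas as pd

# Deprecated:  pd.Int64Index(range(123, 223), name='an_index')
# Preferred:   pd.Index with an explicit int64 dtype, which works on both
# old and new pandas versions without emitting the FutureWarning.
an_index = pd.Index(range(123, 223), dtype="int64", name="an_index")
```

The resulting index is value-for-value identical to the `Int64Index` it replaces.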

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2029: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2035: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=== 3 passed, 9 skipped, 6906 deselected, 47 warnings in 1511.61s (0:25:11) ====

> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 2311613.
Skipping invalid pid: 2311614.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/apache_beam/runners/portability/local_job_service_main.py>", line 170, in <module>
    run(sys.argv)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/apache_beam/runners/portability/local_job_service_main.py>", line 89, in run
    os.unlink(options.pid_file)
FileNotFoundError: [Errno 2] No such file or directory: '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/local_job_service_main-34831.pid>'
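The build-failing exception above comes from `local_job_service_main.py` calling `os.unlink(options.pid_file)` unconditionally on a pid file that was already gone. A hedged sketch of a cleanup that tolerates the missing file — the `remove_pid_file` helper is hypothetical, not Beam's actual fix:

```python
import os


def remove_pid_file(pid_file):
    """Delete a pid file if it exists; return True only if a file was removed.

    Swallowing FileNotFoundError makes the cleanup idempotent, so a second
    shutdown path racing to delete the same file cannot crash the teardown.
    """
    try:
        os.unlink(pid_file)
        return True
    except FileNotFoundError:
        return False
```

On Python 3.8+ the same effect is available as `pathlib.Path(pid_file).unlink(missing_ok=True)`.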
Killing process at 2311665

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:xlang:fnApiJobServerCleanup'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 27m 21s
103 actionable tasks: 13 executed, 90 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
        at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
        at java.lang.reflect.WeakCache.get(WeakCache.java:127)
        at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
        at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
        at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
        at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
        at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/jytuzgxukznry

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
