See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/342/display/redirect?page=changes>

Changes:

[noreply] exception handling for loading models (#27186)


------------------------------------------
[...truncated 344.71 KB...]
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:43:49.637Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2157>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to 
message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2177>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2173>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteInconsistent/Write
 
Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2173>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at bigquery.py:2177>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:43:49.703Z: JOB_MESSAGE_DEBUG: Executing success step success26
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:43:49.770Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:43:49.825Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:43:49.861Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:16.515Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:16.551Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:16.582Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-06-21_07_40_26-3598399833028313282 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to 
perform query SELECT * FROM 
python_xlang_storage_write_1687358414_4d6d21.with_at_least_once_semantics to BQ
INFO     
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of 
query is: [(4, 0.4, Decimal('4.44'), 'd', False, b'd', datetime.datetime(1970, 
1, 1, 1, 6, 40, 400, tzinfo=datetime.timezone.utc)), (1, 0.1, Decimal('1.11'), 
'a', True, b'a', datetime.datetime(1970, 1, 1, 0, 16, 40, 100, 
tzinfo=datetime.timezone.utc)), (2, 0.2, Decimal('2.22'), 'b', False, b'b', 
datetime.datetime(1970, 1, 1, 0, 33, 20, 200, tzinfo=datetime.timezone.utc)), 
(3, 0.3, Decimal('3.33'), 'c', True, b'd', datetime.datetime(1970, 1, 1, 0, 50, 
0, 300, tzinfo=datetime.timezone.utc))]
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:121
 Deleting dataset python_xlang_storage_write_1687358414_4d6d21 in project 
apache-beam-testing
PASSED
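
(For reference, the verification step logged just above via bigquery_matcher.py amounts to a full-result match of the table the test wrote. A minimal sketch, using placeholder dataset/table names and expected rows rather than this job's generated ones:

    # Sketch only: the BigqueryFullResultMatcher issues the SELECT seen in the
    # log and compares the returned rows against the expected data.
    from apache_beam.io.gcp.tests.bigquery_matcher import BigqueryFullResultMatcher

    matcher = BigqueryFullResultMatcher(
        project="apache-beam-testing",
        query="SELECT * FROM my_dataset.my_table",  # placeholder table name
        data=[(1, "a"), (2, "b")],                  # placeholder expected rows
    )
    # The test then asserts this matcher (e.g. with hamcrest's assert_that)
    # once the Dataflow job reaches JOB_STATE_DONE.
)
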
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_write_with_beam_rows
 
-------------------------------- live log call ---------------------------------
INFO     apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:815 
Dataset apache-beam-testing:python_xlang_storage_write_1687358785_b4433c does 
not exist so we will create it as temporary with location=None
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:113
 Created dataset python_xlang_storage_write_1687358785_b4433c in project 
apache-beam-testing
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:115
 expansion port: 40825
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:407 
Automatically enabling Dataflow Runner V2 since the pipeline used 
cross-language transforms.
INFO     apache_beam.runners.portability.stager:stager.py:330 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.49.0.dev0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl";>
 to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:454 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
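
(The hint above refers to Dataflow's SDK container image pre-building workflow. A minimal sketch of how a pipeline could opt in, assuming the prebuild_sdk_container_engine and docker_registry_push_url setup options from the linked guide; registry, bucket, and project below are placeholders, not this job's configuration:

    # Sketch only: these options ask Dataflow to pre-build a worker container
    # with the pipeline's extra dependencies instead of installing them at
    # worker startup. All values here are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--project=my-project",
        "--region=us-central1",
        "--temp_location=gs://my-bucket/tmp",
        "--prebuild_sdk_container_engine=cloud_build",   # or local_docker
        "--docker_registry_push_url=gcr.io/my-project/prebuilt",
    ])
)
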
INFO     root:environments.py:296 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.7_sdk:latest
INFO     root:environments.py:304 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.7_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 
==================== <function pack_combiners at 0x7f225a1fd950> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 
==================== <function sort_stages at 0x7f225a1ff170> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:468 Defaulting to 
the temp_location as staging_location: 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/icedtea-sound-Sx4pTbVkRyMN68Iwm-IRjHx-_IrVB7OJ04Hgf0GOshw.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/icedtea-sound-Sx4pTbVkRyMN68Iwm-IRjHx-_IrVB7OJ04Hgf0GOshw.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/jaccess-cLw_6VKUXVSSR02qNdmHfiCEQC4xgO9cGkcXeJGoXLU.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/jaccess-cLw_6VKUXVSSR02qNdmHfiCEQC4xgO9cGkcXeJGoXLU.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/localedata-m64AoS_ttn_e9NIqwSVR0lu2Z1Z5ql2mff-rP_lcNFc.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/localedata-m64AoS_ttn_e9NIqwSVR0lu2Z1Z5ql2mff-rP_lcNFc.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/nashorn-N0Jsn4-u6InD0KSL1iQKJxisnjbN1yx2Il6jtukpLvI.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/nashorn-N0Jsn4-u6InD0KSL1iQKJxisnjbN1yx2Il6jtukpLvI.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/cldrdata-5EWPOR1vxHHNVt2c5NwrXn6onAk_PiwkFY34pXx-PXY.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/cldrdata-5EWPOR1vxHHNVt2c5NwrXn6onAk_PiwkFY34pXx-PXY.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/dnsns-WBg9XKwp_aXals1SHuDLDILJwBF0Qp8_Vs6ch4cmKVc.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/dnsns-WBg9XKwp_aXals1SHuDLDILJwBF0Qp8_Vs6ch4cmKVc.jar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/beam-sdks-java-io-google-cloud-platform-expansion-service-2.49.0-SNAPSHOT-FHSNUQtu6DVWd_O7kTrzUkYmNajKmOP56GBpxum230Y.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/beam-sdks-java-io-google-cloud-platform-expansion-service-2.49.0-SNAPSHOT-FHSNUQtu6DVWd_O7kTrzUkYmNajKmOP56GBpxum230Y.jar
 in 7 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/apache_beam-2.49.0.dev0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/apache_beam-2.49.0.dev0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 4 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0621144629-148246-cp1k6p94.1687358789.148645/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:909 Create job: 
<Job
 clientRequestId: '20230621144629149557-7366'
 createTime: '2023-06-21T14:46:43.201779Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-21_07_46_42-2926018702662580941'
 location: 'us-central1'
 name: 'beamapp-jenkins-0621144629-148246-cp1k6p94'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-21T14:46:43.201779Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job 
with id: [2023-06-21_07_46_42-2926018702662580941]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 
2023-06-21_07_46_42-2926018702662580941
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:918 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-21_07_46_42-2926018702662580941?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-21_07_46_42-2926018702662580941?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-21_07_46_42-2926018702662580941?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-06-21_07_46_42-2926018702662580941 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:43.888Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2023-06-21_07_46_42-2926018702662580941. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:44.009Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2023-06-21_07_46_42-2926018702662580941.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:46.002Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-f.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.116Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.140Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.182Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.205Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey:
 GroupByKey not followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.249Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.275Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.293Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.327Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.351Z: JOB_MESSAGE_DETAILED: Created new flatten 
external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat29-c19
 to unzip producers of 
external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat32
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.372Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.392Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat29
 for input 
external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat7.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.415Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous), through flatten 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.438Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat29-c19
 for input 
external_10StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion-payload-schemat7.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.461Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous), through flatten 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.484Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous) into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.507Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous) into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.532Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/FlatMap(<lambda at core.py:3634>) into Create/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.552Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at 
core.py:3634>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.576Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.600Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.621Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.644Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.667Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) 
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.688Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.710Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.733Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
 into Create/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.754Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.779Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.800Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.822Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.846Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.871Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.890Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.915Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.942Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.961Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:47.984Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.005Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.028Z: JOB_MESSAGE_DETAILED: Fusing consumer 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
 writes/ParMultiDo(StorageApiFinalizeWrites) into 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.057Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.084Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.107Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.130Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.253Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.297Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.332Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.379Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-f...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:48.831Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:51.467Z: JOB_MESSAGE_DEBUG: Value 
"Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" 
materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:46:51.514Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3634>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:47:17.576Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:47:31.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:49:43.715Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:17.268Z: JOB_MESSAGE_DETAILED: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:17.585Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3634>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:17.649Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:18.481Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:18.554Z: JOB_MESSAGE_BASIC: Executing operation 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:18.693Z: JOB_MESSAGE_BASIC: Finished operation 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:18.749Z: JOB_MESSAGE_DEBUG: Value 
"StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:18.816Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to 
message+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and 
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 
Records+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and 
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:25.128Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to 
message+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and 
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 
Records+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and 
errors/Map/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:25.223Z: JOB_MESSAGE_BASIC: Executing operation 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:26.764Z: JOB_MESSAGE_BASIC: Finished operation 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:26.902Z: JOB_MESSAGE_BASIC: Executing operation 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
 writes/ParMultiDo(StorageApiFinalizeWrites)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:27.937Z: JOB_MESSAGE_BASIC: Finished operation 
StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
 writes/ParMultiDo(StorageApiFinalizeWrites)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:28Z: JOB_MESSAGE_DEBUG: Executing success step success32
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:28.060Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:28.131Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:50:28.155Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:52:38.108Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:52:38.141Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-06-21T14:52:38.166Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-06-21_07_46_42-2926018702662580941 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to 
perform query SELECT * FROM 
python_xlang_storage_write_1687358785_b4433c.write_with_beam_rows to BQ
INFO     
apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of 
query is: [(1, 0.1, Decimal('1.11'), 'a', True, b'a', datetime.datetime(1970, 
1, 1, 0, 16, 40, 100, tzinfo=datetime.timezone.utc)), (3, 0.3, Decimal('3.33'), 
'c', True, b'd', datetime.datetime(1970, 1, 1, 0, 50, 0, 300, 
tzinfo=datetime.timezone.utc)), (4, 0.4, Decimal('4.44'), 'd', False, b'd', 
datetime.datetime(1970, 1, 1, 1, 6, 40, 400, tzinfo=datetime.timezone.utc)), 
(2, 0.2, Decimal('2.22'), 'b', False, b'b', datetime.datetime(1970, 1, 1, 0, 
33, 20, 200, tzinfo=datetime.timezone.utc))]
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:121
 Deleting dataset python_xlang_storage_write_1687358785_b4433c in project 
apache-beam-testing
PASSED
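
(The tests shown above exercise the same cross-language path: the Python WriteToBigQuery transform with the Storage Write API method, which expands into the Java StorageWriteToBigQuery/SchemaAwareExternalTransform seen throughout the Dataflow graph. A minimal sketch of that usage with placeholder project, dataset, and table; actually running it needs GCP credentials and the Java expansion service, so this is illustrative only:

    # Sketch only: Python WriteToBigQuery routed through the BigQuery Storage
    # Write API. Table and schema are placeholders.
    import apache_beam as beam

    rows = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create(rows)
            | beam.io.WriteToBigQuery(
                table="my-project:my_dataset.my_table",
                schema="id:INTEGER,name:STRING",
                method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API,
            )
        )
)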

=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/rpc/__init__.py:18
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/rpc/__init__.py>:18:
 DeprecationWarning: pkg_resources is deprecated as an API. See 
https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2871:
 18 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2871:
 13 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.cloud')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2350
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2350:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2871
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.logging')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2871
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2871:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.iam')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/rpc/__init__.py:20
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/rpc/__init__.py>:20:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.rpc')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2028:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_nested_records_and_lists
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_at_least_once
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_streaming_with_auto_sharding
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_with_at_least_once_semantics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2034:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
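
(The two BeamDeprecationWarning groups above flag reads of <pipeline>.options inside bigquery.py; for user code, the non-deprecated pattern is to keep a handle on the PipelineOptions object and query it directly. A minimal, illustrative sketch:

    # Sketch only: read StandardOptions from the options object you constructed
    # rather than from pipeline.options, which the warning marks as deprecated.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions(["--streaming"])
    is_streaming_pipeline = options.view_as(StandardOptions).streaming  # True
    pipeline = beam.Pipeline(options=options)  # pass the same object along
)
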

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml>
 -
=== 7 passed, 9 skipped, 6948 deselected, 52 warnings in 2859.79s (0:47:39) ====

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguageCleanup
Stopping expansion service pid: 2151855.
Skipping invalid pid: 2151856.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 2149762

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build.gradle>' line: 96

* What went wrong:
Execution failed for task ':sdks:python:bdistPy311linux'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 54m 21s
115 actionable tasks: 79 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/dxz6wc65zxtc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
