See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/98/display/redirect>

Changes:


------------------------------------------
[...truncated 167.74 KB...]
 stageStates: []
 startTime: '2023-04-21T08:02:25.588225Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job 
with id: [2023-04-21_01_02_25-16991506622588103799]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 
2023-04-21_01_02_25-16991506622588103799
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-21_01_02_25-16991506622588103799?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-21_01_02_25-16991506622588103799?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-04-21_01_02_25-16991506622588103799 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:26.483Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2023-04-21_01_02_25-16991506622588103799. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:26.630Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2023-04-21_01_02_25-16991506622588103799.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:28.708Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-f.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:29.966Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:29.991Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.047Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.078Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey:
 GroupByKey not followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.108Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.145Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.172Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.214Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.251Z: JOB_MESSAGE_DETAILED: Created new flatten 
external_1WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c19
 to unzip producers of 
external_1WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion35
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.271Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.292Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2178>) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.314Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2174>) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.336Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_1WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32
 for input 
external_1WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.358Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous), through flatten 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.384Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_1WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c19
 for input 
external_1WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.409Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous), through flatten 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.440Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.461Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.494Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/FlatMap(<lambda at core.py:3574>) into Create/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.515Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at 
core.py:3574>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.537Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into 
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.563Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.598Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.619Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.651Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) 
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.672Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.693Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.724Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2158>) into Create/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.756Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
 into WriteToBigQuery/Map(<lambda at bigquery.py:2158>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.778Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.798Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.829Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.863Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.885Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.906Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.938Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:30.969Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.003Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.035Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.066Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.097Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.130Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.162Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
 writes/ParMultiDo(StorageApiFinalizeWrites) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.204Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.228Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.259Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.280Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.422Z: JOB_MESSAGE_DEBUG: Executing wait step start38
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.486Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.534Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:31.584Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-f...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:32.007Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:32.072Z: JOB_MESSAGE_DEBUG: Value 
"Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" 
materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:32.126Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3574>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:02:57.328Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:03:29.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T08:05:20.310Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.183Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: The 
Dataflow job appears to be stuck because no worker activity has been seen in 
the last 1h. Please check the worker logs in Stackdriver Logging. You can also 
get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.242Z: JOB_MESSAGE_BASIC: Cancel request is committed for 
workflow job: 2023-04-21_01_02_25-16991506622588103799.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.257Z: JOB_MESSAGE_ERROR: The Dataflow job appears to be 
stuck because no worker activity has been seen in the last 1h. Please check the 
worker logs in Stackdriver Logging. You can also get help with Cloud Dataflow 
at https://cloud.google.com/dataflow/support.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.258Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3574>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.692Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.752Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:02:32.770Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:04:46.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:04:46.876Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-21T09:04:46.903Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-04-21_01_02_25-16991506622588103799 is in state JOB_STATE_FAILED
ERROR    
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1554 Console 
URL: 
https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-21_01_02_25-16991506622588103799?project=<ProjectId>
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117
 Deleting dataset python_xlang_storage_write1682064128f5c796 in project 
apache-beam-testing
___________ BigQueryXlangStorageWriteIT.test_storage_write_beam_rows ___________

self = <apache_beam.io.external.xlang_bigqueryio_it_test.BigQueryXlangStorageWriteIT testMethod=test_storage_write_beam_rows>

    def test_storage_write_beam_rows(self):
      table_id = '{}:{}.python_xlang_storage_write_beam_rows'.format(
          self.project, self.dataset_id)
    
      row_elements = [
          beam.Row(
              my_int=e['int'],
              my_float=e['float'],
              my_numeric=e['numeric'],
              my_string=e['str'],
              my_bool=e['bool'],
              my_bytes=e['bytes'],
              my_timestamp=e['timestamp']) for e in self.ELEMENTS
      ]
    
      bq_matcher = BigqueryFullResultMatcher(
          project=self.project,
          query="SELECT * FROM %s" %
          '{}.python_xlang_storage_write_beam_rows'.format(self.dataset_id),
          data=self.parse_expected_data(self.ELEMENTS))
    
      with beam.Pipeline(argv=self.args) as p:
        _ = (
            p
            | beam.Create(row_elements)
>           | beam.io.StorageWriteToBigQuery(
                table=table_id, expansion_service=self.expansion_service))
E       AttributeError: module 'apache_beam.io' has no attribute 'StorageWriteToBigQuery'

apache_beam/io/external/xlang_bigqueryio_it_test.py:232: AttributeError
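
The AttributeError above is a missing export: the test builds its pipeline with beam.io.StorageWriteToBigQuery, but the SDK under test does not expose that name from apache_beam.io. A minimal guard sketch for this failure mode; only the names taken from the traceback come from this log, and the skip behaviour is illustrative rather than what the test suite does:

    import unittest

    import apache_beam as beam

    # The failing statement was `beam.io.StorageWriteToBigQuery(...)`. Checking the
    # export up front turns a missing attribute into an explicit skip instead of the
    # AttributeError raised during pipeline construction above.
    if not hasattr(beam.io, 'StorageWriteToBigQuery'):
        raise unittest.SkipTest(
            'apache_beam.io does not export StorageWriteToBigQuery in this SDK')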
------------------------------ Captured log call -------------------------------
INFO     apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:816 
Dataset apache-beam-testing:python_xlang_storage_write16820679253072d8 does not 
exist so we will create it as temporary with location=None
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:107
 Created dataset python_xlang_storage_write16820679253072d8 in project 
apache-beam-testing
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:110
 expansion port: 45099
INFO     
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117
 Deleting dataset python_xlang_storage_write16820679253072d8 in project 
apache-beam-testing
=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py:15
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py>:15:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source
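
The warning above points at hdfs/config.py still importing from the deprecated imp module. For reference, a commonly used importlib-based equivalent of imp.load_source (a sketch only, not what the hdfs package actually does):

    import importlib.util

    def load_source(name, path):
        # importlib replacement for the deprecated imp.load_source(name, path).
        spec = importlib.util.spec_from_file_location(name, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module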

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:121
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:121:
 DeprecationWarning: pkg_resources is deprecated as an API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870:
 18 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870:
 13 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.cloud')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2349:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.logging')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.iam')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py:20
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py>:20:
 DeprecationWarning: Deprecated call to 
`pkg_resources.declare_namespace('google.rpc')`.
  Implementing implicit namespace packages (as specified in PEP 420) is 
preferred to `pkg_resources.declare_namespace`. See 
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

apache_beam/typehints/pandas_type_compatibility_test.py:67
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
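
As these FutureWarnings suggest, pd.Int64Index can be replaced by pd.Index with an explicit dtype. A minimal sketch mirroring the first warning's source line (illustrative only, not a change to the test suite):

    import pandas as pd

    # Equivalent of the deprecated pd.Int64Index(range(123, 223), name='an_index'):
    an_index = pd.Index(range(123, 223), dtype='int64', name='an_index')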

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2029:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2035:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. Please check the worker logs in Stackdriver Logging. You can also get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
FAILED apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_beam_rows - AttributeError: module 'apache_beam.io' has no attribute 'StorageWriteToBigQuery'
= 2 failed, 1 passed, 9 skipped, 6883 deselected, 47 warnings in 4402.71s (1:13:22) =
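
The first failure above is a Dataflow-side timeout (no worker activity for an hour) rather than a test assertion, so the run blocked until Dataflow cancelled the job. A hedged sketch of bounding the wait from the client side using the standard PipelineResult API; the timeout value and helper name are illustrative and not taken from this suite:

    from apache_beam.runners.runner import PipelineState

    def run_with_deadline(pipeline, timeout_ms=30 * 60 * 1000):
        # Illustrative only: wait at most `timeout_ms` (milliseconds), then cancel
        # instead of blocking until Dataflow declares the job stuck.
        result = pipeline.run()
        result.wait_until_finish(duration=timeout_ms)
        if result.state not in (PipelineState.DONE,
                                PipelineState.FAILED,
                                PipelineState.CANCELLED):
            result.cancel()
        return result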

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava FAILED

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguageCleanup
Stopping expansion service pid: 3993925.
Skipping invalid pid: 3993926.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 3993996

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 17m 1s
108 actionable tasks: 72 executed, 32 from cache, 4 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: 
com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
        at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
        at java.lang.reflect.WeakCache.get(WeakCache.java:127)
        at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
        at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
        at 
com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
        at 
com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
        at 
com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/iph5uxwludcv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



