See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/116/display/redirect>

Changes:


------------------------------------------
[...truncated 135.74 KB...]
 name: 'beamapp-jenkins-0425201019-354383-kribgsco'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-25T20:10:27.355401Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job 
with id: [2023-04-25_13_10_25-11090264723682642103]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 
2023-04-25_13_10_25-11090264723682642103
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-25_13_10_25-11090264723682642103?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-25_13_10_25-11090264723682642103?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-25_13_10_25-11090264723682642103?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2023-04-25_13_10_25-11090264723682642103 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:30.176Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2023-04-25_13_10_25-11090264723682642103. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:30.426Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2023-04-25_13_10_25-11090264723682642103.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:33.158Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-f.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.572Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.609Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.673Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.706Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey:
 GroupByKey not followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.742Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.769Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.803Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.836Z: JOB_MESSAGE_DETAILED: Created new flatten 
external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10
 to unzip producers of 
external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion35
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.871Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.906Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2178>) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.940Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2174>) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:34.973Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32
 for input 
external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.007Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous), through flatten 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.041Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10
 for input 
external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.072Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous), through flatten 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
 into producer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.107Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows/Map/ParMultiDo(Anonymous) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.138Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.161Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/FlatMap(<lambda at core.py:3574>) into Create/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.192Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Create/Map(decode) into Create/FlatMap(<lambda at core.py:3574>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.230Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/Map(<lambda at bigquery.py:2158>) into Create/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.262Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
 into WriteToBigQuery/Map(<lambda at bigquery.py:2158>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.296Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.329Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.362Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.393Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.425Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to message
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.458Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 Records
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.490Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.523Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.557Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.588Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.625Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.661Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.694Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.725Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
 writes/ParMultiDo(StorageApiFinalizeWrites) into 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.771Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.804Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.837Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:35.871Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.035Z: JOB_MESSAGE_DEBUG: Executing wait step start28
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.104Z: JOB_MESSAGE_BASIC: Executing operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.152Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.182Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-f...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.412Z: JOB_MESSAGE_BASIC: Finished operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.477Z: JOB_MESSAGE_DEBUG: Value 
"WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:10:36.545Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at 
bigquery.py:2158>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to 
message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 
Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:11:01.179Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:11:19.586Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:14:51.001Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:16:08.867Z: JOB_MESSAGE_DETAILED: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:16:18.365Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at 
bigquery.py:2158>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
 to 
message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
 
Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed 
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
 at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
 failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda 
at 
bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:16:18.503Z: JOB_MESSAGE_BASIC: Executing operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:16:18.781Z: JOB_MESSAGE_BASIC: Finished operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-04-25T20:16:18.845Z: JOB_MESSAGE_BASIC: Executing operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:16:19.800Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:16:19.874Z: JOB_MESSAGE_DEBUG: Executing success step success26
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:16:19.948Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:16:20.004Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:16:20.031Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:18:34.832Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:18:34.864Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-25T20:18:34.896Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-04-25_13_10_25-11090264723682642103 is in state JOB_STATE_DONE
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to perform query SELECT * FROM python_xlang_storage_write16824534155ebb25.python_storage_write_nested_records_and_lists to BQ
INFO     apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of query is: [([1, 2, 3], {'nested_int': 1, 'nested_str': 'a'}, [{'nested_numeric': Decimal('1.23'), 'nested_bytes': b'a'}, {'nested_numeric': Decimal('3.21'), 'nested_bytes': b'aa'}])]
INFO     apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117 Deleting dataset python_xlang_storage_write16824534155ebb25 in project apache-beam-testing
PASSED

=================================== FAILURES ===================================
___________ BigQueryXlangStorageWriteIT.test_storage_write_beam_rows ___________

self = <apache_beam.io.external.xlang_bigqueryio_it_test.BigQueryXlangStorageWriteIT testMethod=test_storage_write_beam_rows>

    def test_storage_write_beam_rows(self):
      table_id = '{}:{}.python_xlang_storage_write_beam_rows'.format(
          self.project, self.dataset_id)
    
      row_elements = [
          beam.Row(
              my_int=e['int'],
              my_float=e['float'],
              my_numeric=e['numeric'],
              my_string=e['str'],
              my_bool=e['bool'],
              my_bytes=e['bytes'],
              my_timestamp=e['timestamp']) for e in self.ELEMENTS
      ]
    
      bq_matcher = BigqueryFullResultMatcher(
          project=self.project,
          query="SELECT * FROM %s" %
          '{}.python_xlang_storage_write_beam_rows'.format(self.dataset_id),
          data=self.parse_expected_data(self.ELEMENTS))
    
      with beam.Pipeline(argv=self.args) as p:
        _ = (
            p
            | beam.Create(row_elements)
>           | beam.io.StorageWriteToBigQuery(
                table=table_id, expansion_service=self.expansion_service))
E       AttributeError: module 'apache_beam.io' has no attribute 'StorageWriteToBigQuery'

apache_beam/io/external/xlang_bigqueryio_it_test.py:232: AttributeError
------------------------------ Captured log call -------------------------------
INFO     apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:816 Dataset apache-beam-testing:python_xlang_storage_write168245341435c765 does not exist so we will create it as temporary with location=None
INFO     apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:107 Created dataset python_xlang_storage_write168245341435c765 in project apache-beam-testing
INFO     apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:110 expansion port: 40415
INFO     apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117 Deleting dataset python_xlang_storage_write168245341435c765 in project apache-beam-testing
=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:121
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:121: DeprecationWarning: pkg_resources is deprecated as an API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870: 18 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870: 13 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2349: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py:20
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

apache_beam/typehints/pandas_type_compatibility_test.py:67
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2029: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2035: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_beam_rows - AttributeError: module 'apache_beam.io' has no attribute 'StorageWriteToBigQuery'
= 1 failed, 2 passed, 9 skipped, 6893 deselected, 47 warnings in 1114.41s (0:18:34) =

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava FAILED

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguageCleanup
Stopping expansion service pid: 3154165.
Skipping invalid pid: 3154166.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 3154217

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 20m 23s
108 actionable tasks: 72 executed, 32 from cache, 4 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
        at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
        at java.lang.reflect.WeakCache.get(WeakCache.java:127)
        at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
        at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
        at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
        at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
        at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/ebf6uwlyaoguu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
