See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/92/display/redirect>
Changes:
------------------------------------------
[...truncated 137.45 KB...]
 name: 'beamapp-jenkins-0419201046-006393-tkribgsc'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-19T20:10:52.909976Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job with id: [2023-04-19_13_10_52-10484224546389804657]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 2023-04-19_13_10_52-10484224546389804657
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-19_13_10_52-10484224546389804657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-19_13_10_52-10484224546389804657?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-19_13_10_52-10484224546389804657?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-04-19_13_10_52-10484224546389804657 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:54.556Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2023-04-19_13_10_52-10484224546389804657. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:54.630Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2023-04-19_13_10_52-10484224546389804657.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:56.787Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:57.999Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.030Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.096Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.124Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.163Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.191Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.234Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.254Z: JOB_MESSAGE_DETAILED: Created new flatten external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10 to unzip producers of external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion35
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.284Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.311Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/Map(<lambda at bigquery.py:2178>) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.335Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/Map(<lambda at bigquery.py:2174>) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.369Z: JOB_MESSAGE_DETAILED: Unzipping flatten external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32 for input external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.391Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous), through flatten WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors, into producer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.422Z: JOB_MESSAGE_DETAILED: Unzipping flatten external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10 for input external_2WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.456Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous), through flatten WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors, into producer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.489Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.521Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.553Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:3574>) into Create/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.578Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/FlatMap(<lambda at core.py:3574>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.606Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/Map(<lambda at bigquery.py:2158>) into Create/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.632Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter) into WriteToBigQuery/Map(<lambda at bigquery.py:2158>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.655Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.687Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.710Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.746Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.776Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.806Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.828Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.856Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.877Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.899Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.922Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.945Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:58.974Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.008Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites) into WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.053Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.085Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.116Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.148Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.314Z: JOB_MESSAGE_DEBUG: Executing wait step start28
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.382Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.430Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.450Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.668Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.731Z: JOB_MESSAGE_DEBUG: Value "WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:10:59.794Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at bigquery.py:2158>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda at bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda at bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda at bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda at bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:11:14.396Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:11:39.423Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:13:38.660Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:02.907Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:09.118Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at bigquery.py:2158>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert to message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda at bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda at 
bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda at bigquery.py:2174>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda at bigquery.py:2178>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:09.203Z: JOB_MESSAGE_BASIC: Executing operation 
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:09.890Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:09.952Z: JOB_MESSAGE_BASIC: Executing operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/Storage
WriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:10.901Z: JOB_MESSAGE_BASIC: Finished operation WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize writes/ParMultiDo(StorageApiFinalizeWrites) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:10.975Z: JOB_MESSAGE_DEBUG: Executing 
success step success26
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:11.047Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:11.095Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:18:11.122Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:20:46.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:20:46.268Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-19T20:20:46.293Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-04-19_13_10_52-10484224546389804657 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:121 Attempting to perform query SELECT * FROM python_xlang_storage_write1681935042fc02a5.python_storage_write_nested_records_and_lists to BQ
INFO apache_beam.io.gcp.tests.bigquery_matcher:bigquery_matcher.py:158 Result of query is: [([1, 2, 3], {'nested_int': 1, 'nested_str': 'a'}, [{'nested_numeric': Decimal('1.23'), 'nested_bytes': b'a'}, {'nested_numeric': Decimal('3.21'), 'nested_bytes': b'aa'}])]
INFO apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117 Deleting dataset python_xlang_storage_write1681935042fc02a5 in project apache-beam-testing
PASSED
=================================== FAILURES ===================================
___________ BigQueryXlangStorageWriteIT.test_storage_write_beam_rows ___________

self =
<apache_beam.io.external.xlang_bigqueryio_it_test.BigQueryXlangStorageWriteIT testMethod=test_storage_write_beam_rows>

    def test_storage_write_beam_rows(self):
      table_id = '{}:{}.python_xlang_storage_write_beam_rows'.format(
          self.project, self.dataset_id)
      row_elements = [
          beam.Row(
              my_int=e['int'],
              my_float=e['float'],
              my_numeric=e['numeric'],
              my_string=e['str'],
              my_bool=e['bool'],
              my_bytes=e['bytes'],
              my_timestamp=e['timestamp']) for e in self.ELEMENTS
      ]
      bq_matcher = BigqueryFullResultMatcher(
          project=self.project,
          query="SELECT * FROM %s" %
          '{}.python_xlang_storage_write_beam_rows'.format(self.dataset_id),
          data=self.parse_expected_data(self.ELEMENTS))

      with beam.Pipeline(argv=self.args) as p:
        _ = (
            p
            | beam.Create(row_elements)
>           | beam.io.StorageWriteToBigQuery(
                table=table_id, expansion_service=self.expansion_service))
E       AttributeError: module 'apache_beam.io' has no attribute 'StorageWriteToBigQuery'

apache_beam/io/external/xlang_bigqueryio_it_test.py:232: AttributeError
------------------------------ Captured log call -------------------------------
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:816 Dataset apache-beam-testing:python_xlang_storage_write168193504082c175 does not exist so we will create it as temporary with location=None
INFO apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:107 Created dataset python_xlang_storage_write168193504082c175 in project apache-beam-testing
INFO apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:110 expansion port: 39567
INFO apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117 Deleting dataset python_xlang_storage_write168193504082c175 in project apache-beam-testing
=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:121
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:121: DeprecationWarning: pkg_resources is deprecated as an API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870: 18 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870: 13 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`.
See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2349: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`.
Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py:20
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

apache_beam/typehints/pandas_type_compatibility_test.py:67
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version.
Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2029: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2035: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_beam_rows - AttributeError: module 'apache_beam.io' has no attribute 'StorageWriteToBigQuery'
= 1 failed, 2 passed, 9 skipped, 6875 deselected, 47 warnings in 1241.52s (0:20:41) =

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava FAILED

> Task :sdks:python:test-suites:dataflow:py37:gcpCrossLanguageCleanup
Stopping expansion service pid: 656391.
Skipping invalid pid: 656392.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 656551

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 22m 41s
108 actionable tasks: 74 executed, 30 from cache, 4 up-to-date

Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
        at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
        at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
        at java.lang.reflect.WeakCache.get(WeakCache.java:127)
        at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
        at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
        at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
        at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
        at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/tgk5tzgsq4pjs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
