See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/87/display/redirect>
Changes:
------------------------------------------
[...truncated 426.05 KB...]
# if wait_until_finish was called after the pipeline completed.
if terminated and self.state != PipelineState.DONE:
# TODO(BEAM-1290): Consider converting this to an error log based on
# the resolution of the issue.
_LOGGER.error(consoleUrl)
> raise DataflowRuntimeException(
'Dataflow pipeline failed. State: %s, Error:\n%s' %
(self.state, getattr(self._runner, 'last_error_msg', None)),
self)
E
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow
pipeline failed. State: FAILED, Error:
E   Workflow failed.
apache_beam/runners/dataflow/dataflow_runner.py:1555:
DataflowRuntimeException
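
For readers unfamiliar with the failure mode above: wait_until_finish() raises
DataflowRuntimeException whenever the job reaches a terminal state other than
DONE. A minimal sketch of a driver that surfaces the same state and error
message (the placeholder transform and handling are illustrative, not part of
this test):

    import logging

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    def run_and_report(argv=None):
        options = PipelineOptions(argv)
        pipeline = beam.Pipeline(options=options)
        _ = pipeline | beam.Create([1, 2, 3])  # placeholder transform
        result = pipeline.run()
        try:
            result.wait_until_finish()
        except DataflowRuntimeException as exc:
            # Mirrors the fields in the traceback above: terminal state + last error.
            logging.error('Job finished in state %s: %s', result.state, exc)
            raise
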
------------------------------ Captured log call -------------------------------
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:816
Dataset apache-beam-testing:python_xlang_storage_write1681829773c4670e does not
exist so we will create it as temporary with location=None
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:107
Created dataset python_xlang_storage_write1681829773c4670e in project
apache-beam-testing
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:110
expansion port: 43777
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:396
Automatically enabling Dataflow Runner V2 since the pipeline used
cross-language transforms.
INFO apache_beam.runners.portability.stager:stager.py:330 Copying
Beam SDK
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz">
to staging location.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:443 Pipeline
has additional dependencies to be installed in SDK worker container, consider
using the SDK container image pre-building workflow to avoid repetitive
installations. Learn more on
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
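
The pre-building workflow mentioned here is opt-in. A hedged sketch of
enabling it through pipeline options, following the linked guide (the project,
bucket, and registry values are placeholders; verify the flag names against
your Beam version):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',
    ])
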
INFO root:environments.py:376 Default Python SDK image for
environment is apache/beam_python3.8_sdk:2.48.0.dev
INFO root:environments.py:295 Using provided Python SDK container
image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20230412
INFO root:environments.py:302 Python SDK container image set to
"gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20230412" for Docker
environment
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function pack_combiners at 0x7f74f9adcca0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710
==================== <function sort_stages at 0x7f74f9ad64c0>
====================
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:466 Defaulting to
the temp_location as staging_location:
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/icedtea-sound-z_dvIuvU8ZJnx9HLpkjsVJ0kTR-os7D9h5_Nx5_eLfw.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/icedtea-sound-z_dvIuvU8ZJnx9HLpkjsVJ0kTR-os7D9h5_Nx5_eLfw.jar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/jaccess-kuSyBwlgDsM1YE_9LwCkEOL6tkTzJnNBCQLIiRHF-fA.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/jaccess-kuSyBwlgDsM1YE_9LwCkEOL6tkTzJnNBCQLIiRHF-fA.jar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/localedata-Z2uTZhqFCBVf49D9vip6L4CY0Wv7maMX7YRS3EyoZMs.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/localedata-Z2uTZhqFCBVf49D9vip6L4CY0Wv7maMX7YRS3EyoZMs.jar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/nashorn-ZL-RSHx2intMJnY-DlHLkl4aTpBA1nxpeb5V08uwIuY.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/nashorn-ZL-RSHx2intMJnY-DlHLkl4aTpBA1nxpeb5V08uwIuY.jar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/cldrdata-92KZgYL5H6sA_g9Uh_eHsKuDAAuiqVYK6dogst4-BgQ.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/cldrdata-92KZgYL5H6sA_g9Uh_eHsKuDAAuiqVYK6dogst4-BgQ.jar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/dnsns-xElh83ynN6pQ_xgB81ysi9eP9tAQ8HKBfTijyCpm-e0.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/dnsns-xElh83ynN6pQ_xgB81ysi9eP9tAQ8HKBfTijyCpm-e0.jar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/beam-sdks-java-io-google-cloud-platform-expansion-service-2.48.0-SNAPSHOT-TS9IpC3JOBQvQ1BquzMcJMQr3lAgifpwDNtyY6uUDGM.jar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/beam-sdks-java-io-google-cloud-platform-expansion-service-2.48.0-SNAPSHOT-TS9IpC3JOBQvQ1BquzMcJMQr3lAgifpwDNtyY6uUDGM.jar
in 5 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/dataflow_python_sdk.tar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/dataflow_python_sdk.tar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/pipeline.pb...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:750 Completed GCS
upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0418145616-727880-myffu39d.1681829776.728225/pipeline.pb
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:909 Create job:
<Job
clientRequestId: '20230418145616729059-2387'
createTime: '2023-04-18T14:56:24.265146Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2023-04-18_07_56_23-1445943385968826043'
location: 'us-central1'
name: 'beamapp-jenkins-0418145616-727880-myffu39d'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2023-04-18T14:56:24.265146Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job
with id: [2023-04-18_07_56_23-1445943385968826043]
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job:
2023-04-18_07_56_23-1445943385968826043
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 To access the
Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-18_07_56_23-1445943385968826043?project=apache-beam-testing
INFO
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58
Console log:
INFO
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-18_07_56_23-1445943385968826043?project=apache-beam-testing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job
2023-04-18_07_56_23-1445943385968826043 is in state JOB_STATE_RUNNING
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:25.928Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2023-04-18_07_56_23-1445943385968826043. The number of workers will be between
1 and 1000.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:25.961Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically
enabled for job 2023-04-18_07_56_23-1445943385968826043.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:28.371Z: JOB_MESSAGE_BASIC: Worker configuration:
e2-standard-2 in us-central1-b.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.632Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.657Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.698Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.727Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey:
GroupByKey not followed by a combiner.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.761Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations
into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.788Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner
information.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.824Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read,
Write, and Flatten operations
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.842Z: JOB_MESSAGE_DETAILED: Created new flatten
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10
to unzip producers of
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion35
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.866Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows/Map/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.889Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/Map(<lambda at bigquery.py:2179>) into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.921Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/Map(<lambda at bigquery.py:2175>) into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.948Z: JOB_MESSAGE_DETAILED: Unzipping flatten
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32
for input
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.971Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous), through flatten
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
into producer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to message
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:29.996Z: JOB_MESSAGE_DETAILED: Unzipping flatten
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion32-c10
for input
external_3WriteToBigQuery-StorageWriteToBigQuery-SchemaAwareExternalTransform-ExternalTransform-beam-expansion10.failedRows
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.024Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows/Map/ParMultiDo(Anonymous), through flatten
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/flattenErrors,
into producer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to message
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.063Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows/Map/ParMultiDo(Anonymous) into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.088Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous) into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.118Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/FlatMap(<lambda at core.py:3574>) into Create/Impulse
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.150Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/Map(decode) into Create/FlatMap(<lambda at core.py:3574>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.183Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/Map(<lambda at bigquery.py:2159>) into Create/Map(decode)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.210Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
into WriteToBigQuery/Map(<lambda at bigquery.py:2159>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.240Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.262Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.293Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.322Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to message into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.356Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to message
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.383Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.412Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.441Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.465Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.495Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Read
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.527Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/GroupByWindow
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.560Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.583Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.617Z: JOB_MESSAGE_DETAILED: Fusing consumer
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Finalize
writes/ParMultiDo(StorageApiFinalizeWrites) into
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.673Z: JOB_MESSAGE_DEBUG: Workflow config is missing a
default resource spec.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.705Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and
teardown to workflow graph.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.728Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop
steps.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.759Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.903Z: JOB_MESSAGE_DEBUG: Executing wait step start28
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.952Z: JOB_MESSAGE_BASIC: Executing operation
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:30.995Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:31.018Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-b...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:31.410Z: JOB_MESSAGE_BASIC: Finished operation
WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:31.474Z: JOB_MESSAGE_DEBUG: Value
"WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Session"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:56:31.539Z: JOB_MESSAGE_BASIC: Executing operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at
bigquery.py:2159>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to
message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2179>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2175>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2175>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2179>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:01.848Z: JOB_MESSAGE_BASIC: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
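
As the message suggests, stale custom.googleapis.com descriptors can be listed
(and, once confirmed unused, deleted) through the Cloud Monitoring API. A
hedged sketch with the google-cloud-monitoring client, assuming it is
installed; the delete call is left commented out on purpose:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    parent = 'projects/apache-beam-testing'
    request = {
        'name': parent,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Uncomment after reviewing which descriptors are truly unused:
        # client.delete_metric_descriptor(
        #     name=f'{parent}/metricDescriptors/{descriptor.type}')
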
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:13.311Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone
us-central1-b failed to bring up any of the desired 1 workers. Please refer to
https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure
for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance
'beamapp-jenkins-041814561-04180756-doxh-harness-xbbj' creation failed: The
zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough
resources available to fulfill the request. '(resource type:compute)'.
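
ZONE_RESOURCE_POOL_EXHAUSTED indicates a capacity shortfall in the selected
zone rather than a pipeline bug. One common mitigation, sketched with
illustrative values, is to pass only --region so the service picks a zone with
capacity, or to pin an alternative zone explicitly:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--region=us-central1',           # let Dataflow pick a zone in-region
        # '--worker_zone=us-central1-f',  # or pin a different zone explicitly
    ])
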
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:13.333Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:13.418Z: JOB_MESSAGE_BASIC: Finished operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:3574>)+Create/Map(decode)+WriteToBigQuery/Map(<lambda at
bigquery.py:2159>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/element-count/ParMultiDo(ElementCounter)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/PrepareWrite/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/rewindowIntoGlobal/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/CreateTables/CreateTables/ParMultiDo(CreateTables)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/Convert/Convert
to
message+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2179>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2175>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Write
Records+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed
rows/Map/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/error-count/ParMultiDo(ElementCounter)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2175>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/Construct
failed rows and errors/Map/ParMultiDo(Anonymous)+WriteToBigQuery/Map(<lambda
at
bigquery.py:2179>)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/Window.Into()/Window.Assign+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Reify+WriteToBigQuery/StorageWriteToBigQuery/SchemaAwareExternalTransform/ExternalTransform(beam:expansion:payload:schematransform:v1)/BigQueryIO.Write/StorageApiLoads/StorageApiWriteUnsharded/Reshuffle/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:13.496Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:13.602Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:13.624Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:39.722Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238
2023-04-18T14:57:39.754Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job
2023-04-18_07_56_23-1445943385968826043 is in state JOB_STATE_FAILED
ERROR
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1554 Console
URL:
https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-18_07_56_23-1445943385968826043?project=<ProjectId>
INFO
apache_beam.io.external.xlang_bigqueryio_it_test:xlang_bigqueryio_it_test.py:117
Deleting dataset python_xlang_storage_write1681829773c4670e in project
apache-beam-testing
=============================== warnings summary
===============================
../../build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
from imp import load_source
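
The hdfs package is third-party code, but the migration the warning asks for
is mechanical. A sketch of an importlib-based equivalent of imp.load_source
(an illustrative helper, not part of hdfs):

    import importlib.util

    def load_source(module_name, path):
        # importlib replacement for the deprecated imp.load_source.
        spec = importlib.util.spec_from_file_location(module_name, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module
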
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:121
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:121:
DeprecationWarning: pkg_resources is deprecated as an API
warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870:
18 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870:
13 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.cloud')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2349
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2349:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(parent)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.logging')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py:2870
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/pkg_resources/__init__.py>:2870:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.iam')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
declare_namespace(pkg)
../../build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py:20
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/google/rpc/__init__.py>:20:
DeprecationWarning: Deprecated call to
`pkg_resources.declare_namespace('google.rpc')`.
Implementing implicit namespace packages (as specified in PEP 420) is
preferred to `pkg_resources.declare_namespace`. See
https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
pkg_resources.declare_namespace(__name__)
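
All of the declare_namespace warnings above point at the same migration: PEP
420 implicit namespace packages need no runtime declaration. A sketch of the
difference (package names are illustrative):

    # Deprecated pattern, e.g. in google/__init__.py:
    #     import pkg_resources
    #     pkg_resources.declare_namespace(__name__)
    #
    # PEP 420 replacement: ship the namespace directory *without* an
    # __init__.py; on Python 3.3+ 'google' then resolves as an implicit
    # namespace package and sub-packages such as google.cloud merge
    # automatically across distributions.
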
apache_beam/typehints/pandas_type_compatibility_test.py:67
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
}).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91:
FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas
in a future version. Use pandas.Index with the appropriate dtype instead.
pd.Int64Index(range(475, 575), name='another_index'),
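
The replacement the FutureWarning recommends is a one-line change; a sketch
mirroring the test code quoted above:

    import pandas as pd

    # Deprecated: pd.Int64Index(range(123, 223), name='an_index')
    an_index = pd.Index(range(123, 223), dtype='int64', name='an_index')
    another_index = pd.Index(range(475, 575), dtype='int64', name='another_index')
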
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2030:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_all_types
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2036:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
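
Both warnings come from bigquery.py reading options back off the pipeline
object; user code sidesteps the deprecated access by keeping its own
PipelineOptions reference, roughly like this sketch:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        DebugOptions, PipelineOptions, StandardOptions)

    options = PipelineOptions(['--streaming'])
    is_streaming = options.view_as(StandardOptions).streaming
    experiments = options.view_as(DebugOptions).experiments or []

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])  # placeholder transform
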
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml>
-
=========================== short test summary info
============================
FAILED
apache_beam/io/external/xlang_bigqueryio_it_test.py::BigQueryXlangStorageWriteIT::test_storage_write_nested_records_and_lists
- apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException:
Dataflow pipeline failed. State: FAILED, Error:
Workflow failed.
= 1 failed, 2 passed, 9 skipped, 6874
deselected, 47 warnings in 1379.53s (0:22:59) =
> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava FAILED
> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 1642119.
Skipping invalid pid: 1642120.
> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 1642267
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 26m 55s
103 actionable tasks: 13 executed, 90 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
    at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
    at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
    at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
    at java.lang.reflect.WeakCache.get(WeakCache.java:127)
    at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
    at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
    at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
    at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
    at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Publishing build scan...
https://gradle.com/s/e65frrgpzuwr2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]