See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/1058/display/redirect>

Changes:


------------------------------------------
[...truncated 42.81 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:06.043Z: 
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:06.077Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:15.617Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:15.682Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:15.715Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:15.783Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:15.839Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:15.943Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:30:16.006Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:08.373Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 
'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:13.377Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:13.441Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:13.504Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:13.558Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:13.620Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:13.686Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:15.158Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 
'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:17.648Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 
'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:21.586Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:21.689Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:21.758Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:21.793Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:21.848Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:21.935Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:22.656Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", 
line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 
446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 734, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 542, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job 
beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_801_c8759e21412c2594f5a503846d281ea5_a1c22357da924c3ea245744e404c89a3
 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 
'write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:22.677Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:22.744Z: 
JOB_MESSAGE_DEBUG: Executing failure step failure51
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:22.779Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: 
S14:write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
 failed., The job failed because a work item has failed 4 times. Look in 
previous log entries for the cause of each one of the 4 failures. For more 
information, see https://cloud.google.com/dataflow/docs/guides/common-errors. 
The work item was attempted on these workers: 
  beamapp-jenkins-040500224-04041722-0zum-harness-0qtg
      Root cause: Work item failed.,
  beamapp-jenkins-040500224-04041722-0zum-harness-0qtg
      Root cause: Work item failed.,
  beamapp-jenkins-040500224-04041722-0zum-harness-0qtg
      Root cause: Work item failed.,
  beamapp-jenkins-040500224-04041722-0zum-harness-0qtg
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:23.055Z: 
JOB_MESSAGE_WARNING: 
S28:write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
 failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:23.102Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:23.164Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:23.364Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:33:23.415Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:34:12.898Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:34:12.945Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-04-05T00:34:12.975Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2021-04-04_17_22_57-422575476091614295 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4913.962s

FAILED (SKIP=6, errors=1)
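The repeated JOB_MESSAGE_ERROR entries in the captured logging above all come from the wait steps under write/BigQueryBatchFileLoads (WaitForDestinationLoadJobs and related steps): bigquery_file_loads.py calls self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0), and with max_retries=0 a single transient BigQuery 'backendError' on the load job is surfaced immediately instead of being retried, so the work item fails. A minimal sketch of that polling pattern follows; the names are illustrative only and this is not the actual Beam implementation:

    import time

    def wait_for_job(get_status, job_id, sleep_duration_sec=10):
        """Poll a job until it reaches a terminal state; raise if it failed.

        `get_status` is a hypothetical stand-in for a BigQuery jobs.get call;
        it is assumed to return a dict such as {'state': 'RUNNING'} or
        {'state': 'DONE', 'error_result': {...}}.
        """
        while True:
            status = get_status(job_id)
            if status.get('state') == 'DONE':
                error = status.get('error_result')
                if error:
                    # No retry budget here, so a transient 'backendError'
                    # is raised straight away, as in the tracebacks above.
                    raise RuntimeError(
                        'Job %s failed. Error Result: %r' % (job_id, error))
                return status
            time.sleep(sleep_duration_sec)

Since the BigQuery error message itself says 'Retrying may solve the problem.', this looks like a transient backend failure rather than a pipeline bug; the wait step just has no retries configured at this point.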

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 31s
208 actionable tasks: 148 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.

Publishing build scan...
https://gradle.com/s/sbzvqhnzq6obw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
