See
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/1221/display/redirect?page=changes>
Changes:
[Ismaël Mejía] [BEAM-12329] Drain S3 read InputStreams to avoid warnings
[noreply] [BEAM-12320] Raise timeout and add logging in PubsubTableProviderIT.
------------------------------------------
[...truncated 43.75 MB...]
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_508_9d266ae1ed9595c3f8296732b669c702_2d781337bef84e569967a9c06ae73e22 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 'WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:50.950Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 730, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 565, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_508_9d266ae1ed9595c3f8296732b669c702_2d781337bef84e569967a9c06ae73e22 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 730, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 565, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_508_9d266ae1ed9595c3f8296732b669c702_2d781337bef84e569967a9c06ae73e22 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 'WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:51.433Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 730, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 565, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_508_9d266ae1ed9595c3f8296732b669c702_2d781337bef84e569967a9c06ae73e22 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 358, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 220, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 717, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1235, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1315, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1233, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 762, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 887, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 730, in process
    self.bq_wrapper.wait_for_bq_job(ref, sleep_duration_sec=10, max_retries=0)
  File "/usr/local/lib/python3.8/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 565, in wait_for_bq_job
    raise RuntimeError(
RuntimeError: BigQuery job beam_bq_job_LOAD_AUTOMATIC_JOB_NAME_LOAD_STEP_508_9d266ae1ed9595c3f8296732b669c702_2d781337bef84e569967a9c06ae73e22 failed. Error Result: <ErrorProto
 message: 'Error encountered during execution. Retrying may solve the problem.'
 reason: 'backendError'> [while running 'WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs']
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:51.469Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:51.591Z: JOB_MESSAGE_DEBUG: Executing failure step failure59
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:51.638Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S15:WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  beamapp-jenkins-051718501-05171150-bim7-harness-r5wx
      Root cause: Work item failed.,
  beamapp-jenkins-051718501-05171150-bim7-harness-r5wx
      Root cause: Work item failed.,
  beamapp-jenkins-051718501-05171150-bim7-harness-r5wx
      Root cause: Work item failed.,
  beamapp-jenkins-051718501-05171150-bim7-harness-r5wx
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:52.064Z: JOB_MESSAGE_WARNING: S19:WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema) failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:52.100Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:52.191Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:52.434Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:57:52.468Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:58:44.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:58:44.552Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-05-17T18:58:44.597Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-05-17_11_50_32-5613988733990657014 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_test: INFO: Deleting dataset python_bq_streaming_inserts_1621277414211 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
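[Editor's note] The tracebacks above fail inside `wait_for_bq_job`, which the pipeline invoked with `max_retries=0`, so a single transient `backendError` from BigQuery is immediately fatal to the work item. A minimal sketch of that wait-and-raise pattern, using a hypothetical `FakeBQJob` stand-in rather than Beam's actual `BigQueryWrapper` internals:

```python
import time


class FakeBQJob:
    """Stand-in for a BigQuery load job (hypothetical, for illustration only)."""

    def __init__(self, job_id, error=None):
        self.job_id = job_id
        self.error = error  # e.g. 'backendError', or None on success

    def poll(self):
        # Real jobs transition RUNNING -> DONE; this fake is always DONE.
        return ('DONE', self.error)


def wait_for_bq_job(job, sleep_duration_sec=10, max_retries=0):
    """Wait for a job, raising RuntimeError once retries are exhausted.

    Simplified sketch of the behaviour visible in the traceback above:
    with max_retries=0, the first poll that reports an errorResult
    raises immediately instead of resubmitting the job.
    """
    retries = 0
    while True:
        state, error = job.poll()
        if state == 'DONE':
            if error is None:
                return  # job succeeded
            if retries >= max_retries:
                raise RuntimeError(
                    'BigQuery job %s failed. Error Result: %s'
                    % (job.job_id, error))
            retries += 1
        time.sleep(sleep_duration_sec)
```

With `max_retries=0` even a retriable error ("Retrying may solve the problem.") surfaces as a RuntimeError, which Dataflow then retries at the work-item level, up to the 4 attempts reported below.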
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_52-6171194516655355783?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_23_43-5412640631563383416?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_31_38-9961218925318927376?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_40_27-6227504239777460551?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_49_24-4790476890867885934?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_58_44-8389232760773492533?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_05_53-17635787115862907529?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_13_45-1395394777867535165?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_24_20-5640500471010186224?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_43-10539471265919208708?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_31_56-4599662827606411393?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_41_46-13174808057689349338?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_51_36-6447327944309694225?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_08_20-16991615107113719421?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_46-2266000416806339749?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_21_16-5082808385928949259?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_29_05-12839715064365770609?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_37_37-3879049640119684110?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_46_54-5474236662820621530?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_55_30-17056765027938844230?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_03_46-16574101692050496403?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_11_15-12844914762150978178?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_41-17546481631471539334?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_30_59-12299976045851812207?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_39_58-12850279137037415708?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_49_17-17111852942514762765?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_59_03-16023886407643639994?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_07_48-7724652799908543300?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_16_08-8127012952779604352?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_11_37-5580456557697017207?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_21_07-18002226936578184449?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_29_49-13294894015028663214?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_40_54-6921985486621011327?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_50_32-5613988733990657014?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_59_07-8031465022628386007?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_08_26-8227712700802742579?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_17_55-8672242170942499712?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_45-13890903031455589428?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_19_15-12360959059014906588?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_28_44-2768843429476421930?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_38_25-11522076634618764335?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_47_48-15669355305614695993?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_57_13-1846008884718078396?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_05_58-11720408513193115507?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_15_43-11682978724977339294?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_42-17718574294961175011?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_19_10-15938868119979849299?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_28_45-3281474409980074652?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_36_32-15589373791828159355?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_44_33-9288191975515863885?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_52_22-14289131225888793406?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_01_39-12521401614784189675?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_11_16-1644509501859795876?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_20_34-17930721844687529842?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_09_42-16390289910108690425?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_18_46-13187425175923947866?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_28_23-13219583352139591533?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_39_20-961423693150954779?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_47_42-8347271851157404549?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_11_56_49-13900429027274201118?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_04_49-3788702888470990078?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-17_12_13_40-9262238957635044710?project=apache-beam-testing
======================================================================
FAIL: test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/gcsio_integration_test.py>", line 196, in test_copy_batch_rewrite_token
    self.assertTrue(any([not r.done for r in rewrite_responses]))
AssertionError: False is not true
-------------------- >> begin captured logging << --------------------
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_2
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_6
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_8
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_1
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_7
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_0
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_5
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_4
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_3
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/test_copy_batch_rewrite_token_9
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-c2ad65e3-049e-42bb-a2ba-ea674c6c7cd0/*'
->
'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/gcs_it\\-c2ad65e3\\-049e\\-42bb\\-a2ba\\-ea674c6c7cd0/[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 10 files in
0.048615455627441406 seconds.
--------------------- >> end captured logging << ---------------------
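[Editor's note] Both GCS failures hinge on the same assertion: `any([not r.done for r in rewrite_responses])` expects at least one rewrite call to come back unfinished (i.e. carrying a rewrite token). The "Rewrite done" DEBUG lines above show every copy completed in a single call, so every response had `done=True` and the assertion failed. A self-contained sketch of that check, using a hypothetical `RewriteResponse` stand-in for the GCS API response type:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class RewriteResponse:
    """Stand-in for a GCS objects.rewrite response (hypothetical)."""
    done: bool
    rewrite_token: Optional[str] = None  # set when done is False


def saw_rewrite_token(rewrite_responses: List[RewriteResponse]) -> bool:
    """True iff at least one rewrite call did not finish in one step.

    Mirrors the failing assertion above:
        self.assertTrue(any([not r.done for r in rewrite_responses]))
    """
    return any(not r.done for r in rewrite_responses)
```

The test apparently relies on the service splitting the rewrite into multiple calls; when GCS copies the whole object in one call (e.g. a same-location, same-storage-class copy), no intermediate token is ever produced and the list comprehension is all-`False`.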
======================================================================
FAIL: test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/gcsio_integration_test.py>", line 143, in test_copy_rewrite_token
    self.assertTrue(any([not r.done for r in rewrite_responses]))
AssertionError: False is not true
-------------------- >> begin captured logging << --------------------
apache_beam.io.gcp.gcsio: DEBUG: Rewrite done:
gs://dataflow-samples/wikipedia_edits/wiki_data-000000000000.json to
gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-f0953094-9882-48e3-be12-4058aac4e011/test_copy_rewrite_token
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-f0953094-9882-48e3-be12-4058aac4e011/'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://temp-storage-for-end-to-end-tests/temp-it/gcs_it-f0953094-9882-48e3-be12-4058aac4e011/*'
->
'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/gcs_it\\-f0953094\\-9882\\-48e3\\-be12\\-4058aac4e011/[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 1 files in 0.03502321243286133
seconds.
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 5019.801s
FAILED (SKIP=6, errors=1, failures=2)
> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 32m 32s
209 actionable tasks: 156 executed, 49 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/rlum5xkihwslu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]