See <https://ci-beam.apache.org/job/beam_PostCommit_Python35/2669/display/redirect?page=changes>
Changes:
[zyichi] [BEAM-10419] Skip FhirIORead integration test due to flakiness
[noreply] [BEAM-10371] Run dependency check script with Python 3 (#12132)
[noreply] [BEAM-10165] Cache and return error messages on pipeline failure.
------------------------------------------
[...truncated 12.54 MB...]
File "apache_beam/runners/common.py", line 1045, in
apache_beam.runners.common.DoFnRunner._reraise_augmented
File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line
446, in raise_with_traceback
raise exc.with_traceback(traceback)
File "apache_beam/runners/common.py", line 961, in
apache_beam.runners.common.DoFnRunner.process
File "apache_beam/runners/common.py", line 726, in
apache_beam.runners.common.PerWindowInvoker.invoke_process
File "apache_beam/runners/common.py", line 812, in
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
File "apache_beam/runners/common.py", line 1095, in
apache_beam.runners.common._OutputProcessor.process_outputs
File
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
line 403, in process
job_labels=self.bq_io_metadata.add_additional_bq_job_labels())
File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py",
line 236, in wrapper
return fun(*args, **kwargs)
File
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py",
line 365, in _insert_copy_job
response = self.client.jobs.Insert(request)
File
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py",
line 346, in Insert
upload=upload, upload_config=upload_config)
File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py",
line 731, in _RunMethod
return self.ProcessHttpResponse(method_config, http_response, request)
File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py",
line 737, in ProcessHttpResponse
self.__ProcessHttpResponse(method_config, http_response, request))
File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py",
line 604, in __ProcessHttpResponse
http_response, method_config=method_config, request=request)
RuntimeError: apitools.base.py.exceptions.HttpConflictError: HttpError
accessing
<https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>:
response: <{'server': 'ESF', 'x-xss-protection': '0',
'x-content-type-options': 'nosniff', 'vary': 'Origin, X-Origin, Referer',
'content-length': '502', 'status': '409', 'content-type': 'application/json;
charset=UTF-8', 'x-frame-options': 'SAMEORIGIN', 'date': 'Wed, 08 Jul 2020
18:30:15 GMT', 'cache-control': 'private', '-content-encoding': 'gzip',
'transfer-encoding': 'chunked'}>, content <{
"error": {
"code": 409,
"message": "Already Exists: Job
apache-beam-testing:US.beam_load_2020_07_08_182601_93_copy_cc930c8759b7c408085ffac5d2444068_to_2641451d0e03b61ae8580df981bbb9b6",
"errors": [
{
"message": "Already Exists: Job
apache-beam-testing:US.beam_load_2020_07_08_182601_93_copy_cc930c8759b7c408085ffac5d2444068_to_2641451d0e03b61ae8580df981bbb9b6",
"domain": "global",
"reason": "duplicate"
}
],
"status": "ALREADY_EXISTS"
}
}
> [while running
> 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)']
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:22.072Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1095, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 403, in process
    job_labels=self.bq_io_metadata.add_additional_bq_job_labels())
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 365, in _insert_copy_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpConflictError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>: response: <{'server': 'ESF', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'vary': 'Origin, X-Origin, Referer', 'content-length': '502', 'status': '409', 'content-type': 'application/json; charset=UTF-8', 'x-frame-options': 'SAMEORIGIN', 'date': 'Wed, 08 Jul 2020 18:30:20 GMT', 'cache-control': 'private', '-content-encoding': 'gzip', 'transfer-encoding': 'chunked'}>, content <{
  "error": {
    "code": 409,
    "message": "Already Exists: Job apache-beam-testing:US.beam_load_2020_07_08_182601_93_copy_cc930c8759b7c408085ffac5d2444068_to_2641451d0e03b61ae8580df981bbb9b6",
    "errors": [
      {
        "message": "Already Exists: Job apache-beam-testing:US.beam_load_2020_07_08_182601_93_copy_cc930c8759b7c408085ffac5d2444068_to_2641451d0e03b61ae8580df981bbb9b6",
        "domain": "global",
        "reason": "duplicate"
      }
    ],
    "status": "ALREADY_EXISTS"
  }
}>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 638, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 332, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 138, in apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1030, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1122, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 195, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 670, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 671, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 963, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1045, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 961, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 726, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 812, in apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 1095, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 403, in process
    job_labels=self.bq_io_metadata.add_additional_bq_job_labels())
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 365, in _insert_copy_job
    response = self.client.jobs.Insert(request)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py", line 346, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
RuntimeError: apitools.base.py.exceptions.HttpConflictError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>: response: <{'server': 'ESF', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'vary': 'Origin, X-Origin, Referer', 'content-length': '502', 'status': '409', 'content-type': 'application/json; charset=UTF-8', 'x-frame-options': 'SAMEORIGIN', 'date': 'Wed, 08 Jul 2020 18:30:20 GMT', 'cache-control': 'private', '-content-encoding': 'gzip', 'transfer-encoding': 'chunked'}>, content <{
  "error": {
    "code": 409,
    "message": "Already Exists: Job apache-beam-testing:US.beam_load_2020_07_08_182601_93_copy_cc930c8759b7c408085ffac5d2444068_to_2641451d0e03b61ae8580df981bbb9b6",
    "errors": [
      {
        "message": "Already Exists: Job apache-beam-testing:US.beam_load_2020_07_08_182601_93_copy_cc930c8759b7c408085ffac5d2444068_to_2641451d0e03b61ae8580df981bbb9b6",
        "domain": "global",
        "reason": "duplicate"
      }
    ],
    "status": "ALREADY_EXISTS"
  }
}> [while running 'WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)']
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:22.546Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:22.626Z: JOB_MESSAGE_DEBUG: Executing failure step failure102
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:22.665Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S53:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  beamapp-jenkins-070818195-07081120-w15l-harness-nwl8
      Root cause: Work item failed.,
  beamapp-jenkins-070818195-07081120-w15l-harness-nwl8
      Root cause: Work item failed.,
  beamapp-jenkins-070818195-07081120-w15l-harness-nwl8
      Root cause: Work item failed.,
  beamapp-jenkins-070818195-07081120-w15l-harness-nwl8
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:22.778Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber+WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteWithMultipleDests/BigQueryBatchFileLoads/IdentityWorkaround+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:22.912Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:23.183Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:30:23.221Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:31:13.955Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:31:13.993Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-08T18:31:14.031Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-07-08_11_20_17-2250184070716748403 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_file_loads_test: INFO: Deleting dataset python_bq_file_loads_15942323528349 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
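Editor's note: the 409 above is the classic non-idempotent retry. _insert_copy_job in bigquery_tools.py runs inside the apache_beam/utils/retry.py wrapper, and the copy job id is deterministic, so when a first jobs.Insert attempt creates the job but the work item is retried, the second insert collides with the job the first attempt already created and BigQuery answers ALREADY_EXISTS. Below is a minimal sketch of an idempotent insert, assuming the apitools-generated client shown in the traceback; the helper name insert_copy_job_idempotent and the get_request_cls parameter are hypothetical illustrations, not Beam's actual code.

from apitools.base.py import exceptions


def insert_copy_job_idempotent(client, get_request_cls, project_id, request):
    """Insert a BigQuery copy job, treating a 409 ALREADY_EXISTS as success.

    Hypothetical sketch: retries reuse the deterministic job id carried in
    request.job.jobReference, so a conflict means an earlier attempt already
    created this job, and the right response is to fetch it rather than fail.
    """
    job_id = request.job.jobReference.jobId
    try:
        return client.jobs.Insert(request)
    except exceptions.HttpConflictError:
        # The job exists from a prior (retried) attempt; return it instead
        # of surfacing the duplicate-job error and failing the work item.
        return client.jobs.Get(
            get_request_cls(projectId=project_id, jobId=job_id))

Catching the conflict right at the insert keeps the retry wrapper untouched; the alternative, teaching the wrapper not to retry on 4xx responses, would change behavior for every call it guards.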
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_17-2876241385914063099?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_16_57-18089494835420364331?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_24_02-1527725735934694475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_31_15-9864948035595954070?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_38_30-9407470761447643490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_45_40-15212505545577077790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_52_40-125349470697914876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_11-18398438515598581500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_23_37-13976018099938291422?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_31_08-12524877538019462636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_38_00-6791040228138090120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_45_44-10330403821347260106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_53_04-17234657750532500673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_15-17044701692202320678?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_15_04-4668330123324819542?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_22_39-13864872859460775412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_30_41-10067991973820161490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_38_19-6610231382162472965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_45_57-10560103099969283881?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_53_23-14262277488179529559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_11-2290225324929029679?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_20_20-14742810867520634026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_28_16-13451825035854588213?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_35_45-10708433956342842419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_43_17-17623214461373795554?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_51_32-10234643668362254701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_12-8941058307387481107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_11_46-15690392778680357672?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_20_17-2250184070716748403?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_31_58-13186031155863679681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_40_42-881193302712309561?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_47_39-17665482907072985965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_54_37-10086093442484238954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_12_01_52-12503754429342057932?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_12-12127136508448959350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_10_39-5294271229105954367?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_19_22-5793835742200904506?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_26_43-8750287348205548300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_33_33-6627261409756294886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_40_41-12941990733541592172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_48_33-16415323388898395026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_56_21-2239302755229114319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_12-17863696145243793673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_11_26-10841498642551357947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_19_57-6199008288140416105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_28_07-3995582443478770640?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_35_41-1005698085062076848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_43_28-10345915887409261444?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_50_27-10883321418272226378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_58_22-9800044771125323912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_03_13-18292022639217603766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_12_42-8169307851870657147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_22_41-5831184428473584797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_30_57-17956231861092720613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_38_07-1543806955628267668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-08_11_55_01-5559346786384132076?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 3961.344s
FAILED (SKIP=7, errors=1)
> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 8m 27s
111 actionable tasks: 79 executed, 32 from cache
Publishing build scan...
https://gradle.com/s/u2y5g24kvpnsc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]