See
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/636/display/redirect?page=changes>
Changes:
[noreply] Workaround for incremental in read_json. (#13489)
------------------------------------------
[...truncated 38.13 MB...]
File "dataflow_worker/shuffle_operations.py", line 261, in
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
File "dataflow_worker/shuffle_operations.py", line 272, in
dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:17.572Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:17.594Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:17.614Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:17.828Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:17.854Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:17.886Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.117Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
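
The repeated ValueError above is an unpacking mismatch: a statement that expects exactly three components per shuffle record received a record with more. A minimal, purely illustrative Python sketch of that failure mode (the real dataflow_worker/shuffle_operations.py is not reproduced here, and the record layout below is an assumption):

    # Hypothetical record shapes, for illustration only -- not the dataflow_worker source.
    record_ok = ("team-key", 1607753000.0, ["values"])            # three components
    record_bad = ("team-key", 1607753000.0, ["values"], "extra")  # four components

    key, timestamp, values = record_ok  # unpacks fine
    try:
        key, timestamp, values = record_bad
    except ValueError as err:
        print(err)  # prints: too many values to unpack (expected 3)
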
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.202Z:
JOB_MESSAGE_BASIC: Finished operation
HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.329Z:
JOB_MESSAGE_DEBUG: Executing failure step failure64
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.375Z:
JOB_MESSAGE_ERROR: Workflow failed. Causes:
S11:HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
failed., The job failed because a work item has failed 4 times. Look in
previous log entries for the cause of each one of the 4 failures. For more
information, see https://cloud.google.com/dataflow/docs/guides/common-errors.
The work item was attempted on these workers:
beamapp-jenkins-121206033-12112203-yh7k-harness-vstd
Root cause: Work item failed.,
beamapp-jenkins-121206033-12112203-yh7k-harness-76df
Root cause: Work item failed.,
beamapp-jenkins-121206033-12112203-yh7k-harness-7m0h
Root cause: Work item failed.,
beamapp-jenkins-121206033-12112203-yh7k-harness-fdnm
Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.499Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.726Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:23:18.753Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:24:44.431Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 70 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:24:44.499Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-12T06:24:44.523Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-12-11_22_03_41-11011118241025498374 is in state JOB_STATE_FAILED
google.auth._default: DEBUG: Checking None for explicit credentials as part of
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth
process...
google.auth._default: DEBUG: No App Engine library was found so cannot
authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
HTTP/1.1" 200 241
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE
/bigquery/v2/projects/apache-beam-testing/datasets/hourly_team_score_it_dataset16077530028222?deleteContents=true&prettyPrint=false
HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_44-1687195000477317769?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_17_11-10401494259490180845?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_24_47-13662915994283171757?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_31_39-5440484690611193403?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_39_14-15832871879092233874?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_47_45-11800856610757695040?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_56_06-2334058938194539230?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_23_05_10-1537927450694857131?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_41-11011118241025498374?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_25_09-9361905705749670678?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_33_20-1586436600555408688?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_41_09-11532124106275146591?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_49_02-2779776373114128719?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_57_20-4423162241378112854?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_43-4278748297156094647?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_15_11-6401975139890551875?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_22_43-15691068127710948120?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_30_14-18112893069590654753?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_38_23-5793552423241409858?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_45_29-8458015691435396909?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_52_48-11970088479193132206?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_23_00_38-13335786260553911536?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_39-3802619924618315063?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_25_01-10652752494687204369?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_32_42-16299485566498471770?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_40_09-10203296757036708790?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_48_13-4848865054363254521?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_58_01-15287281603256387380?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_05_45-11628637934821537531?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_14_25-16822667318553403814?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_22_36-17153467154854418632?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_31_58-10112157705708372608?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_40_05-1158858969367445564?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_47_34-76128056988035146?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_54_12-5196495690401792102?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_23_02_26-12575706221709552743?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_23_10_45-1890135144700436571?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_40-3273765757528624117?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_12_06-5669088485608656060?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_21_23-11186432308490911169?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_31_24-15900191932325376173?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_39_04-16505380470962635344?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_46_34-973496472600865433?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_53_44-10030875515888496759?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_23_03_06-18421947956580087261?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_41-4628717732605413672?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_12_09-17621015907001355392?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_19_55-4049768574993399807?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_26_49-12917242295988745753?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_33_28-7155459480008797966?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_40_27-3455872923702433068?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_57_54-14334582424574314255?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_03_40-613658857970187903?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_12_33-17597994528975845581?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_20_56-3803677415307363272?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_29_20-10107288499779842703?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_37_17-17941714329478330169?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_45_04-2236582822215051236?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_22_52_39-11371463207802581?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-11_23_01_11-10008504368084245801?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4485.101s
FAILED (SKIP=7, errors=1)
> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 17m 33s
209 actionable tasks: 150 executed, 55 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
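
The inotify warning concerns the CI host rather than the build itself. A small sketch, assuming a Linux worker, for checking the per-user watch limit that Gradle's file-system watching runs into (raising it requires sysctl and root privileges, which is not attempted here):

    from pathlib import Path

    # Linux-specific kernel tunable consulted by Gradle's file-system watching.
    limit = int(Path("/proc/sys/fs/inotify/max_user_watches").read_text())
    print(f"fs.inotify.max_user_watches = {limit}")
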
Publishing build scan...
https://gradle.com/s/i45y5gubtjcfa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure