See
<https://builds.apache.org/job/beam_PostCommit_Python2/2417/display/redirect>
Changes:
------------------------------------------
[...truncated 12.64 MB...]
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
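The AttributeError above comes from apitools' protorpclite message layer: a message class only accepts its declared fields, so splatting a raw labels dict such as `{'owner': ...}` into a map-typed message like `LabelsValue` fails inside `__setattr__`; map entries must instead go through the declared `additionalProperties` field. The sketch below reproduces that behavior with a simplified stand-in class (it is not the real `apitools.base.protorpclite.messages` implementation, just an illustration of the failure mode):

```python
class LabelsValue:
    """Simplified stand-in for a protorpclite map-typed message.

    Only declared fields may be assigned; anything else raises the
    same AttributeError seen in the build log above.
    """
    _declared_fields = ('additionalProperties',)

    def __init__(self, **kwargs):
        for name, value in kwargs.items():
            setattr(self, name, value)

    def __setattr__(self, name, value):
        if name not in self._declared_fields:
            raise AttributeError(
                'May not assign arbitrary value %s to message %s'
                % (name, type(self).__name__))
        object.__setattr__(self, name, value)


# Splatting a plain labels dict fails: each key becomes an
# (undeclared) attribute assignment on the message.
try:
    LabelsValue(**{'owner': 'some-team'})
except AttributeError as exc:
    print(exc)  # May not assign arbitrary value owner to message LabelsValue

# The map has to be wrapped in additionalProperties entries instead,
# which the declared field accepts.
labels = LabelsValue(
    additionalProperties=[{'key': 'owner', 'value': 'some-team'}])
print(labels.additionalProperties)
```

The field name `additionalProperties` matches how apitools encodes proto map fields; the `'owner'`/`'some-team'` label pair is a hypothetical example value.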
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:35:45.340Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:36:02.149Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python2.7/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python2.7/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:36:33.070Z:
JOB_MESSAGE_ERROR: (identical traceback repeated: AttributeError: May not assign arbitrary value owner to message LabelsValue)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:00.934Z:
JOB_MESSAGE_ERROR: (identical traceback repeated: AttributeError: May not assign arbitrary value owner to message LabelsValue)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.446Z:
JOB_MESSAGE_BASIC: Finished operation
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.530Z:
JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.557Z:
JOB_MESSAGE_ERROR: Workflow failed. Causes:
S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
failed., Internal Issue (f63a1a8ef2449986): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.620Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.741Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.815Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:37:01.836Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:38:37.837Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:38:37.881Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:38:37.915Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-05-13_05_29_37-516065198538251450 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 3886.676s
FAILED (SKIP=7, errors=1)
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_54-6691575010704690333?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_29_04-6905818381407209859?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_37_27-14068304121544280204?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_45_31-13986356613772838745?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_53_00-6068890652075435656?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_01_21-878996813861506671?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_08_44-105438885225091356?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_12_01-16067180792617505480?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_26_06-9672443891814443414?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_35_15-13987130924263912203?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_43_38-10448652607247610940?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_51_03-6249599347374386428?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_59_21-8113733080291097800?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_53-5454015827545421042?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_30_45-8027662430202832033?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_38_50-4761544314294275817?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_47_09-15575753961779490530?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_54_52-2656288957199947494?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_02_21-8291307040337619450?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_57-5094300183262624825?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_24_13-16196030141671553008?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_31_31-9782391864305226854?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_39_34-9264285366515303484?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_47_27-6796415866049038732?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_54_51-12075494169016743896?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_55-13437552988320405344?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_20_24-8171833736580897744?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_29_37-516065198538251450?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_39_05-5916370799581990699?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_46_35-9620815662513001891?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_54_49-5797535143631262650?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_55-14523206402976865453?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_19_41-14673773916840745631?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_28_24-6870706485986577824?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_37_33-15723318494277446481?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_45_54-9992692915969327367?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_54_08-291052049195349811?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_02_08-5501409837816648845?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_55-7212525218148001114?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_20_58-13761003048446169101?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_29_37-5589464505816237483?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_37_25-10443042057273536253?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_44_51-12437842369403185874?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_52_05-13659119915646167839?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_59_34-6453876597634314727?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_11_55-1263608590886432886?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_21_54-5084683002309879016?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_32_14-2282286726107252958?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_40_38-14618178997966704418?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_57_53-16618423831621263631?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>'
line: 131
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:postCommitPy2IT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>'
line: 50
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
line: 116
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 6m 38s
123 actionable tasks: 97 executed, 24 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hs54r2jgaqwdu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]