See <https://builds.apache.org/job/beam_PostCommit_Python36/2428/display/redirect?page=changes>
Changes:
[mxm] [BEAM-9164] Re-enable UnboundedSourceWrapper#testWatermarkEmission test
[github] Merge pull request #11673 from [BEAM-9967] Adding support for BQ labels
[github] [BEAM-9622] Add Python SqlTransform test that joins tagged PCollections
------------------------------------------
[...truncated 12.15 MB...]
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.6/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
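
The AttributeError above appears to exercise the BigQuery job-labels path touched by the BEAM-9967 change listed at the top: bigquery_tools._start_query_job hands the label dict straight to an apitools message field, whose descriptor coerces a dict with t(**value), so every key must be a declared field of LabelsValue, and 'owner' is not. The snippet below is a minimal stand-alone sketch of that failure mode, not Beam's fix; the LabelsValue/JobConfiguration stand-ins and the 'beam-postcommit' label value are illustrative assumptions; only the 'owner' key comes from the error itself.

from apitools.base.protorpclite import messages


class LabelsValue(messages.Message):
    # Stand-in for the generated map message: it only declares
    # additionalProperties (repeated key/value pairs), not arbitrary fields.
    class AdditionalProperty(messages.Message):
        key = messages.StringField(1)
        value = messages.StringField(2)

    additionalProperties = messages.MessageField(
        'AdditionalProperty', 1, repeated=True)


class JobConfiguration(messages.Message):
    # Stand-in for the generated job configuration that carries the labels.
    labels = messages.MessageField(LabelsValue, 1)


# Assigning a plain dict reproduces the failure above: MessageField.__set__
# runs LabelsValue(**{'owner': ...}), and 'owner' is not a declared field.
try:
    JobConfiguration(labels={'owner': 'beam-postcommit'})
except AttributeError as err:
    print(err)  # May not assign arbitrary value owner to message LabelsValue

# What the generated message type does accept: wrapped key/value pairs.
ok = JobConfiguration(labels=LabelsValue(additionalProperties=[
    LabelsValue.AdditionalProperty(key='owner', value='beam-postcommit')]))
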
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:03.497Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.6/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
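
For context on the StopIteration that opens each of these entries: the frames suggest that apache_beam/utils/retry.py draws sleep intervals from a finite iterator and, once next(retry_intervals) is exhausted, re-raises the original exception via raise_with_traceback, which is why the AttributeError surfaces during handling of the StopIteration. A rough sketch of that shape follows; the decorator name, intervals and flaky_call are made up for illustration, and Beam's real module adds things like retry filters and logging that are omitted here.

import time


def with_retries(retry_intervals):
    # Hypothetical decorator mirroring the wrapper shape in the traceback.
    def decorator(fun):
        def wrapper(*args, **kwargs):
            intervals = iter(retry_intervals)
            while True:
                try:
                    return fun(*args, **kwargs)
                except Exception as exn:  # broad on purpose for the sketch
                    try:
                        sleep_interval = next(intervals)
                    except StopIteration:
                        # Retry budget exhausted: surface the original error,
                        # as raise_with_traceback(exn, exn_traceback) does.
                        raise exn
                    time.sleep(sleep_interval)
        return wrapper
    return decorator


@with_retries([0.1, 0.2, 0.4])
def flaky_call():
    raise AttributeError('May not assign arbitrary value owner to message LabelsValue')


# flaky_call() would make four attempts (three retries), then re-raise the
# AttributeError with the StopIteration recorded as its __context__.
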
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:27.696Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.6/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.6/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:27.983Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:28.073Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:28.110Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., Internal Issue (6e363d7f1f97b2a4): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:28.212Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:28.341Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:28.430Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:28.468Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:35:00.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:35:00.920Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:35:00.961Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-12_17_25_58-7178370031757576485 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_05-6232969502210731184?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_21_49-6675965101470535628?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_29_34-10299226034345754326?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_37_31-14077589991502102305?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_45_12-5077571977519532054?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_53_40-5291676326328088680?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_02-9065541080035856933?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_19_03-8390090983294239532?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_27_18-15694342715768805693?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_36_22-1940767956844036651?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_45_08-14929057893494032543?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_53_04-8886862321948595256?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_00-3703816969310314447?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_26_09-5462169008059404699?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_34_34-3472362715188003781?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_42_13-11618192972912873626?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_49_38-9527099229003734992?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_57_21-12977177531806340009?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_18_05_21-12864725727890862031?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_00-3250551014546525326?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_27_13-4224181745039763739?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_35_46-9563617312569495101?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_44_50-3195182242115535829?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_52_57-4699260977494009284?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_18_01_28-14832378166495240544?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_01-18445892206533896806?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_15_18-15888293517174642582?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_24_44-2161740391085668288?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_33_05-1812721060887641310?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_42_28-8126067310935867332?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_50_36-13716679400689506472?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_58_57-991939732937365549?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_59-3854326293223166344?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_15_13-10458060750112808995?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_23_51-9661308457043165988?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_33_26-1062665101858886599?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_41_43-5637580826417983195?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_49_59-794964989381535093?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_58_15-17233833482078247128?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_00-13157796651818439909?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_16_46-1729502392352797493?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_25_58-7178370031757576485?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_35_33-3436143222662072582?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_43_53-64142892746839060?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_53_02-8704957485331414309?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_01-2539871204953885375?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_17_18-1188236597477132153?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_28_23-12000734199786952177?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_37_00-2511638981902030357?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_54_19-4980005538926046443?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 4000.257s
FAILED (SKIP=7, errors=1)
> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/py36/build.gradle>' line: 59

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 8m 39s
86 actionable tasks: 63 executed, 23 from cache
Publishing build scan...
https://gradle.com/s/qjpevsbdvbdl2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure