See <https://builds.apache.org/job/beam_PostCommit_Python37/2340/display/redirect?page=changes>
Changes:
[mxm] [BEAM-9164] Re-enable UnboundedSourceWrapper#testWatermarkEmission test
[github] Merge pull request #11673 from [BEAM-9967] Adding support for BQ labels
[github] [BEAM-9622] Add Python SqlTransform test that joins tagged PCollections
------------------------------------------
[...truncated 12.24 MB...]
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
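For context on the failure above: apitools' protorpclite rejects attribute names that are not declared fields of a message, so populating a message field from a dict whose keys are not declared fields (here, the user label key `owner` on `LabelsValue`) raises this AttributeError. A minimal toy sketch of that mechanism — the classes below are simplified stand-ins, not the real apitools implementation:

```python
# Toy model (NOT the real apitools code) of why assigning a dict of labels
# raises "May not assign arbitrary value owner to message LabelsValue".
# In protorpclite, setting a message-typed field from a dict calls the target
# message's constructor with the dict's keys as keyword arguments; any key
# that is not a declared field raises AttributeError.

class Message:
    """Stand-in for protorpclite's Message base class."""
    _declared_fields = ()

    def __init__(self, **kwargs):
        # Mirrors messages.py __init__: setattr for each keyword argument.
        for name, value in kwargs.items():
            setattr(self, name, value)

    def __setattr__(self, name, value):
        # Mirrors messages.py __setattr__: only declared fields may be set.
        if name not in self._declared_fields:
            raise AttributeError(
                "May not assign arbitrary value %s to message %s"
                % (name, type(self).__name__))
        object.__setattr__(self, name, value)


class LabelsValue(Message):
    # A proto map<string, string> is carried in repeated
    # additionalProperties entries, not as arbitrary attributes.
    _declared_fields = ("additionalProperties",)


# Passing user labels straight through as keyword arguments fails:
try:
    LabelsValue(**{"owner": "beam-testing"})
except AttributeError as e:
    print(e)  # May not assign arbitrary value owner to message LabelsValue
```

This suggests the labels dict needs to be converted into the message's declared representation (e.g. `additionalProperties` entries) before assignment, rather than splatted into the constructor.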
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:10.655Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:37.148Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.7/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:37.472Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:37.559Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:37.595Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., Internal Issue (1014e6c6ce203a7a): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:37.728Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:37.884Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:38.027Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:38.069Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:35:22.931Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:35:22.995Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:35:23.037Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-12_17_26_18-9445801295698277572 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_50-13909974446828616361?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_22_16-2231408444091982188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_30_42-317972399226545788?project=apache-beam-testing
----------------------------------------------------------------------
Ran 58 tests in 4010.719s
FAILED (SKIP=7, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_39_03-3897911026392178116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_47_10-1534145555358504365?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_57_21-7547805449401278049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_46-13849491744272860657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_26_55-5534262355398813431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_35_05-8932269429022930565?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_43_22-3329481441383333171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_52_23-12533411123001577439?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_18_00_58-1574065692897645714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_49-430500153985345550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_20_16-9183598564677583508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_27_50-3348243577112096072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_36_56-8498638854604971707?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_45_36-16085605021066527251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_53_41-11438236284705699833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_46-9227434239238273659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_25_30-3016680511990181796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_33_20-15374413238019954084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_41_42-2914849347848530362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_49_43-5021870218097806969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_58_00-11574986626996741934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_48-5569584330705077010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_16_17-16018781853731370116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_26_24-575904726446705198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_34_57-11170615731438570758?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_43_05-6077929818892854971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_50_47-16071723688564799171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_58_59-10634726752014040425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_18_06_48-2209218691592827250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_46-15465270074709318326?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_15_55-2837209126643690870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_24_34-17907618522709562389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_34_08-1195484427852833934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_43_15-9363986994199940796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_51_20-16678256489775280800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_59_21-1091964887408485576?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_47-4769174293695732562?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_17_10-14683099251452591522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_26_18-9445801295698277572?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_35_54-12156223399710748915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_44_33-4391744406673639908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_52_44-13657428561710245134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_07_49-10364327905295235165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_18_43-2609649914716756014?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_29_29-11821702884261459092?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_37_37-9707577317858755788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_57_00-8603124245182544632?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/py37/build.gradle>' line: 59
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>' line: 51
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 120
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 8m 34s
87 actionable tasks: 64 executed, 23 from cache
Publishing build scan...
https://gradle.com/s/ff6bqiratrw44
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]