See <https://builds.apache.org/job/beam_PostCommit_Python2/1431/display/redirect>
Changes:
------------------------------------------
[...truncated 6.91 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:39.609Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
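
The failure is in a test dependency, not in Beam itself: PyHamcrest dropped Python 2 support as of its 2.0 release, so the installed hasproperty.py uses syntax the Python 2.7 worker cannot even compile, and the import dies before any test logic runs. Below is a minimal sketch of the failure mode; the pyhamcrest<2.0 pin is the usual remediation for this class of break, not something this log confirms was applied.

# Sketch of the failure mode under Python 2.7: PyHamcrest >= 2.0 is
# Python 3-only, so compiling hamcrest/library/object/hasproperty.py
# raises SyntaxError at import time, before any test code executes.
try:
    from hamcrest.library.number.ordering_comparison import greater_than
except SyntaxError:
    # Hypothetical remediation: constrain the test dependency so that
    # Python 2 workers resolve a compatible release, e.g.
    #   pip install 'pyhamcrest>=1.9,<2.0'
    raise
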
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:42.762Z: JOB_MESSAGE_ERROR: [traceback identical to the one above; omitted]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:45.885Z: JOB_MESSAGE_ERROR: [traceback identical to the one above; omitted]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:45.909Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:45.983Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:46.009Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  beamapp-jenkins-011218405-01121041-9k4z-harness-zzs9
      Root cause: Work item failed.,
  beamapp-jenkins-011218405-01121041-9k4z-harness-zzs9
      Root cause: Work item failed.,
  beamapp-jenkins-011218405-01121041-9k4z-harness-zzs9
      Root cause: Work item failed.,
  beamapp-jenkins-011218405-01121041-9k4z-harness-zzs9
      Root cause: Work item failed.
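
For context on why a worker-side import happens here at all: DoOperation.setup unpickles the serialized user function via apache_beam.internal.pickler, which delegates to dill, and dill restores callables defined in importable modules by re-importing the defining module; that re-import is what drags in hamcrest. A standalone sketch of the round trip (plain dill, not Beam's actual wiring):

# Standalone sketch of the dumps/loads round trip that the worker
# performs in DoOperation.setup via pickler.loads (plain dill here).
import dill

def add_one(x):
    return x + 1

payload = dill.dumps(add_one)    # launcher side: serialize the callable

# Worker side: deserialize. For callables defined in an importable
# module, dill stores a module reference and re-imports it on load;
# the dataflow_exercise_metrics_pipeline -> hamcrest import chain in
# the traceback above is exactly this step going wrong.
restored = dill.loads(payload)
assert restored(1) == 2
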
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:46.126Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:46.192Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:45:46.226Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:47:14.684Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:47:14.728Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-12T18:47:14.767Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-12_10_41_11-12694801198564389485 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3451.987s
FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_47-7075459393444262316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_20_04-8577204602499488982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_27_27-2344956400453111116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_34_21-7669449527162954009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_41_57-10601365011796119735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_45-12195231201029202895?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_23_58-11266063220037089521?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_46-11823807459349445349?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_17_34-11216916637936549671?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_23_32-9073223419225887930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_30_00-8331244783993286745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_36_27-14228475860446623015?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_44-6129875654658510408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_22_18-9697292349933211522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_29_00-17358093690779596047?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_36_06-11033230424607463861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_42_53-5034303433448207000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_49_48-17843961057350258583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_56_10-10579682606483788135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_44-76288528710021843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_12_56-17321324229293905206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_20_50-4872901823174348520?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_27_17-4106552826896812269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_33_45-13850909500172482378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_40_29-6982668335953133336?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_44-5674127805991648013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_12_53-13551352766818977034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_21_30-18354538572848589808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_28_18-7172663595862382549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_34_45-1095069307492985404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_41_11-12694801198564389485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_45-12278585167996551144?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_13_22-13896476963204342615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_20_32-14880009890539510800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_27_18-14470051869686750170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_34_21-2240462767409183118?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_40_45-11021783351409456988?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_05_45-11311996664312913675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_13_42-9842113019740689796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_23_00-18039229896735721972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_30_48-10079204471032002189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-12_10_37_45-10408408034069779169?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 58m 45s
121 actionable tasks: 96 executed, 22 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/xo3sgutfif6cq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]