See <https://builds.apache.org/job/beam_PostCommit_Python2/1388/display/redirect?page=changes>
Changes:
[tvalentyn] [BEAM-9062] Improve assertion error for equal_to (#10504)
[chamikara] [BEAM-8960]: Add an option for user to opt out of using insert id for
------------------------------------------
[...truncated 6.21 MB...]
File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2,
in <module>
from hamcrest.library import *
File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py",
line 7, in <module>
from hamcrest.library.object import *
File
"/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py",
line 4, in <module>
from .hasproperty import has_properties, has_property
File
"/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py",
line 174
),
^
SyntaxError: invalid syntax
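
[Editor's note: the SyntaxError above is raised while importing hamcrest on the Python 2.7 worker, before any test code runs. A plausible cause, not confirmed by this log, is that a PyHamcrest release supporting only Python 3 was installed; the actual source at hasproperty.py line 174 is not shown here. A minimal sketch of the same failure mode follows; the code is hypothetical, not the hasproperty.py source.]

    # A trailing comma after a *args argument in a call is valid in Python 3
    # but a SyntaxError in Python 2, reported on the closing-parenthesis line,
    # which matches the caret position in the log above.
    def _describe(*parts):
        return " ".join(parts)

    text = _describe(
        "has_properties",
        *["name", "expected"],  # python2.7 rejects this trailing comma
    )
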
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:48.332Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 647, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 650, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 259, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 264, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 595, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 600, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 288, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 24, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:51.486Z: JOB_MESSAGE_ERROR: [traceback identical to the one above: SyntaxError: invalid syntax at hamcrest/library/object/hasproperty.py, line 174]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:54.677Z: JOB_MESSAGE_ERROR: [traceback identical to the one above: SyntaxError: invalid syntax at hamcrest/library/object/hasproperty.py, line 174]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:54.711Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:54.792Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:54.827Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
  beamapp-jenkins-010723574-01071557-efyx-harness-ljxj Root cause: Work item failed.,
  beamapp-jenkins-010723574-01071557-efyx-harness-ljxj Root cause: Work item failed.,
  beamapp-jenkins-010723574-01071557-efyx-harness-ljxj Root cause: Work item failed.,
  beamapp-jenkins-010723574-01071557-efyx-harness-ljxj Root cause: Work item failed.
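
[Editor's note: the traceback explains why every attempt dies before user code runs: DoOperation.setup unpickles the serialized DoFn (pickler.loads -> dill -> __import__), which re-imports dataflow_exercise_metrics_pipeline, whose module-level hamcrest import hits the broken package. A common mitigation, sketched below on the assumption that an incompatible PyHamcrest release is the culprit, is to pin the test dependency; the version bounds are assumptions, not taken from this log.]

    # Hypothetical setup.py fragment; the bounds assume PyHamcrest dropped
    # Python 2 support starting with its 1.10/2.0 line.
    REQUIRED_TEST_PACKAGES = [
        'pyhamcrest>=1.9,!=1.10.0,<2.0',  # keep a Python-2-compatible release
    ]
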
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:54.956Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:55.044Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:02:55.076Z: JOB_MESSAGE_BASIC: Stopping worker pool...
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:04:21.990Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:04:22.079Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-08T00:04:22.116Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-07_15_57_56-10686799352219695957 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3346.256s
FAILED (SKIP=7, errors=7)
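
[Editor's note: since this is a pure import-time failure, it can be reproduced without Dataflow. A minimal check, assuming it is run under python2.7 with the same hamcrest release the workers had:]

    # Importing hamcrest is enough to trigger the parse of hasproperty.py;
    # a SyntaxError raised while importing another module is catchable here.
    try:
        import hamcrest
        print("hamcrest %s imported OK" % getattr(hamcrest, "__version__", "?"))
    except SyntaxError as err:
        print("incompatible hamcrest release: %s" % err)
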
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_40-1624538148651984039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_35_31-14760068470412295977?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_42_45-12040022690678608211?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_50_20-6722422963823606310?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_58_27-6824435062791671681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_35-13571115710961930758?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_39_20-5193659183186692160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_47_41-17386769872504919027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_55_01-13112028378049015999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_39-2316248591188645436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_34_09-6403562742971955069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_40_56-4864750668912513975?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_47_24-13101875624794161714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_54_22-8343304255835759556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_37-8676743175487258865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_37_16-1217528974880528194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_43_23-9386089178942556120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_50_20-11573556783565272060?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_58_01-13757557212183181826?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_16_04_26-16529118781647361728?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_16_10_27-10588255875882055024?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_37-1936174245964620213?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_28_50-17420104748072797991?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_36_18-8570121229325419522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_42_59-4365169451590902465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_50_22-14875714616603645623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_57_27-7647583409196612489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_34-190085547762373409?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_28_15-9911845093575407302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_36_29-7733352014564253483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_42_45-11115864600859328011?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_49_39-6086795430692331000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_57_14-13761770540644114076?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_37-2069785250763764650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_29_16-691872949103256633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_36_40-10959984580236189272?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_43_52-11892983736006318100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_50_49-18307542038745557834?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_57_56-10686799352219695957?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_21_38-2757446576412452489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_29_55-9945295860792739074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-07_15_39_55-4046454789183453022?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 57m 2s
121 actionable tasks: 97 executed, 21 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/kunhwof4wjxkq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]