See <https://builds.apache.org/job/beam_PostCommit_Python2/1439/display/redirect?page=changes>
Changes:

[dcavazos] [BEAM-7390] Add code snippet for Min

[dcavazos] [BEAM-7390] Add code snippet for Sum

[ehudm] Light cleanup of opcodes.py

[pabloem] [BEAM-7390] Add code snippet for Top (#10179)

[bhulette] [BEAM-8993] [SQL] MongoDB predicate push down. (#10417)

------------------------------------------
[...truncated 6.84 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:43.950Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:47.072Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.188Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.222Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.289Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.311Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors.
The work item was attempted on these workers:
  beamapp-jenkins-011319510-01131151-ltuz-harness-q6sr Root cause: Work item failed.,
  beamapp-jenkins-011319510-01131151-ltuz-harness-q6sr Root cause: Work item failed.,
  beamapp-jenkins-011319510-01131151-ltuz-harness-q6sr Root cause: Work item failed.,
  beamapp-jenkins-011319510-01131151-ltuz-harness-q6sr Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.416Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.479Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:55:50.504Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:57:13.052Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:57:13.103Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T19:57:13.143Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-13_11_51_19-5466023971455048033 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3440.208s

FAILED (SKIP=7, errors=7)

Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_15_01-12459152256976480713?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_31_09-7035905525680593147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_37_52-18255204637306960297?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_44_26-15940827395872006023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_51_14-3367530587196242841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_58_31-18418318540210942010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_12_05_10-12954034268037053581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_15_04-16031719307772545710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_29_25-13697450128903894126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_37_02-5240798605001417002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_44_13-6356757152636993938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_51_47-17200462273242229197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_15_01-11912589824153612762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_34_43-18218613021839553065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_15_04-10915848553560317392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_27_20-7602282736469965719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_33_55-7940566145952596940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_40_28-9499860399186560346?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_47_02-763041119977741150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_14_59-6416084083453882066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_22_50-3256875860216815260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_30_27-1742400895398716071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_38_16-345517021469069037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_44_55-8181798480909726123?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_51_19-5466023971455048033?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_15_02-5915451319465813484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_23_00-6672404178269956367?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_30_10-602012261661603677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_36_54-5484436870668064892?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_43_33-4818438698483161606?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_50_37-11488473432743090845?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_14_59-16397873057272835562?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_22_10-7281564190529937963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_30_30-16632091830028184205?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_37_24-1080431149316616992?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_43_50-13721994593979761507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_50_39-6679909790718083168?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_15_00-15756958311829527942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_23_33-8646353362631294382?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_33_14-1818191167159628050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_41_12-14385330979574484093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_11_48_27-8876440945173763899?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 43s

121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/6ltmsvlmpzsxg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org