See <https://builds.apache.org/job/beam_PostCommit_PythonVerify/565/changes>
Changes:

[robertwb] Windowed side input test.
[robertwb] Implement windowed side inputs for direct runner.
[robertwb] Fix tests expecting list from AsIter.
[robertwb] Implement windowed side inputs for InProcess runner.
[robertwb] More complicated window tests.
[robertwb] Optimize globally windowed side input case
[robertwb] Minor fixups for better testing
[robertwb] Rename from_iterable to avoid confusion.

------------------------------------------
[...truncated 2889 lines...]
            }, {
              "@type": "FastPrimitivesCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
              "component_encodings": []
            }
          ],
          "is_pair_like": true
        }, {
          "@type": "TimestampCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlwhmbmpxSWJuQXOID5XIYNmYyFjbSFTkh4ANWETWg==",
          "component_encodings": []
        }, {
          "@type": "SingletonCoder$<string of 252 bytes>",
          "component_encodings": []
        }
      ],
      "is_wrapper": true
    },
    "output_name": "out",
    "user_name": "write/WriteImpl/finalize_write.out"
  }
],
"parallel_input": {
  "@type": "OutputReference",
  "output_name": "out",
  "step_name": "s7"
},
"serialized_fn": "<string of 1292 bytes>",
"user_name": "write/WriteImpl/finalize_write"
} } ],
"type": "JOB_TYPE_BATCH"
}
INFO:root:Create job: <Job id: u'2016-10-18_13_48_20-13116714018908792973' projectId: u'apache-beam-testing' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:root:Created job with id: [2016-10-18_13_48_20-13116714018908792973]
INFO:root:To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2016-10-18_13_48_20-13116714018908792973
INFO:root:Job 2016-10-18_13_48_20-13116714018908792973 is in state JOB_STATE_RUNNING
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d55e6: 2016-10-18T20:48:20.966Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade31d): Checking required Cloud APIs are enabled.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5700: 2016-10-18T20:48:21.248Z: JOB_MESSAGE_DEBUG: (cc0f1c724dadedf5): Combiner lifting skipped for step write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5702: 2016-10-18T20:48:21.250Z: JOB_MESSAGE_DEBUG: (cc0f1c724dade8ab): Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5704: 2016-10-18T20:48:21.252Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade361): Expanding GroupByKey operations into optimizable parts.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5707: 2016-10-18T20:48:21.255Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee17): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d570e: 2016-10-18T20:48:21.262Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee39): Annotating graph with Autotuner information.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d573d: 2016-10-18T20:48:21.309Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade911): Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5740: 2016-10-18T20:48:21.312Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade3c7): Fusing consumer split into read
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5743: 2016-10-18T20:48:21.315Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee7d): Fusing consumer group/Reify into pair_with_one
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5746: 2016-10-18T20:48:21.318Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade933): Fusing consumer format into count
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5749: 2016-10-18T20:48:21.321Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade3e9): Fusing consumer write/WriteImpl/GroupByKey/GroupByWindow into write/WriteImpl/GroupByKey/Read
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d574b: 2016-10-18T20:48:21.323Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee9f): Fusing consumer write/WriteImpl/GroupByKey/Write into write/WriteImpl/GroupByKey/Reify
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5752: 2016-10-18T20:48:21.330Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade40b): Fusing consumer write/WriteImpl/FlatMap(<lambda at iobase.py:758>) into write/WriteImpl/GroupByKey/GroupByWindow
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5755: 2016-10-18T20:48:21.333Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadeec1): Fusing consumer count into group/GroupByWindow
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5758: 2016-10-18T20:48:21.336Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade977): Fusing consumer write/WriteImpl/WindowInto into write/WriteImpl/Map(<lambda at iobase.py:755>)
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d575b: 2016-10-18T20:48:21.339Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade42d): Fusing consumer write/WriteImpl/GroupByKey/Reify into write/WriteImpl/WindowInto
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d575d: 2016-10-18T20:48:21.341Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadeee3): Fusing consumer write/WriteImpl/Map(<lambda at iobase.py:755>) into write/WriteImpl/write_bundles
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5760: 2016-10-18T20:48:21.344Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade999): Fusing consumer pair_with_one into split
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5762: 2016-10-18T20:48:21.346Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade44f): Fusing consumer group/GroupByWindow into group/Read
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5765: 2016-10-18T20:48:21.349Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadef05): Fusing consumer write/WriteImpl/write_bundles into format
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d576a: 2016-10-18T20:48:21.354Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade9bb): Fusing consumer group/Write into group/Reify
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d57c1: 2016-10-18T20:48:21.441Z: JOB_MESSAGE_DEBUG: (cc0f1c724dade07b): Workflow config is missing a default resource spec.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d57c3: 2016-10-18T20:48:21.443Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadeb31): Adding StepResource setup and teardown to workflow graph.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d57f8: 2016-10-18T20:48:21.496Z: JOB_MESSAGE_DEBUG: (f4efa5dde1605941): Adding workflow start and stop steps.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5802: 2016-10-18T20:48:21.506Z: JOB_MESSAGE_DEBUG: (19e2a43502bd1fce): Assigning stage ids.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5865: 2016-10-18T20:48:21.605Z: JOB_MESSAGE_DEBUG: (16a2eef936374c8e): Executing wait step start2
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5870: 2016-10-18T20:48:21.616Z: JOB_MESSAGE_DEBUG: (16a2eef936374351): Executing operation write/WriteImpl/DoOnce
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5879: 2016-10-18T20:48:21.625Z: JOB_MESSAGE_BASIC: S01: (16a2eef93637483b): Executing operation group/Create
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d587d: 2016-10-18T20:48:21.629Z: JOB_MESSAGE_DEBUG: (b9ce855eb405fb82): Value "write/WriteImpl/DoOnce.out" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5895: 2016-10-18T20:48:21.653Z: JOB_MESSAGE_BASIC: S04: (cc0f1c724dade09d): Executing operation write/WriteImpl/initialize_write
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5944: 2016-10-18T20:48:21.828Z: JOB_MESSAGE_DEBUG: (3cceca373b352626): Starting worker pool setup.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5946: 2016-10-18T20:48:21.830Z: JOB_MESSAGE_BASIC: (3cceca373b352a50): Starting 1 workers...
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5959: 2016-10-18T20:48:21.849Z: JOB_MESSAGE_DEBUG: (decbc9edbb3cc0f3): Value "group/Session" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5964: 2016-10-18T20:48:21.860Z: JOB_MESSAGE_BASIC: S02: (37d85b62a91607ee): Executing operation read+split+pair_with_one+group/Reify+group/Write
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98f0022: 2016-10-18T20:50:10.082Z: JOB_MESSAGE_DETAILED: (9b51a8a70519ded7): Workers have started successfully.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fbebc: 2016-10-18T20:50:58.876Z: JOB_MESSAGE_DEBUG: (decbc9edbb3cc8a0): Value "write/WriteImpl/initialize_write.out" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fbec7: 2016-10-18T20:50:58.887Z: JOB_MESSAGE_BASIC: S05: (f4efa5dde1605ffb): Executing operation write/WriteImpl/ViewAsSingleton(write|WriteImpl|initialize_write.None)/CreatePCollectionView
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fbf21: 2016-10-18T20:50:58.977Z: JOB_MESSAGE_DEBUG: (41f77b7ab7cd6fa8): Value "write/WriteImpl/ViewAsSingleton(write|WriteImpl|initialize_write.None)/CreatePCollectionView.out" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc5a8: 2016-10-18T20:51:00.648Z: JOB_MESSAGE_BASIC: S03: (b9ce855eb405f56a): Executing operation group/Close
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc5c1: 2016-10-18T20:51:00.673Z: JOB_MESSAGE_BASIC: S06: (37d85b62a916051e): Executing operation write/WriteImpl/GroupByKey/Create
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc671: 2016-10-18T20:51:00.849Z: JOB_MESSAGE_DEBUG: (16a2eef9363745c1): Value "write/WriteImpl/GroupByKey/Session" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc683: 2016-10-18T20:51:00.867Z: JOB_MESSAGE_BASIC: S07: (848e4970f9b90b1f): Executing operation group/Read+group/GroupByWindow+count+format+write/WriteImpl/write_bundles+write/WriteImpl/Map(<lambda at iobase.py:755>)+write/WriteImpl/WindowInto+write/WriteImpl/GroupByKey/Reify+write/WriteImpl/GroupByKey/Write
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc847: 2016-10-18T20:51:01.319Z: JOB_MESSAGE_ERROR: (f57f9314dc1c4db6): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack
[...identical traceback repeated in 7 further JOB_MESSAGE_ERROR entries, 2016-10-18T20:51:01.503Z through 2016-10-18T20:51:02.675Z...]
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcdbc: 2016-10-18T20:51:02.716Z: JOB_MESSAGE_DEBUG: (b8f75ec1d16f6d1e): Executing failure step failure1
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcdbe: 2016-10-18T20:51:02.718Z: JOB_MESSAGE_ERROR: (b8f75ec1d16f6950): Workflow failed. Causes: (848e4970f9b90f7d): S07:group/Read+group/GroupByWindow+count+format+write/WriteImpl/write_bundles+write/WriteImpl/Map(<lambda at iobase.py:755>)+write/WriteImpl/WindowInto+write/WriteImpl/GroupByKey/Reify+write/WriteImpl/GroupByKey/Write failed.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcdf2: 2016-10-18T20:51:02.770Z: JOB_MESSAGE_DETAILED: (41f77b7ab7cd6989): Cleaning up.
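The worker error is a plain tuple-unpacking failure: `has_default, default = view_options` in `_read_side_inputs` expects a two-element value, but received one with a single element (consistent with the side-input changes in this build altering what the SDK serializes while the deployed worker still unpacks the old shape). A minimal sketch of the mechanism — the `view_options` values below are illustrative, not the actual Dataflow worker data:

```python
def read_side_input(view_options):
    # Mirrors the failing line in dataflow_worker/executor.py:
    #   has_default, default = view_options
    has_default, default = view_options
    return has_default, default

# A two-element tuple unpacks cleanly.
assert read_side_input((True, 0)) == (True, 0)

# A one-element tuple raises ValueError. Python 2 phrases it as
# "need more than 1 value to unpack" (as in the log above); Python 3
# says "not enough values to unpack (expected 2, got 1)".
try:
    read_side_input((True,))
except ValueError as exc:
    print("unpack failed:", exc)
```

The defensive variant (`has_default, default = view_options if len(view_options) == 2 else (False, None)`) would mask rather than fix a version skew like this, which is presumably why the worker unpacks strictly.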
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fce79: 2016-10-18T20:51:02.905Z: JOB_MESSAGE_DEBUG: (41f77b7ab7cd6cd4): Starting worker pool teardown.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fce7c: 2016-10-18T20:51:02.908Z: JOB_MESSAGE_BASIC: (41f77b7ab7cd6f06): Stopping worker pool...
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d99168ad: 2016-10-18T20:52:47.917Z: JOB_MESSAGE_BASIC: (41f77b7ab7cd6251): Worker pool stopped.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d9916cab: 2016-10-18T20:52:48.939Z: JOB_MESSAGE_DEBUG: (41f77b7ab7cd67ce): Tearing down pending resources...
INFO:root:Job 2016-10-18_13_48_20-13116714018908792973 is in state JOB_STATE_FAILED
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py", line 107, in <module>
    run()
  File "https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py", line 98, in run
    result = p.run()
  File "apache_beam/pipeline.py", line 159, in run
    return self.runner.run(self)
  File "apache_beam/runners/dataflow_runner.py", line 188, in run
    % getattr(self, 'last_error_msg', None), self.result)
apache_beam.runners.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed: (f57f9314dc1c44d7): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack
# Grep will exit with status 1 if success message was not found.
echo ">>> CHECKING JOB SUCCESS"
>>> CHECKING JOB SUCCESS
grep JOB_STATE_DONE job_output
Build step 'Execute shell' marked build as failure
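The build's final gate is `grep JOB_STATE_DONE job_output`: grep exits with status 1 when the pattern is absent, and Jenkins marks the shell step, and thus the build, as failed. The same check can be sketched in Python (the `job_output` file name comes from the shell step; the log contents below are illustrative):

```python
import tempfile

def job_succeeded(log_path):
    """True if the captured job log contains the success marker --
    the same test as `grep JOB_STATE_DONE job_output` in the build step."""
    with open(log_path) as f:
        return any("JOB_STATE_DONE" in line for line in f)

# Stand-in for the captured job_output of the failed run above.
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as f:
    f.write("INFO:root:Job 2016-10-18_... is in state JOB_STATE_FAILED\n")
    path = f.name

print(job_succeeded(path))  # prints False: no JOB_STATE_DONE marker
```

A CLI wrapper would translate the boolean into grep-style exit codes with `sys.exit(0 if job_succeeded("job_output") else 1)`.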