See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4472/display/redirect?page=changes>
Changes:
[herohde] Add Go support for universal runners, incl Flink
[herohde] CR: Fixed comments for job service helper functions
[herohde] [BEAM-3893] Add fallback to unauthenticated access for GCS IO
[robertwb] [BEAM-2927] Python support for dataflow portable side inputs over Fn API
[herohde] CR: fix typo
[aaltay] [BEAM-3861] Improve test infra in Python SDK for streaming end-to-end
------------------------------------------
[...truncated 1.12 MB...]
  steps: []
  tempFiles: []
  type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-20_20_50_21-2321371576352290335]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_50_21-2321371576352290335?project=apache-beam-testing
root: INFO: Job 2018-03-20_20_50_21-2321371576352290335 is in state JOB_STATE_PENDING
root: INFO: 2018-03-21T03:50:21.226Z: JOB_MESSAGE_WARNING: Job 2018-03-20_20_50_21-2321371576352290335 might autoscale up to 250 workers.
root: INFO: 2018-03-21T03:50:21.255Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-20_20_50_21-2321371576352290335. The number of workers will be between 1 and 250.
root: INFO: 2018-03-21T03:50:21.282Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-20_20_50_21-2321371576352290335.
root: INFO: 2018-03-21T03:50:25.136Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-21T03:50:25.433Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-21T03:50:26.351Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T03:50:26.369Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-21T03:50:26.400Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2018-03-21T03:50:26.429Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T03:50:26.452Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-21T03:50:26.477Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-21T03:50:26.518Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-21T03:50:26.545Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T03:50:26.566Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T03:50:26.588Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T03:50:26.615Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T03:50:26.641Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2018-03-21T03:50:26.664Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2018-03-21T03:50:26.691Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2018-03-21T03:50:26.722Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2018-03-21T03:50:26.749Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2018-03-21T03:50:26.781Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2018-03-21T03:50:26.811Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2018-03-21T03:50:26.838Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2018-03-21T03:50:26.863Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2018-03-21T03:50:26.890Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2018-03-21T03:50:26.918Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2018-03-21T03:50:26.940Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2018-03-21T03:50:26.965Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2018-03-21T03:50:26.996Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2018-03-21T03:50:27.028Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T03:50:27.051Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T03:50:27.063Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T03:50:27.095Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T03:50:27.123Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T03:50:27.149Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T03:50:27.174Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2018-03-21T03:50:27.205Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2 into write/Write/WriteImpl/PreFinalize/PreFinalize
root: INFO: 2018-03-21T03:50:27.227Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2 into write/Write/WriteImpl/PreFinalize/PreFinalize
root: INFO: 2018-03-21T03:50:27.245Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-21T03:50:27.284Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-21T03:50:27.315Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-21T03:50:27.338Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-21T03:50:27.463Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2018-03-21T03:50:27.522Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0
root: INFO: 2018-03-21T03:50:27.549Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2018-03-21T03:50:27.562Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-21T03:50:27.577Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2018-03-21T03:50:27.589Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: Job 2018-03-20_20_50_21-2321371576352290335 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-21T03:50:35.923Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T03:50:35.984Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2018-03-21T03:50:36.013Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2018-03-21T03:50:36.068Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2018-03-21T03:50:41.185Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 1 to 2.
root: INFO: 2018-03-21T03:50:58.435Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T03:50:58.463Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 2. This could be a quota issue.
root: INFO: 2018-03-21T03:51:13.366Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-21T03:51:14.303Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T03:55:47.885Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
root: INFO: 2018-03-21T03:55:47.904Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T03:55:47.925Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T03:55:47.945Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T03:55:47.960Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T03:55:47.979Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T03:55:47.994Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T03:55:48.051Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T03:55:48.075Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T03:55:48.125Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T03:55:53.974Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2018-03-21T03:55:54.027Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2018-03-21T03:55:57.356Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 341, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 373, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T03:55:58.937Z: JOB_MESSAGE_ERROR: Traceback (most recent call last): [...identical traceback, ending in: AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'...]
root: INFO: 2018-03-21T03:56:00.734Z: JOB_MESSAGE_ERROR: Traceback (most recent call last): [...identical traceback...]
root: INFO: 2018-03-21T03:56:01.376Z: JOB_MESSAGE_ERROR: Traceback (most recent call last): [...identical traceback...]
root: INFO: 2018-03-21T03:56:03.130Z: JOB_MESSAGE_ERROR: Traceback (most recent call last): [...identical traceback...]
root: INFO: 2018-03-21T03:56:03.747Z: JOB_MESSAGE_ERROR: Traceback (most recent call last): [...identical traceback...]
root: INFO: 2018-03-21T03:56:04.494Z: JOB_MESSAGE_ERROR: Traceback (most recent call last): [...identical traceback...]
root: INFO: 2018-03-21T03:56:04.531Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2018-03-21T03:56:04.561Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S08:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: beamapp-jenkins-032103501-03202050-b404-harness-xkqb, beamapp-jenkins-032103501-03202050-b404-harness-xkqb, beamapp-jenkins-032103501-03202050-b404-harness-6n3q, beamapp-jenkins-032103501-03202050-b404-harness-6n3q
root: INFO: 2018-03-21T03:56:04.665Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-21T03:56:04.701Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-21T03:56:04.728Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-21T03:57:33.514Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T03:57:33.554Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-21T03:57:33.942Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-20_20_50_21-2321371576352290335 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 3 tests in 449.219s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_50_21-2321371576352290335?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_50_21-3434922706874619032?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_50_20-2198175731286185508?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user aal...@gmail.com