See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4477/display/redirect?page=changes>
Changes:
[Pablo] Fixing check for sideinput_io_metrics experiment flag.
[iemejia] Remove testing package-info from main package for GCP IO
[iemejia] Update maven failsafe/surefire plugin to version 2.21.0
[iemejia] [BEAM-3873] Update commons-compress to version 1.16.1 (fix
[iemejia] Remove maven warnings
[tgroh] Add Side Inputs to ExecutableStage
------------------------------------------
[...truncated 1.12 MB...]
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-21_15_53_16-214525430219642121]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-21_15_53_16-214525430219642121?project=apache-beam-testing
root: INFO: Job 2018-03-21_15_53_16-214525430219642121 is in state JOB_STATE_PENDING
root: INFO: 2018-03-21T22:53:16.074Z: JOB_MESSAGE_WARNING: Job 2018-03-21_15_53_16-214525430219642121 might autoscale up to 250 workers.
root: INFO: 2018-03-21T22:53:16.111Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-21_15_53_16-214525430219642121. The number of workers will be between 1 and 250.
root: INFO: 2018-03-21T22:53:16.139Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-21_15_53_16-214525430219642121.
root: INFO: 2018-03-21T22:53:19.047Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-21T22:53:19.158Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-21T22:53:19.459Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T22:53:19.475Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-21T22:53:19.513Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2018-03-21T22:53:19.549Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T22:53:19.571Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-21T22:53:19.605Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-21T22:53:19.649Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-21T22:53:19.678Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T22:53:19.710Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T22:53:19.732Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T22:53:19.756Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1 into write/Write/WriteImpl/Extract
root: INFO: 2018-03-21T22:53:19.789Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2018-03-21T22:53:19.829Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
root: INFO: 2018-03-21T22:53:19.859Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2018-03-21T22:53:19.889Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2018-03-21T22:53:19.916Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2018-03-21T22:53:19.948Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2018-03-21T22:53:19.972Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2018-03-21T22:53:20.009Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2018-03-21T22:53:20.051Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2018-03-21T22:53:20.085Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
root: INFO: 2018-03-21T22:53:20.118Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2018-03-21T22:53:20.151Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2018-03-21T22:53:20.187Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2018-03-21T22:53:20.220Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2018-03-21T22:53:20.248Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T22:53:20.281Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T22:53:20.315Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T22:53:20.349Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/PreFinalize/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T22:53:20.380Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T22:53:20.415Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/MapToVoidKey0 into write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-03-21T22:53:20.441Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2018-03-21T22:53:20.468Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2 into write/Write/WriteImpl/PreFinalize/PreFinalize
root: INFO: 2018-03-21T22:53:20.498Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2 into write/Write/WriteImpl/PreFinalize/PreFinalize
root: INFO: 2018-03-21T22:53:20.527Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-21T22:53:20.547Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-21T22:53:20.572Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-21T22:53:20.597Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-21T22:53:20.742Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2018-03-21T22:53:20.794Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0
root: INFO: 2018-03-21T22:53:20.817Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2018-03-21T22:53:20.828Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-21T22:53:20.849Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2018-03-21T22:53:20.864Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: Job 2018-03-21_15_53_16-214525430219642121 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-21T22:53:28.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T22:53:28.696Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2018-03-21T22:53:28.719Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2018-03-21T22:53:28.779Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2018-03-21T22:53:33.908Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 1 to 2.
root: INFO: 2018-03-21T22:53:56.213Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T22:53:56.237Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 2. This could be a quota issue.
root: INFO: 2018-03-21T22:54:12.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T22:55:37.405Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-21T22:58:25.993Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2018-03-21T22:58:31.577Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
root: INFO: 2018-03-21T22:58:31.605Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T22:58:31.640Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T22:58:31.663Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T22:58:31.700Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T22:58:31.723Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T22:58:31.760Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T22:58:31.791Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T22:58:31.836Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T22:58:31.893Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T22:58:31.953Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2018-03-21T22:58:34.462Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:37.866Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:41.271Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:44.338Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:44.657Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:45.729Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:48.038Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T22:58:48.087Z: JOB_MESSAGE_DEBUG: Executing failure step failure25
root: INFO: 2018-03-21T22:58:48.109Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S08:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on:
  beamapp-jenkins-032122531-03211553-9d33-harness-8sg5,
  beamapp-jenkins-032122531-03211553-9d33-harness-8sg5,
  beamapp-jenkins-032122531-03211553-9d33-harness-8sg5,
  beamapp-jenkins-032122531-03211553-9d33-harness-8sg5
root: INFO: 2018-03-21T22:58:48.231Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-21T22:58:48.279Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-21T22:58:48.312Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-21T23:00:20.567Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T23:00:20.897Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-21_15_53_16-214525430219642121 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 3 tests in 488.818s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-21_15_53_15-7902778553015328114?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-21_15_53_14-15407973245667696338?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-21_15_53_16-214525430219642121?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
