See 
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/1392/display/redirect?page=changes>

Changes:

[ansela] Better name for batch implementation of GroupAlsoByWindow.

[ansela] Implementation of GroupAlsoByWindowViaWindowSet for the Spark runner.

[ansela] Utils for SparkGroupAlsoByWindowViaWindowSet.

[ansela] Refactor translators according to new GroupAlsoByWindow implementation.

[ansela] Fix streaming translation of Flatten and Window, make CreateStream

[ansela] Test triggers, panes and watermarks via CreateStream.

[ansela] Remove streaming tests that were needed before supporting the model.

[ansela] Use TestSparkRunner in tests.

[ansela] Handle test failures in "graceful stop period".

[ansela] Serialize state stream with coders for shuffle and checkpointing.

[ansela] Add multi stream and flattened stream tests.

[ansela] Make TimestampTransform Serializable.

[ansela] Batch executions should block without timeout.

[ansela] Tests that can should run with TestSparkRunner.

[ansela] Batch doesn't use checkpoint dir so nothing to clean.

[ansela] Fix runnable-on-service profile in the Spark runner.

[ansela] Build trigger state machine from Runner API Trigger proto directly

[ansela] Use a PipelineRule for test pipelines.

[ansela] Move LateDataUtils and UnsupportedSideInputReader to runners-core.

[ansela] Make CreateStream a TimestampedValue Source.

[ansela] Spark GABWVOB to use UnsupportedSideInputReader.

[ansela] Streaming tests, especially the ones using checkpoints, need a time

[robertwb] Make side inputs a map, rather than embedding the name in the message.

------------------------------------------
[...truncated 1.35 MB...]
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa0b: 
2017-03-01T01:00:09.099Z: JOB_MESSAGE_DEBUG: (f1e3853bdcb9595c): Assigning 
stage ids.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa27: 
2017-03-01T01:00:09.127Z: JOB_MESSAGE_DEBUG: (1366d4357648ee13): Executing wait 
step start13
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa2f: 
2017-03-01T01:00:09.135Z: JOB_MESSAGE_DEBUG: (1366d4357648e889): Executing 
operation start
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa31: 
2017-03-01T01:00:09.137Z: JOB_MESSAGE_DEBUG: (a52004149c54dbe8): Executing 
operation side
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa3a: 
2017-03-01T01:00:09.146Z: JOB_MESSAGE_DEBUG: (a52004149c54d927): Value 
"start.out" materialized.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa3c: 
2017-03-01T01:00:09.148Z: JOB_MESSAGE_DEBUG: (1366d4357648e768): Value 
"side.out" materialized.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa43: 
2017-03-01T01:00:09.155Z: JOB_MESSAGE_BASIC: S01: (1366d4357648e647): Executing 
operation ViewAsIterable(side.None)/CreatePCollectionView
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa4d: 
2017-03-01T01:00:09.165Z: JOB_MESSAGE_DEBUG: (1366d4357648edf8): Value 
"ViewAsIterable(side.None)/CreatePCollectionView.out" materialized.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761aa54: 
2017-03-01T01:00:09.172Z: JOB_MESSAGE_BASIC: S02: (1366d4357648e140): Executing 
operation assert_that/Group/Create
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761ab20: 
2017-03-01T01:00:09.376Z: JOB_MESSAGE_DEBUG: (63309dc7a68f0044): Starting 
worker pool setup.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761ab22: 
2017-03-01T01:00:09.378Z: JOB_MESSAGE_BASIC: (63309dc7a68f0826): Starting 1 
workers...
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761ab2e: 
2017-03-01T01:00:09.390Z: JOB_MESSAGE_DEBUG: (1366d4357648e62c): Value 
"assert_that/Group/Session" materialized.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8761ab36: 
2017-03-01T01:00:09.398Z: JOB_MESSAGE_BASIC: S03: (1366d4357648e50b): Executing 
operation 
compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87641394: 
2017-03-01T01:02:47.188Z: JOB_MESSAGE_DETAILED: (43562705ab083211): Workers 
have started successfully.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87652925: 
2017-03-01T01:03:58.245Z: JOB_MESSAGE_ERROR: (11c7c605f760509c): Traceback 
(most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 544, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in 
dataflow_worker.executor.Operation.output (dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in 
apache_beam.runners.common.DoFnRunner.receive 
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in 
apache_beam.runners.common.DoFnRunner.reraise_augmented 
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in 
apache_beam.runners.common.DoFnRunner._dofn_invoker 
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in 
apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker 
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in 
apache_beam.runners.common.DoFnRunner._process_outputs 
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in 
dataflow_worker.executor.ConsumerSet.update_counters_start 
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in 
dataflow_worker.opcounters.OperationCounters.update_from 
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in 
dataflow_worker.opcounters.OperationCounters.do_sample 
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in 
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in 
apache_beam.coders.stream.OutputStream.__cinit__ 
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has 
no attribute 'buffer_size' [while running 'compute']

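The repeated JOB_MESSAGE_ERROR tracebacks in this log all show the same failure: OutputStream.__cinit__ in apache_beam/coders/stream.pyx assigns self.buffer_size, and the ByteCountingOutputStream instance rejects the attribute. As a rough, hypothetical pure-Python analogy (not Beam's actual code), the same AttributeError appears when a base class assigns an attribute that the instance's fixed layout, mimicked here with __slots__, does not declare:

    # Hypothetical pure-Python sketch, not the real Cython coder code: the
    # __slots__ layout stands in for the fixed attribute layout of a compiled
    # cdef class whose declared fields no longer match the running sources.
    class OutputStream(object):
        __slots__ = ()  # assume a stale layout with no 'buffer_size' field

        def __init__(self):
            self.buffer_size = 1024  # raises AttributeError, as in the log

    class ByteCountingOutputStream(OutputStream):
        __slots__ = ()

    try:
        ByteCountingOutputStream()
    except AttributeError as e:
        # 'ByteCountingOutputStream' object has no attribute 'buffer_size'
        print(e)
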
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87653579: 
2017-03-01T01:04:01.401Z: JOB_MESSAGE_ERROR: (11c7c605f76059dd): Traceback 
(most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 544, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in 
dataflow_worker.executor.Operation.output (dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in 
apache_beam.runners.common.DoFnRunner.receive 
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in 
apache_beam.runners.common.DoFnRunner.reraise_augmented 
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in 
apache_beam.runners.common.DoFnRunner._dofn_invoker 
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in 
apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker 
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in 
apache_beam.runners.common.DoFnRunner._process_outputs 
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in 
dataflow_worker.executor.ConsumerSet.update_counters_start 
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in 
dataflow_worker.opcounters.OperationCounters.update_from 
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in 
dataflow_worker.opcounters.OperationCounters.do_sample 
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in 
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in 
apache_beam.coders.stream.OutputStream.__cinit__ 
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has 
no attribute 'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a876541f9: 
2017-03-01T01:04:04.601Z: JOB_MESSAGE_ERROR: (11c7c605f760531e): Traceback 
(most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 544, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in 
dataflow_worker.executor.Operation.output (dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in 
apache_beam.runners.common.DoFnRunner.receive 
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in 
apache_beam.runners.common.DoFnRunner.reraise_augmented 
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in 
apache_beam.runners.common.DoFnRunner._dofn_invoker 
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in 
apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker 
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in 
apache_beam.runners.common.DoFnRunner._process_outputs 
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in 
dataflow_worker.executor.ConsumerSet.update_counters_start 
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in 
dataflow_worker.opcounters.OperationCounters.update_from 
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in 
dataflow_worker.opcounters.OperationCounters.do_sample 
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in 
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in 
apache_beam.coders.stream.OutputStream.__cinit__ 
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has 
no attribute 'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87654e40: 
2017-03-01T01:04:07.744Z: JOB_MESSAGE_ERROR: (11c7c605f7605c5f): Traceback 
(most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 544, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in 
dataflow_worker.executor.ReadOperation.start (dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in 
dataflow_worker.executor.Operation.output (dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in 
dataflow_worker.executor.DoOperation.process (dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in 
apache_beam.runners.common.DoFnRunner.receive 
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in 
apache_beam.runners.common.DoFnRunner.reraise_augmented 
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in 
apache_beam.runners.common.DoFnRunner.process 
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in 
apache_beam.runners.common.DoFnRunner._dofn_invoker 
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in 
apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker 
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in 
apache_beam.runners.common.DoFnRunner._process_outputs 
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in 
dataflow_worker.executor.ConsumerSet.receive (dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in 
dataflow_worker.executor.ConsumerSet.update_counters_start 
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in 
dataflow_worker.opcounters.OperationCounters.update_from 
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in 
dataflow_worker.opcounters.OperationCounters.do_sample 
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in 
apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in 
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
 (apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in 
apache_beam.coders.stream.OutputStream.__cinit__ 
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has 
no attribute 'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87654ee1: 
2017-03-01T01:04:07.905Z: JOB_MESSAGE_DEBUG: (1366d4357648e004): Executing 
failure step failure12
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87654ee3: 
2017-03-01T01:04:07.907Z: JOB_MESSAGE_ERROR: (1366d4357648e8d6): Workflow 
failed. Causes: (1366d4357648e58e): 
S03:compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write
 failed.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87654f1d: 
2017-03-01T01:04:07.965Z: JOB_MESSAGE_DETAILED: (f1e3853bdcb95239): Cleaning up.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87654f22: 
2017-03-01T01:04:07.970Z: JOB_MESSAGE_DEBUG: (f1e3853bdcb959a7): Starting 
worker pool teardown.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a87654f24: 
2017-03-01T01:04:07.972Z: JOB_MESSAGE_BASIC: (f1e3853bdcb95115): Stopping 
worker pool...
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a8766609a: 
2017-03-01T01:05:17.978Z: JOB_MESSAGE_BASIC: (f1e3853bdcb95c3a): Worker pool 
stopped.
root: INFO: 2017-02-28_17_00_06-11812911643974787378_0000015a876660ba: 
2017-03-01T01:05:18.010Z: JOB_MESSAGE_DEBUG: (f1e3853bdcb95284): Tearing down 
pending resources...
root: INFO: Job 2017-02-28_17_00_06-11812911643974787378 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 14 tests in 4123.313s

FAILED (errors=13)
Build step 'Execute shell' marked build as failure
