See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1076/>

------------------------------------------
[...truncated 8345 lines...]
                    {
                      "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s16"
        }, 
        "serialized_fn": "<string of 1212 bytes>", 
        "user_name": "assert:even/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 id: u'2017-01-23_19_18_55-6733368171544728562'
 projectId: u'apache-beam-testing'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2017-01-23_19_18_55-6733368171544728562]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-01-23_19_18_55-6733368171544728562
root: INFO: Job 2017-01-23_19_18_55-6733368171544728562 is in state 
JOB_STATE_RUNNING
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bceef: 
2017-01-24T03:18:57.519Z: JOB_MESSAGE_DETAILED: (e11bab53777fc3a3): Checking 
required Cloud APIs are enabled.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd4b2: 
2017-01-24T03:18:58.994Z: JOB_MESSAGE_DEBUG: (e11bab53777fc1b2): Combiner 
lifting skipped for step assert_that/Group: GroupByKey not followed by a 
combiner.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd4b4: 
2017-01-24T03:18:58.996Z: JOB_MESSAGE_DEBUG: (e11bab53777fc860): Combiner 
lifting skipped for step assert:even/Group: GroupByKey not followed by a 
combiner.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd4b6: 
2017-01-24T03:18:58.998Z: JOB_MESSAGE_DEBUG: (e11bab53777fcf0e): Combiner 
lifting skipped for step assert:odd/Group: GroupByKey not followed by a 
combiner.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd4c0: 
2017-01-24T03:18:59.008Z: JOB_MESSAGE_DETAILED: (e11bab53777fc5bc): Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd4d0: 
2017-01-24T03:18:59.024Z: JOB_MESSAGE_DETAILED: (e11bab53777fcc6a): Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd4eb: 
2017-01-24T03:18:59.051Z: JOB_MESSAGE_DETAILED: (e11bab53777fc722): Annotating 
graph with Autotuner information.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd51f: 
2017-01-24T03:18:59.103Z: JOB_MESSAGE_DETAILED: (e11bab53777fc7d5): Fusing 
adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd524: 
2017-01-24T03:18:59.108Z: JOB_MESSAGE_DETAILED: (e11bab53777fc531): Fusing 
consumer assert:odd/ToVoidKey into assert:odd/WindowInto
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd527: 
2017-01-24T03:18:59.111Z: JOB_MESSAGE_DETAILED: (e11bab53777fcbdf): Fusing 
consumer assert:odd/UnKey into assert:odd/Group/GroupByWindow
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd529: 
2017-01-24T03:18:59.113Z: JOB_MESSAGE_DETAILED: (e11bab53777fc28d): Fusing 
consumer assert:even/UnKey into assert:even/Group/GroupByWindow
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd52b: 
2017-01-24T03:18:59.115Z: JOB_MESSAGE_DETAILED: (e11bab53777fc93b): Fusing 
consumer assert:even/Group/GroupByWindow into assert:even/Group/Read
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd52e: 
2017-01-24T03:18:59.118Z: JOB_MESSAGE_DETAILED: (e11bab53777fcfe9): Fusing 
consumer assert_that/Match into assert_that/UnKey
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd530: 
2017-01-24T03:18:59.120Z: JOB_MESSAGE_DETAILED: (e11bab53777fc697): Fusing 
consumer assert_that/UnKey into assert_that/Group/GroupByWindow
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd533: 
2017-01-24T03:18:59.123Z: JOB_MESSAGE_DETAILED: (e11bab53777fcd45): Fusing 
consumer assert_that/Group/GroupByWindow into assert_that/Group/Read
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd535: 
2017-01-24T03:18:59.125Z: JOB_MESSAGE_DETAILED: (e11bab53777fc3f3): Fusing 
consumer assert_that/Group/Write into assert_that/Group/Reify
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd538: 
2017-01-24T03:18:59.128Z: JOB_MESSAGE_DETAILED: (e11bab53777fcaa1): Fusing 
consumer assert_that/Group/Reify into assert_that/ToVoidKey
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd53a: 
2017-01-24T03:18:59.130Z: JOB_MESSAGE_DETAILED: (e11bab53777fc14f): Fusing 
consumer assert_that/ToVoidKey into assert_that/WindowInto
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd53d: 
2017-01-24T03:18:59.133Z: JOB_MESSAGE_DETAILED: (e11bab53777fc7fd): Fusing 
consumer assert:odd/Group/GroupByWindow into assert:odd/Group/Read
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd53f: 
2017-01-24T03:18:59.135Z: JOB_MESSAGE_DETAILED: (e11bab53777fceab): Fusing 
consumer assert:even/Group/Write into assert:even/Group/Reify
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd542: 
2017-01-24T03:18:59.138Z: JOB_MESSAGE_DETAILED: (e11bab53777fc559): Fusing 
consumer assert:even/Match into assert:even/UnKey
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd544: 
2017-01-24T03:18:59.140Z: JOB_MESSAGE_DETAILED: (e11bab53777fcc07): Fusing 
consumer assert:even/Group/Reify into assert:even/ToVoidKey
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd547: 
2017-01-24T03:18:59.143Z: JOB_MESSAGE_DETAILED: (e11bab53777fc2b5): Fusing 
consumer assert:odd/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd549: 
2017-01-24T03:18:59.145Z: JOB_MESSAGE_DETAILED: (e11bab53777fc963): Fusing 
consumer assert:odd/Group/Write into assert:odd/Group/Reify
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd54b: 
2017-01-24T03:18:59.147Z: JOB_MESSAGE_DETAILED: (e11bab53777fc011): Fusing 
consumer assert:even/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd54e: 
2017-01-24T03:18:59.150Z: JOB_MESSAGE_DETAILED: (e11bab53777fc6bf): Fusing 
consumer assert:even/ToVoidKey into assert:even/WindowInto
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd551: 
2017-01-24T03:18:59.153Z: JOB_MESSAGE_DETAILED: (e11bab53777fcd6d): Fusing 
consumer assert_that/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd553: 
2017-01-24T03:18:59.155Z: JOB_MESSAGE_DETAILED: (e11bab53777fc41b): Fusing 
consumer assert:odd/Match into assert:odd/UnKey
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd555: 
2017-01-24T03:18:59.157Z: JOB_MESSAGE_DETAILED: (e11bab53777fcac9): Fusing 
consumer assert:odd/Group/Reify into assert:odd/ToVoidKey
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd5ea: 
2017-01-24T03:18:59.306Z: JOB_MESSAGE_DEBUG: (e11bab53777fc89d): Workflow 
config is missing a default resource spec.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd5f9: 
2017-01-24T03:18:59.321Z: JOB_MESSAGE_DETAILED: (e11bab53777fcf4b): Adding 
StepResource setup and teardown to workflow graph.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd649: 
2017-01-24T03:18:59.401Z: JOB_MESSAGE_DEBUG: (969c07ebca9e3abd): Adding 
workflow start and stop steps.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd68b: 
2017-01-24T03:18:59.467Z: JOB_MESSAGE_DEBUG: (5f455b8a2539c32a): Assigning 
stage ids.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd6ea: 
2017-01-24T03:18:59.562Z: JOB_MESSAGE_DEBUG: (5f455b8a2539ce5b): Executing wait 
step start2
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd6f5: 
2017-01-24T03:18:59.573Z: JOB_MESSAGE_DEBUG: (5f455b8a2539c74f): Executing 
operation Some Numbers
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd701: 
2017-01-24T03:18:59.585Z: JOB_MESSAGE_DEBUG: (306d5741977075a): Value "Some 
Numbers.out" materialized.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd70b: 
2017-01-24T03:18:59.595Z: JOB_MESSAGE_BASIC: S01: (9e50051901c7956e): Executing 
operation assert:odd/Group/Create
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd70d: 
2017-01-24T03:18:59.597Z: JOB_MESSAGE_BASIC: S03: (1d63a3bedcd2bbda): Executing 
operation assert_that/Group/Create
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd714: 
2017-01-24T03:18:59.604Z: JOB_MESSAGE_BASIC: S02: (969c07ebca9e3a81): Executing 
operation assert:even/Group/Create
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd7d7: 
2017-01-24T03:18:59.799Z: JOB_MESSAGE_DEBUG: (6d98c9c703d1e811): Starting 
worker pool setup.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd7d9: 
2017-01-24T03:18:59.801Z: JOB_MESSAGE_BASIC: (6d98c9c703d1e0d7): Starting 1 
workers...
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd7ed: 
2017-01-24T03:18:59.821Z: JOB_MESSAGE_DEBUG: (237b0974c1c0d60e): Value 
"assert:odd/Group/Session" materialized.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd811: 
2017-01-24T03:18:59.857Z: JOB_MESSAGE_DEBUG: (20a78c1d17658ea5): Value 
"assert_that/Group/Session" materialized.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd813: 
2017-01-24T03:18:59.859Z: JOB_MESSAGE_DEBUG: (3450a6e186be988b): Value 
"assert:even/Group/Session" materialized.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7bd81f: 
2017-01-24T03:18:59.871Z: JOB_MESSAGE_BASIC: S04: (20a78c1d17658e73): Executing 
operation 
ClassifyNumbers/ClassifyNumbers+assert:odd/WindowInto+assert:odd/ToVoidKey+assert:even/WindowInto+assert:even/ToVoidKey+assert:even/Group/Reify+assert:even/Group/Write+assert_that/WindowInto+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write+assert:odd/Group/Reify+assert:odd/Group/Write
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7c9807: 
2017-01-24T03:19:48.999Z: JOB_MESSAGE_DETAILED: (61f94197025dddb8): Workers 
have started successfully.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e41a7: 
2017-01-24T03:21:37.959Z: JOB_MESSAGE_ERROR: (65c712345d54e3cf): Traceback 
(most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 514, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 899, in 
dataflow_worker.executor.MapTaskExecutor.execute 
(dataflow_worker/executor.c:26452)
    op.start()
  File "dataflow_worker/executor.py", line 464, in 
dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:15269)
    def start(self):
  File "dataflow_worker/executor.py", line 469, in 
dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14434)
    pickler.loads(self.spec.serialized_fn))
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 
212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 726, in 
_import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py",
 line 26, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest
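The mechanism behind this traceback: the worker calls `pickler.loads` on the serialized fn, and unpickling an object that was pickled by reference re-imports its defining module (`ptransform_test` here), whose top-level `import hamcrest as hc` fails because PyHamcrest is not installed on the worker. A minimal sketch of that behaviour with the stdlib pickler (the worker uses dill, but the import-on-load behaviour is the same; the module name below is deliberately fictitious):

```python
import pickle

# Hand-built pickle whose GLOBAL opcode references a module that does
# not exist in this environment. Loading it triggers an import, which
# fails the same way the Dataflow worker did above.
missing = b'\x80\x02cno_such_module_xyz\nsome_func\nq\x00.'

try:
    pickle.loads(missing)
    import_failed = False
except ImportError:  # ModuleNotFoundError on Python 3
    import_failed = True

# By contrast, a reference to an always-importable object round-trips:
roundtrip_ok = pickle.loads(pickle.dumps(len)) is len
```

So the pipeline graph itself was fine; the job only fell over once a worker tried to materialize the pickled fn and hit the missing dependency.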

root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e427b: 
2017-01-24T03:21:38.171Z: JOB_MESSAGE_ERROR: (65c712345d54e8a4): Traceback 
(most recent call last):
  [traceback identical to the first JOB_MESSAGE_ERROR above: ImportError: No module named hamcrest]

root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e4304: 
2017-01-24T03:21:38.308Z: JOB_MESSAGE_ERROR: (65c712345d54e3be): Traceback 
(most recent call last):
  [traceback identical to the first JOB_MESSAGE_ERROR above: ImportError: No module named hamcrest]

root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e437c: 
2017-01-24T03:21:38.428Z: JOB_MESSAGE_ERROR: (65c712345d54eed8): Traceback 
(most recent call last):
  [traceback identical to the first JOB_MESSAGE_ERROR above: ImportError: No module named hamcrest]

root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e4402: 
2017-01-24T03:21:38.562Z: JOB_MESSAGE_ERROR: (65c712345d54e9f2): Traceback 
(most recent call last):
  [traceback identical to the first JOB_MESSAGE_ERROR above: ImportError: No module named hamcrest]

root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e44b7: 
2017-01-24T03:21:38.743Z: JOB_MESSAGE_ERROR: (65c712345d54eec7): Traceback 
(most recent call last):
  [traceback identical to the first JOB_MESSAGE_ERROR above: ImportError: No module named hamcrest]

root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e44e4: 
2017-01-24T03:21:38.788Z: JOB_MESSAGE_DEBUG: (66893aba8d70245e): Executing 
failure step failure1
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e44e7: 
2017-01-24T03:21:38.791Z: JOB_MESSAGE_ERROR: (66893aba8d702dd0): Workflow 
failed. Causes: (20a78c1d1765835c): 
S04:ClassifyNumbers/ClassifyNumbers+assert:odd/WindowInto+assert:odd/ToVoidKey+assert:even/WindowInto+assert:even/ToVoidKey+assert:even/Group/Reify+assert:even/Group/Write+assert_that/WindowInto+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write+assert:odd/Group/Reify+assert:odd/Group/Write
 failed.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e4525: 
2017-01-24T03:21:38.853Z: JOB_MESSAGE_DETAILED: (3450a6e186be9e26): Cleaning up.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e45a2: 
2017-01-24T03:21:38.978Z: JOB_MESSAGE_DEBUG: (3450a6e186be9fc0): Starting 
worker pool teardown.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7e45a4: 
2017-01-24T03:21:38.980Z: JOB_MESSAGE_BASIC: (3450a6e186be915a): Stopping 
worker pool...
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7f7e2c: 
2017-01-24T03:22:58.988Z: JOB_MESSAGE_BASIC: (3450a6e186be93c1): Worker pool 
stopped.
root: INFO: 2017-01-23_19_18_55-6733368171544728562_00000159ce7f8239: 
2017-01-24T03:23:00.025Z: JOB_MESSAGE_DEBUG: (3450a6e186be988f): Tearing down 
pending resources...
root: INFO: Job 2017-01-23_19_18_55-6733368171544728562 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 14 tests in 1050.523s

FAILED (errors=4)
Build step 'Execute shell' marked build as failure
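The immediate cause of all four errors is PyHamcrest being absent from the worker environment while `ptransform_test.py` imports it unconditionally at module level. One common mitigation (a sketch only, not necessarily how the Beam test suite resolves this; the names `HAVE_HAMCREST` and `MatcherBasedTest` are illustrative) is to guard the optional test dependency so merely importing the module cannot fail, alongside ensuring the package is actually installed or staged to the workers:

```python
import unittest

# Guard the optional dependency: importing this test module no longer
# raises if PyHamcrest is missing; the affected tests are skipped instead.
try:
    import hamcrest as hc
    HAVE_HAMCREST = True
except ImportError:
    hc = None
    HAVE_HAMCREST = False

@unittest.skipIf(not HAVE_HAMCREST, 'PyHamcrest is not installed')
class MatcherBasedTest(unittest.TestCase):
    def test_uses_hamcrest(self):
        hc.assert_that(2 + 2, hc.equal_to(4))
```

This only protects the import; the worker environment still needs the package installed (or staged via the SDK's dependency-staging options) for the tests to actually run there.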
