See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/228/display/redirect?page=changes>

Changes:

[github] [BEAM-4752] Add support for newer dill dependency (#5931)

------------------------------------------
[...truncated 215.76 KB...]

        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:55)
        at com.google.cloud.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:274)
        at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:83)
        at com.google.cloud.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:101)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:393)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:362)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
        at com.google.cloud.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:179)
        at com.google.cloud.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:107)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -62: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 234, in process_bundle
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 230, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 272, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 255, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 254, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 255, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 254, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 258, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 362, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 499, in create
    serialized_fn, parameter.side_inputs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 537, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 238, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1096, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 423, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/local/lib/python2.7/pickle.py", line 1130, in find_class
    __import__(module)
ImportError: No module named _dill

        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:142)
        at io.grpc.stub.ServerCalls$2$1.onMessage(ServerCalls.java:204)
        at io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:48)
        at io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:91)
        at io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messageRead(ServerCallImpl.java:242)
        at io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1.runInContext(ServerImpl.java:568)
        at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:52)
        at io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:152)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
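The repeated `ImportError: No module named _dill` above is the root cause: Beam's `pickler.loads` delegates to `dill.loads`, and pickle records classes by module and name rather than by value, so a payload serialized against one dill layout fails when the unpickler's `find_class` calls `__import__(module)` on a worker whose installed dill lacks that module. This matches the change under test ([BEAM-4752], newer dill support). A stdlib-only sketch of this failure mode, using a made-up `fake_dill` module to stand in for the dill internals missing in the container (an illustration of the mechanism, not Beam's actual code path):

```python
import pickle
import sys
import types

# Build a throwaway module and register a class in it, mimicking a
# library module that exists where the object is pickled (the
# submission side) but not where it is unpickled (the worker).
mod = types.ModuleType("fake_dill")

class Payload(object):
    pass

# Make pickle record the class as fake_dill.Payload (by reference).
Payload.__module__ = "fake_dill"
Payload.__qualname__ = "Payload"
mod.Payload = Payload
sys.modules["fake_dill"] = mod

blob = pickle.dumps(Payload())  # stores "fake_dill" + "Payload", not the code

# Simulate the worker side: the module is not installed there.
del sys.modules["fake_dill"]

err = None
try:
    pickle.loads(blob)  # find_class() runs __import__("fake_dill") and fails
except ImportError as exc:
    err = exc

print("unpickle failed with:", err)
```

Pinning the same dill version in both the submission environment and the worker container avoids this class of error.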

root: INFO: 2018-07-13T18:11:18.758Z: JOB_MESSAGE_ERROR: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -83: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 234, in process_bundle
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 230, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 272, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 255, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 254, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 255, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 254, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 258, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 362, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 499, in create
    serialized_fn, parameter.side_inputs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 537, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 238, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1096, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 423, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/local/lib/python2.7/pickle.py", line 1130, in find_class
    __import__(module)
ImportError: No module named _dill

        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:55)
        at com.google.cloud.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:274)
        at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:83)
        at com.google.cloud.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:101)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:393)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:362)
        at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:290)
        at com.google.cloud.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:179)
        at com.google.cloud.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:107)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -83: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 234, in process_bundle
    self.data_channel_factory)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 230, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 272, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 255, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 254, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 255, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 254, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 207, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 258, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 362, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 499, in create
    serialized_fn, parameter.side_inputs)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 537, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/internal/pickler.py", line 238, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/local/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/local/lib/python2.7/pickle.py", line 1096, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/site-packages/dill/dill.py", line 423, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/local/lib/python2.7/pickle.py", line 1130, in find_class
    __import__(module)
ImportError: No module named _dill

        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:158)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:142)
        at io.grpc.stub.ServerCalls$2$1.onMessage(ServerCalls.java:204)
        at io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:48)
        at io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:91)
        at io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messageRead(ServerCallImpl.java:242)
        at io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1.runInContext(ServerImpl.java:568)
        at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:52)
        at io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:152)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

root: INFO: 2018-07-13T18:11:18.831Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
root: INFO: 2018-07-13T18:11:18.876Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+split+pair_with_one+group_and_sum/GroupByKey/Reify+group_and_sum/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on:
  beamapp-jenkins-071318095-07131109-lzib-harness-rg86,
  beamapp-jenkins-071318095-07131109-lzib-harness-rg86,
  beamapp-jenkins-071318095-07131109-lzib-harness-rg86,
  beamapp-jenkins-071318095-07131109-lzib-harness-rg86
root: INFO: 2018-07-13T18:11:19.020Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-07-13T18:11:19.093Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-07-13T18:11:19.134Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-07-13T18:12:43.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
root: INFO: 2018-07-13T18:12:43.320Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2018-07-13T18:12:43.394Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-07-13_11_09_53-16174310987713656279 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 183.976s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_09_53-16174310987713656279?project=apache-beam-testing.
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20180713-180021
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:724bea7d48fa37d04893facddff2ab799bf5fe2c0d3a4cde879b4c0ddb6c4853
Deleted: sha256:90bb8cf46b4ebaf019c3c95dc324e56b4739f1150520cba88d8a80f9315f7506
Deleted: sha256:dbd00f8d6b698305f76ad97f8c80bb5943ae77ccd262ada13fc28a8ff231f2cd
Deleted: sha256:f72dbd86ba71110cca00dd8737043c97dc643315cb8c483dc7060735506aaba9
Deleted: sha256:6d75e9d5ad72634a90381c632cda4485372e546f1903402cbd9a8d93cd933f04
Deleted: sha256:650c07a081a9b86704bf11c6494f85177f73a6909bae51c388e00e6ee1d697ed
Deleted: sha256:98238d6b5fac168fbe959f39ead3e0220f3784c7a91ce81b03e60839ea44f187
Deleted: sha256:9877022cc7a395bfd938ce0e07d30f42aee239e16f3d0332ae98a40875c0e1ff
Deleted: sha256:3d5103ec888af7a4e3ad912bfe82eb8231504191df9db9537163ad2954240c70
Deleted: sha256:0733186ebb9ab3f5ec287a54452cc831f611a9ee509301e97b48523565babad3
Deleted: sha256:52267b913709f213a6da6a77242ac0f2195d71f69107badfd7e8ce3a60c9bf2b
Deleted: sha256:211c4edb744be588415e3f9510cffe805f63a29e96d993b56f9beb108f6acc84
Deleted: sha256:1aebcf310c6ffef727fa70c425026ed1968ac646c95a209d141f513aa4ae95eb
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:724bea7d48fa37d04893facddff2ab799bf5fe2c0d3a4cde879b4c0ddb6c4853
  Associated tags:
 - 20180713-180021
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20180713-180021
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20180713-180021].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:724bea7d48fa37d04893facddff2ab799bf5fe2c0d3a4cde879b4c0ddb6c4853].
Removed the container
Build step 'Execute shell' marked build as failure
