See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/4875/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8658] [BEAM-8781] Optionally set jar and artifact staging port in

[kcweaver] Pass artifact port to FlinkJarJobServer as well.

[kcweaver] Move FlinkRunnerOptions to pipeline_options.

[kcweaver] [BEAM-8796] Optionally configure static ports for job and expansion

[robertwb] [BEAM-8802] Don't clear watermark hold when adding elements.

[lukecwik] [BEAM-7948] Add time-based cache threshold support in the Java data s…


------------------------------------------
[...truncated 601.67 KB...]
Searching for decorator
Reading https://pypi.org/simple/decorator/
Downloading https://files.pythonhosted.org/packages/8f/b7/f329cfdc75f3d28d12c65980e4469e2fa373f1953f5df6e370e84ea2e875/decorator-4.4.1-py2.py3-none-any.whl#sha256=5d19b92a3c8f7f101c8dd86afd86b0f061a8ce4540ab8cd401fa2542756bce6d
Best match: decorator 4.4.1
Processing decorator-4.4.1-py2.py3-none-any.whl
Installing decorator-4.4.1-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/decorator-4.4.1-py2.7.egg>
Searching for backports.shutil-get-terminal-size
Reading https://pypi.org/simple/backports.shutil-get-terminal-size/
Downloading https://files.pythonhosted.org/packages/7d/cd/1750d6c35fe86d35f8562091737907f234b78fdffab42b29c72b1dd861f4/backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl#sha256=0975ba55054c15e346944b38956a4c9cbee9009391e41b86c68990effb8c1f64
Best match: backports.shutil-get-terminal-size 1.0.0
Processing backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl
Installing backports.shutil_get_terminal_size-1.0.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/backports.shutil_get_terminal_size-1.0.0-py2.7.egg>
Searching for ipython-genutils
Reading https://pypi.org/simple/ipython-genutils/
Downloading https://files.pythonhosted.org/packages/fa/bc/9bd3b5c2b4774d5f33b2d544f1460be9df7df2fe42f352135381c347c69a/ipython_genutils-0.2.0-py2.py3-none-any.whl#sha256=72dd37233799e619666c9f639a9da83c34013a73e8bbc79a7a6348d93c61fab8
Best match: ipython-genutils 0.2.0
Processing ipython_genutils-0.2.0-py2.py3-none-any.whl
Installing ipython_genutils-0.2.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/ipython_genutils-0.2.0-py2.7.egg>
Searching for ptyprocess>=0.5
Reading https://pypi.org/simple/ptyprocess/
Downloading https://files.pythonhosted.org/packages/d1/29/605c2cc68a9992d18dada28206eeada56ea4bd07a239669da41674648b6f/ptyprocess-0.6.0-py2.py3-none-any.whl#sha256=d7cc528d76e76342423ca640335bd3633420dc1366f258cb31d05e865ef5ca1f
Best match: ptyprocess 0.6.0
Processing ptyprocess-0.6.0-py2.py3-none-any.whl
Installing ptyprocess-0.6.0-py2.py3-none-any.whl to <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs>
Installed <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/.eggs/ptyprocess-0.6.0-py2.7.egg>
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.io.gcp.datastore.v1.datastoreio"
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
WARNING:apache_beam.runners.interactive.interactive_environment:Interactive Beam requires Python 3.5.3+.
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING:apache_beam.runners.interactive.interactive_environment:You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:517: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for spec in yaml.load_all(open(transcript_filename)):
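
A minimal sketch of the non-deprecated form of that call, assuming the transcript file holds only plain YAML documents (the filename below is a placeholder for illustration, not the path used by trigger_test.py):

  import yaml

  transcript_filename = 'transcript.yaml'  # placeholder path, for illustration only

  # Passing an explicit Loader avoids the unsafe default and the YAMLLoadWarning;
  # SafeLoader constructs only plain Python types (dicts, lists, strings, ...).
  with open(transcript_filename) as f:
      for spec in yaml.load_all(f, Loader=yaml.SafeLoader):
          print(spec)  # process each transcript document as before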
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
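
As a sketch of what this warning asks for, --region can be pinned explicitly in the pipeline options instead of relying on the us-central1 default; the project and bucket names below are placeholders, not values from this build:

  from apache_beam.options.pipeline_options import PipelineOptions

  # Setting --region explicitly silences the pipeline_options warning.
  options = PipelineOptions([
      '--runner=DataflowRunner',
      '--project=my-gcp-project',            # placeholder
      '--region=us-central1',                # chosen explicitly, not defaulted
      '--temp_location=gs://my-bucket/tmp',  # placeholder
  ])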
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py>", line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py>", line 85, in _run_wordcount_it
    save_main_session=False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 117, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 66, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1442, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -213: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-187'

        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:330)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
        at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
        at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
        Suppressed: java.lang.IllegalStateException: Already closed.
                at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:93)
                at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:220)
                at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
                ... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -213: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-187'

        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-22_12_37_34-17614507885784414116?project=apache-beam-testing

--------------------- >> end captured stdout << ----------------------

======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py>", line 56, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py>", line 44, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py>", line 180, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 66, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1442, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -491: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-481'

        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:330)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
        at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
        at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
        Suppressed: java.lang.IllegalStateException: Already closed.
                at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:93)
                at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:220)
                at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
                ... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -491: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-481'

        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-22_12_42_18-11058187271518768742?project=apache-beam-testing

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: nosetests-python2.7_sdk.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 600.613s

FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191122-202551
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:cf9660be5279cdc859d67640fdffbb3e1950392f7b8c054ad101394f1c86e52f
Deleted: sha256:7d9a2df67539f3741ddd393bcc295999cb280bda6d77de198330036191531a85
Deleted: sha256:be6aed257a6d75aa90c834a3a3fed0f84907c5a2fecd56a60cd02716b49758bc
Deleted: sha256:10942180c41ff8dc324ce4a307103b37ce15cd8da8a5b311b68bc21814ca5d04
Deleted: sha256:dd337528325f5e32d584ac32f6c0f125dd4ba613952fe29efa93373d7f640763
Deleted: sha256:89ab0a3bbfc95cdfbc308a377bf76f0009cd79d9d6a0fe23d0fd02dd36c38607
Deleted: sha256:314293f83635f25bfedea3018f0fda8135b40719116d7e15b2596389f4475b24
Deleted: sha256:094993439b296a9131db859f05953239d320067411f92c2fe97c29eb6661803e
Deleted: sha256:fa9864f016c02409e4e6596d249cd39b685ef051a678783b6da0ef2cd5446f41
Deleted: sha256:8f5fa3000e57b73423f6da19bee7d92f80e8e00643f11d0d91f9dfcc6f50766f
Deleted: sha256:2944c19ee6b97719581811c3ec72f2db51a7d9c4c0f0d1a9f9fef761f2349c1c
Deleted: sha256:6f58cd2d4190825fb410e43ceea3055af40c8186e62f21b5d702a559384873da
Deleted: sha256:d7f0582f6471445aa1995e563c0e334718b8b6e435f9bf94693d48bacfb9fe02
Deleted: sha256:6561b38fa1b2d5da7486ce0b54e91e99c396dcaf3ac75f1180feb7f2f688c1b9
Deleted: sha256:63cb27f5e802f64726f33ca5cd85e9f61f286fe5dc4e8dafbc9e0f961949555d
Deleted: sha256:8f2c976c8630a6b7c13bcbcb50206a01e2f8da062b8b0bb6595c576f711c1e66
Deleted: sha256:e7d9e042d61f4dab851a183836468f2decd36b33d1cab482b1ecfcf4c23d1e90
Deleted: sha256:b9e93404f1d5706aac3f00ddac6e5dc5b2ea7a31fa1ad45f891c421653a66d58
Deleted: sha256:b238a4d002ff1eda200823fe7f48b5b562b3f8ecb283b7094a7faf835c28071a
Deleted: sha256:ba77f2b2575472b9e232040425b9b014cf368bb72e646cad0df842114f9cca6a
Deleted: sha256:59cc7d9df9fa0425fe7853e64fd87685f75558053c8b06391a115654d7c8a678
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:cf9660be5279cdc859d67640fdffbb3e1950392f7b8c054ad101394f1c86e52f
  Associated tags:
  - 20191122-202551
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191122-202551
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191122-202551].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:cf9660be5279cdc859d67640fdffbb3e1950392f7b8c054ad101394f1c86e52f].
Removed the container
Build step 'Execute shell' marked build as failure
