See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/4888/display/redirect?page=changes>

Changes:

[github] [BEAM-8803] BigQuery Streaming Inserts are always retried by default.


------------------------------------------
[...truncated 587.42 KB...]
copying apache_beam/transforms/stats_test.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/timeutil.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/transforms_keyword_only_args_test_py3.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/trigger_test.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/userstate_test.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/util_test.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/window_test.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/transforms/write_ptransform_test.py -> apache-beam-2.18.0.dev0/apache_beam/transforms
copying apache_beam/typehints/__init__.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators_test.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/decorators_test_py3.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/native_type_compatibility_test.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/opcodes.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/schemas.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/schemas_test.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/trivial_inference_test_py3.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typecheck.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typed_pipeline_test_py3.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/typehints/typehints_test_py3.py -> apache-beam-2.18.0.dev0/apache_beam/typehints
copying apache_beam/utils/__init__.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/annotations_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.pxd -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/counters.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/counters_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/plugin.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/processes.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/processes_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/profiler.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/proto_utils.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/retry.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/retry_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/subprocess_server.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/thread_pool_executor.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/thread_pool_executor_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.18.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.18.0.dev0/apache_beam/utils
Writing apache-beam-2.18.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.18.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz

# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
  --attr ValidatesContainer \
  --nologcapture \
  --processes=1 \
  --process-timeout=900 \
  --with-xunitmp \
  --xunitmp-file=$XUNIT_FILE \
  --ignore-files '.*py3\d?\.py$' \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=$PROJECT \
    --worker_harness_container_image=$CONTAINER:$TAG \
    --staging_location=$GCS_LOCATION/staging-validatesrunner-test \
    --temp_location=$GCS_LOCATION/temp-validatesrunner-test \
    --output=$GCS_LOCATION/output \
    --sdk_location=$SDK_LOCATION \
    --num_workers=1"
setup.py:198: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py:476: UserWarning: Normalizing '2.18.0.dev' to '2.18.0.dev0'
  normalized_version,
running nosetests
running egg_info
https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py:58: UserWarning: Installing grpcio-tools is recommended for development.
  warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger "apache_beam.io.gcp.datastore.v1.datastoreio"
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
WARNING:apache_beam.runners.interactive.interactive_environment:Interactive Beam requires Python 3.5.3+.
WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING:apache_beam.runners.interactive.interactive_environment:You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/transforms/trigger_test.py:517: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for spec in yaml.load_all(open(transcript_filename)):
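
The YAMLLoadWarning above already states the remedy: pass an explicit Loader. A minimal sketch of that fix, using a hypothetical transcript file name rather than the trigger_test.py internals:

# Sketch only: 'transcripts.yaml' is a placeholder, not a file from this build.
import yaml

with open('transcripts.yaml') as f:
    # Passing SafeLoader silences the warning and avoids the unsafe default loader.
    for spec in yaml.load_all(f, Loader=yaml.SafeLoader):
        print(spec)
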
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
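
The --region warning appears before each pipeline submission in this run. As a minimal sketch of the explicit setting it asks for (the project id and bucket below are placeholders, not values from this job):

# Sketch only: '--project' and '--temp_location' values are hypothetical.
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=my-gcp-project',
    '--region=us-central1',
    '--temp_location=gs://my-bucket/temp',
])
# The value is exposed through the GoogleCloudOptions view of the options.
assert options.view_as(GoogleCloudOptions).region == 'us-central1'
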
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py", line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py", line 85, in _run_wordcount_it
    save_main_session=False)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py", line 117, in run
    result = p.run()
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py", line 416, in run
    self._options).run(False)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 66, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 1442, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -912: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-894'

        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:330)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
        at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
        at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
        Suppressed: java.lang.IllegalStateException: Already closed.
                at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:93)
                at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:220)
                at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
                ... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -912: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 136, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 154, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 286, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 239, in get
    self.fns[bundle_descriptor_id],
KeyError: u'-894'

        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
        at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

-------------------- >> begin captured stdout << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-24_14_03_56-11495168417214764130?project=apache-beam-testing

--------------------- >> end captured stdout << ----------------------

----------------------------------------------------------------------
XML: nosetests-python2.7_sdk.xml
----------------------------------------------------------------------
XML: https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 2 tests in 685.587s

FAILED (errors=1)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191124-215611
Untagged: us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:f53f040e707358794b9d9007df022faa27b7101a74f9cd75783f6a4ec462149c
Deleted: sha256:18a131f201a8f07c31c9b388061a9dc65b9e2dc1ce3b3bd86597062be6773f68
Deleted: sha256:7362f1b3f051562c2676ea0686f0fbcb26f034d5ca1fba1cffa6cc91cef53191
Deleted: sha256:ad7c9272509daa34ebec13ea6e3fe44569c88f535edb15bb4dbb19ddec7ff318
Deleted: sha256:52e0ded2170b274ab905ae29894a600284ee554e5a4a9abc0f0a1bc6f266cf03
Deleted: sha256:f5c72d0e47692a26d5a49612976f4d2ac886fb9b23dc522bff19f673baea187b
Deleted: sha256:5dbe73e48869a7c58175f3acf22172c07352751e28d64f156d882c1305229bdb
Deleted: sha256:8a7f2dde585e5f31ada9b888404e29a371809bb34438f8e50dbc19bb08e7d771
Deleted: sha256:1f09f87599a0d3354fe17ae9e768bc7478007d39b727fc3b808641b06b247396
Deleted: sha256:9e9a3e873e9da4b926bd1a893c20bacd2bcb636a95ab00902c2df26756ac47a2
Deleted: sha256:6a2f851f841531cb67e93780d4d9b5410f5755dc283776b00b082957c994e7b8
Deleted: sha256:ea202d550129d5ea27a4487e64e5248c8fb47921ef483efd1282985c03a3ab49
Deleted: sha256:e38fc46b37a5aff9411bec00d896bd798deaed004633bab619a2db8553eb485e
Deleted: sha256:ce11a8a5a327d319ab849ea5f8698e09c621fcd7544fe9f16890073602bb51eb
Deleted: sha256:0b2c0a4170214d66b1fa19d108d1a16d84fd290d16839cb4cf9d118166042cb0
Deleted: sha256:e0b38e7eb556fcda6fe96064526965ccd58833934781235f0b5f37ffe8d3bfde
Deleted: sha256:fff95f45329e04f4cd9587d02d00d2ac4c1bb6aea6de9697a1d49ad832081ed6
Deleted: sha256:6046e0c32683702df91b6833018057c8843f09dfd074386ef2808f2ec3b074e0
Deleted: sha256:ae0ee1dd6610f916fc68d346e311f4cdb813ffc5eff9c094f630c45b060d36f8
Deleted: sha256:a5e84d29155b4f999c45d1d1df06f87279f46332f2bcfe0b9770cbc1c577b229
Deleted: sha256:db8d9605b062bea4abdc84c7b902a418ef4007563ec3bc73d22356ad0ea4b1f6
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:f53f040e707358794b9d9007df022faa27b7101a74f9cd75783f6a4ec462149c
  Associated tags:
 - 20191124-215611
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191124-215611
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk:20191124-215611].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python2.7_sdk@sha256:f53f040e707358794b9d9007df022faa27b7101a74f9cd75783f6a4ec462149c].
Removed the container
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
