See <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/3714/display/redirect?page=changes>
Changes:
[cyturel] [BEAM-7683] - fix withQueryFn when split is more than 0
------------------------------------------
[...truncated 207.57 KB...]
copying apache_beam/utils/retry_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/timestamp_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/urns.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.pxd -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value.py -> apache-beam-2.15.0.dev0/apache_beam/utils
copying apache_beam/utils/windowed_value_test.py -> apache-beam-2.15.0.dev0/apache_beam/utils
Writing apache-beam-2.15.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'apache-beam-2.15.0.dev0' (and everything under it)
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
find dist/apache-beam-*.tar.gz
# Run ValidatesRunner tests on Google Cloud Dataflow service
echo ">>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST"
>>> RUNNING DATAFLOW RUNNER VALIDATESCONTAINER TEST
python setup.py nosetests \
--attr ValidatesContainer \
--nologcapture \
--processes=1 \
--process-timeout=900 \
--test-pipeline-options=" \
--runner=TestDataflowRunner \
--project=$PROJECT \
--worker_harness_container_image=$CONTAINER:$TAG \
--staging_location=$GCS_LOCATION/staging-validatesrunner-test \
--temp_location=$GCS_LOCATION/temp-validatesrunner-test \
--output=$GCS_LOCATION/output \
--sdk_location=$SDK_LOCATION \
--num_workers=1"
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/container/venv/python/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.15.0.dev' to '2.15.0.dev0'
normalized_version,
running nosetests
running egg_info
<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/gen_protos.py>:57: UserWarning: Installing grpcio-tools is recommended for development.
warnings.warn('Installing grpcio-tools is recommended for development.')
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
WARNING:root:Using deprecated Datastore client. This client will be removed in Beam 3.0 (next Beam major release). Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
WARNING:root:python-snappy is not installed; some tests will be skipped.
WARNING:root:Tensorflow is not installed, so skipping some tests.
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
WARNING:root:Discarding unparseable args: ['--output=gs://temp-storage-for-end-to-end-tests/output']
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
======================================================================
ERROR: test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py>", line 52, in test_wordcount_fnapi_it
    self._run_wordcount_it(wordcount.run, experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py>", line 84, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
Suppressed: java.lang.IllegalStateException: Already closed.
at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -366: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-344\x12\x04-342'
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_26_48-15938709806256550563?project=apache-beam-testing.
--------------------- >> end captured stdout << ----------------------
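
This error and the one reported below fail the same way: the SDK harness raises a KeyError at bundle_processor.py line 593 when it looks up the operation registered for an incoming data stream. A rough sketch of that lookup, assuming only the structure visible in the traceback (the names below are illustrative, not the worker's actual ones):

    # Illustrative sketch of the failing lookup in process_bundle; assumed
    # structure based on the traceback, not the real bundle_processor.py.
    input_ops_by_transform_id = {}  # hypothetical registry filled when the bundle is registered

    def process_data(data):
        # data.ptransform_id is the opaque id seen in the KeyError, e.g.
        # u'\n\x04-344\x12\x04-342'; if the data stream references an id that
        # was never registered, the lookup below raises KeyError.
        input_ops_by_transform_id[data.ptransform_id].process_encoded(data.data)
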
======================================================================
ERROR: test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py>", line 60, in test_metrics_fnapi_it
    result = self.run_pipeline(experiment='beam_fn_api')
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py>", line 48, in run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py>", line 197, in apply_and_run
    result = pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/pipeline.py>", line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 64, in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1338, in wait_until_finish
    (self.state, getattr(self._runner, 'last_error_msg', None)), self)
DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:285)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
at org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:412)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:381)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
Suppressed: java.lang.IllegalStateException: Already closed.
at org.apache.beam.sdk.fn.data.BeamFnDataBufferingOutboundObserver.close(BeamFnDataBufferingOutboundObserver.java:95)
at org.apache.beam.runners.dataflow.worker.fn.data.RemoteGrpcPortWriteOperation.abort(RemoteGrpcPortWriteOperation.java:215)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:91)
... 6 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -129: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 157, in _execute
    response = task()
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 342, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 368, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 593, in process_bundle
    data.ptransform_id].process_encoded(data.data)
KeyError: u'\n\x04-107\x12\x04-105'
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1p13p1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
-------------------- >> begin captured stdout << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_02_33_00-10256505019860755171?project=apache-beam-testing.
--------------------- >> end captured stdout << ----------------------
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 793.960s
FAILED (errors=2)
cleanup_container
Untagged: us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914
Untagged: us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a
Deleted: sha256:ed0446abf010b5dfb4035282a41d1b0e3938d881f44e5ea52e4ff38479de08f5
Deleted: sha256:358837b7c8131a4e0fc256da3c1172d5df8c96582f921580c24e535931b2aa4b
Deleted: sha256:a427a3ba45224aa880c139ca05ffb8178e22957d8c5f2854b1603d3ef4fec0a2
Deleted: sha256:013a877a236bb64c30270a9386a15947571d4b06be265b5f3b063669542048ae
Deleted: sha256:16f19032bf88cc67b4c7161a80949224e896b37f11dbfe6ffa801c7b30d73c87
Deleted: sha256:595a6ea07b8712f4a61627bdb090b00b52cfb1e3f10e62eb04e4c0752e612835
Deleted: sha256:d1444692228516f4454659bc0e8c4694ac27600f43adc4347e3026754c1ab22e
Deleted: sha256:47e711fab358383c37c26850df37cf3222ed38e6c93c012a9c24d5bbfaa5f926
Deleted: sha256:b4b2c9baedb6b5c96af2254ab1bda17b469e7d84daca151ac4ec9717f06b9614
Deleted: sha256:c220f87ecc0eb530bbb3fb08ffd16c8a94c9d3dd9e5f6a7749c3fcc7892617d4
Deleted: sha256:640a6d5419fc32fdce2381a7dace620cf76cfe8c384251b67a81a3e0a9b4e536
Deleted: sha256:2ed8bf7be26e6ab7a199835b1f925c48280544a628ffed8386b874d60f858df8
Deleted: sha256:f7e60c6b424ef14a8adcc19e84b39f8a19392db0130f06dec0d426ee6da124f1
Deleted: sha256:a0c262d3183cfdc2b442ad1e8e05af479eaf08455639deb563481626ce7c3ce2
Digests:
- us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a
Associated tags:
- 20190704-091914
Tags:
- us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914
Deleted [us.gcr.io/apache-beam-testing/jenkins/python:20190704-091914].
Deleted [us.gcr.io/apache-beam-testing/jenkins/python@sha256:51b39b7e448711fd248d98f2a8afd9abba8a5ffbdb710caa0ffd034aebed6a2a].
Removed the container
Build step 'Execute shell' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]