See
<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1685/display/redirect>
Changes:
------------------------------------------
[...truncated 1.33 MB...]
19/12/04 06:52:25 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:25 INFO sdk_worker.run: SDK Harness waiting for in-flight
requests to complete
19/12/04 06:52:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:25 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:25 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client
hanged up.
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:26 INFO
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService:
GetManifest for __no_artifacts_staged__
19/12/04 06:52:26 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging
client connected.
19/12/04 06:52:26 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:26 INFO sdk_worker_main.start: Status HTTP server running at
localhost:41731
19/12/04 06:52:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:26 WARN sdk_worker_main._load_main_session: No session file
found: /tmp/staged/pickled_main_session. Functions defined in __main__
(interactive session) may fail.
19/12/04 06:52:26 WARN pipeline_options.get_all_options: Discarding unparseable
args:
[u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d',
u'--job_server_timeout=60', u'--pipeline_type_check',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks']
19/12/04 06:52:26 INFO sdk_worker_main.main: Python sdk harness started with
pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type':
u'PROCESS', 'sdk_location': u'container', 'job_name':
u'test_windowing_1575442343.27', 'environment_config': u'{"command":
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',
'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint':
u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:26 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:26 INFO sdk_worker.__init__: Creating insecure control channel
for localhost:45521.
19/12/04 06:52:26 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:26 INFO sdk_worker.__init__: Initializing SDKHarness with
unbounded number of workers.
19/12/04 06:52:26 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 263-1
19/12/04 06:52:26 INFO sdk_worker.create_state_handler: Creating insecure state
channel for localhost:45179.
19/12/04 06:52:26 INFO sdk_worker.create_state_handler: State channel
established.
19/12/04 06:52:26 INFO data_plane.create_data_channel: Creating client data
channel for localhost:40911
19/12/04 06:52:26 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.
19/12/04 06:52:26 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "beam:env:process:v1"
payload:
"\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/12/04 06:52:26 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:26 INFO sdk_worker.run: SDK Harness waiting for in-flight
requests to complete
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:26 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:26 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client
hanged up.
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService:
GetManifest for __no_artifacts_staged__
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging
client connected.
19/12/04 06:52:27 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:27 INFO sdk_worker_main.start: Status HTTP server running at
localhost:39183
19/12/04 06:52:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:27 WARN sdk_worker_main._load_main_session: No session file
found: /tmp/staged/pickled_main_session. Functions defined in __main__
(interactive session) may fail.
19/12/04 06:52:27 WARN pipeline_options.get_all_options: Discarding unparseable
args:
[u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d',
u'--job_server_timeout=60', u'--pipeline_type_check',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks']
19/12/04 06:52:27 INFO sdk_worker_main.main: Python sdk harness started with
pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type':
u'PROCESS', 'sdk_location': u'container', 'job_name':
u'test_windowing_1575442343.27', 'environment_config': u'{"command":
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',
'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint':
u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:27 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:27 INFO sdk_worker.__init__: Creating insecure control channel
for localhost:37041.
19/12/04 06:52:27 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 264-1
19/12/04 06:52:27 INFO sdk_worker.__init__: Initializing SDKHarness with
unbounded number of workers.
19/12/04 06:52:27 INFO sdk_worker.create_state_handler: Creating insecure state
channel for localhost:39987.
19/12/04 06:52:27 INFO sdk_worker.create_state_handler: State channel
established.
19/12/04 06:52:27 INFO data_plane.create_data_channel: Creating client data
channel for localhost:44103
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "beam:env:process:v1"
payload:
"\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/12/04 06:52:27 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:27 INFO sdk_worker.run: SDK Harness waiting for in-flight
requests to complete
19/12/04 06:52:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:27 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client
hanged up.
19/12/04 06:52:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService:
GetManifest for __no_artifacts_staged__
19/12/04 06:52:28 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging
client connected.
19/12/04 06:52:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:28 INFO sdk_worker_main.start: Status HTTP server running at
localhost:46015
19/12/04 06:52:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:28 WARN sdk_worker_main._load_main_session: No session file
found: /tmp/staged/pickled_main_session. Functions defined in __main__
(interactive session) may fail.
19/12/04 06:52:28 WARN pipeline_options.get_all_options: Discarding unparseable
args:
[u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d',
u'--job_server_timeout=60', u'--pipeline_type_check',
u'--direct_runner_use_stacked_bundle', u'--spark_master=local',
u'--options_id=30', u'--enable_spark_metric_sinks']
19/12/04 06:52:28 INFO sdk_worker_main.main: Python sdk harness started with
pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'],
'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type':
u'PROCESS', 'sdk_location': u'container', 'job_name':
u'test_windowing_1575442343.27', 'environment_config': u'{"command":
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',
'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint':
u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:28 INFO sdk_worker.__init__: Creating insecure control channel
for localhost:39091.
19/12/04 06:52:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:28 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 265-1
19/12/04 06:52:28 INFO sdk_worker.__init__: Initializing SDKHarness with
unbounded number of workers.
19/12/04 06:52:28 INFO sdk_worker.create_state_handler: Creating insecure state
channel for localhost:41299.
19/12/04 06:52:28 INFO sdk_worker.create_state_handler: State channel
established.
19/12/04 06:52:28 INFO data_plane.create_data_channel: Creating client data
channel for localhost:46159
19/12/04 06:52:28 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.
19/12/04 06:52:28 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "beam:env:process:v1"
payload:
"\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"
19/12/04 06:52:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:28 INFO sdk_worker.run: SDK Harness waiting for in-flight
requests to complete
19/12/04 06:52:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:28 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:28 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client
hanged up.
19/12/04 06:52:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer:
Hanged up for unknown endpoint.
19/12/04 06:52:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job
test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d finished.
19/12/04 06:52:28 WARN
org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting
monitoring infos is not implemented yet in Spark portable runner.
19/12/04 06:52:28 ERROR
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to
remove job staging directory for token
{"sessionId":"job_bb58eeaa-3c93-4deb-9b1f-b6b968d0f22d","basePath":"/tmp/sparktestDp1ByL"}:
{}
java.io.FileNotFoundException:
/tmp/sparktestDp1ByL/job_bb58eeaa-3c93-4deb-9b1f-b6b968d0f22d/MANIFEST (No such
file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
at
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
at
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
at
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
at
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
at
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
at
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
at
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
at
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
File "apache_beam/runners/portability/portable_runner_test.py", line 231, in
test_pardo_state_with_custom_key_coder
equal_to(expected))
File "apache_beam/pipeline.py", line 436, in __exit__
self.run().wait_until_finish()
File "apache_beam/runners/portability/portable_runner.py", line 428, in
wait_until_finish
for state_response in self._state_stream:
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",
line 395, in next
return self._next()
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",
line 552, in _next
_common.wait(self._state.condition.wait, _response_ready)
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",
line 140, in wait
_wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",
line 105, in _wait_once
wait_fn(timeout=timeout)
File "/usr/lib/python2.7/threading.py", line 359, in wait
_sleep(delay)
File "apache_beam/runners/portability/portable_runner_test.py", line 75, in
handler
raise BaseException(msg)
BaseException: Timed out after 60 seconds.
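The traceback above ends in a `handler` frame (portable_runner_test.py, line 75) that raises `BaseException('Timed out after 60 seconds.')` from whatever the main thread happens to be blocked in — here, a grpc/threading wait. This is a minimal sketch of that watchdog pattern, not Beam's actual implementation; the `run_with_timeout` name and the SIGALRM mechanism are assumptions for illustration:

```python
import signal
import time


def run_with_timeout(fn, timeout_secs=60):
    """Run fn(), raising BaseException if it exceeds timeout_secs.

    Sketch of the watchdog pattern suggested by the traceback: a SIGALRM
    handler fires after the deadline and raises from whatever frame the
    main thread is currently executing.
    """
    def handler(signum, frame):
        # Raising BaseException (not Exception) keeps the timeout from
        # being swallowed by broad `except Exception` clauses downstream.
        raise BaseException('Timed out after %d seconds.' % timeout_secs)

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout_secs)
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

Because the exception is injected into the main thread's current frame, the reported stack points at the grpc wait rather than the hung test itself, which is why these timeouts all surface inside `_channel.py`/`_common.py`.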
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in
test_pardo_timers
assert_that(actual, equal_to(expected))
File "apache_beam/pipeline.py", line 436, in __exit__
self.run().wait_until_finish()
File "apache_beam/runners/portability/portable_runner.py", line 428, in
wait_until_finish
for state_response in self._state_stream:
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",
line 395, in next
return self._next()
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",
line 552, in _next
_common.wait(self._state.condition.wait, _response_ready)
==================== Timed out after 60 seconds. ====================
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",
line 140, in wait
_wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(wait_until_finish_read, started daemon 139752301029120)>
# Thread: <Thread(Thread-119, started daemon 139752292636416)>
File
"https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",
line 105, in _wait_once
wait_fn(timeout=timeout)
File "/usr/lib/python2.7/threading.py", line 359, in wait
_sleep(delay)
File "apache_beam/runners/portability/portable_runner_test.py", line 75, in
handler
raise BaseException(msg)
BaseException: Timed out after 60 seconds.
# Thread: <_MainThread(MainThread, started 139753088661248)>
==================== Timed out after 60 seconds. ====================
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in
test_sdf_with_watermark_tracking
assert_that(actual, equal_to(list(''.join(data))))
# Thread: <Thread(wait_until_finish_read, started daemon 139752274278144)>
# Thread: <Thread(Thread-125, started daemon 139752282932992)>
File "apache_beam/pipeline.py", line 436, in __exit__
self.run().wait_until_finish()
File "apache_beam/runners/portability/portable_runner.py", line 438, in
wait_until_finish
self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline
test_sdf_with_watermark_tracking_1575442334.21_81cdc114-5806-4fc0-b59d-23b2199121a0
failed in state FAILED: java.lang.UnsupportedOperationException: The
ActiveBundle does not have a registered bundle checkpoint handler.
# Thread: <_MainThread(MainThread, started 139753088661248)>
# Thread: <Thread(Thread-119, started daemon 139752292636416)>
# Thread: <Thread(wait_until_finish_read, started daemon 139752301029120)>
----------------------------------------------------------------------
Ran 38 tests in 318.360s
FAILED (errors=3, skipped=9)
> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED
FAILURE: Build failed with an exception.
* Where:
Build file
'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'
line: 196
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 8m 5s
60 actionable tasks: 47 executed, 13 from cache
Publishing build scan...
Publishing failed.
The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the
build scan server.
Your network environment may be interfering, or the service may be unavailable.
If you believe this to be in error, please report this problem via
https://gradle.com/scans/help/plugin and include the following via copy/paste:
----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: a1a7bad1-415c-4610-862c-59b434734332
Response status code: 502
Response content type: text/html; charset=UTF-8
Response server type: cloudflare
----------
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]