Github user tnachen commented on a diff in the pull request:
https://github.com/apache/spark/pull/21027#discussion_r208468305
--- Diff: core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionServer.scala ---
@@ -63,6 +63,8 @@ private[spark] abstract class RestSubmissionServer(
s"$baseContext/create/*" -> submitRequestServlet,
s"$baseContext/kill/*" -> killRequestServlet,
s"$baseContext/status/*" -> statusRequestServlet,
+ "/health" -> new ServerStatusServlet(this),
--- End diff ---
This impacts the REST submission server in general, too. I do like the idea of
providing an endpoint to get status, but I'm not sure this is a paradigm Spark
is going for. I know the common pattern is to poll Spark metrics to understand
the status of the components. @felixcheung do you have thoughts around this?
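
For context, here is a minimal sketch of that metrics-polling pattern: an
external monitor periodically scrapes the JSON exposed by Spark's
MetricsServlet sink instead of relying on a dedicated /health servlet. The
`MetricsPoller` object and the URL below are illustrative only and not part of
this PR; the actual path depends on which component is polled and on the
metrics configuration.

```scala
import java.net.{HttpURLConnection, URL}
import scala.io.Source
import scala.util.{Failure, Success, Try}

// Illustrative-only sketch of the "poll the metrics" pattern mentioned above.
// MetricsPoller and the URL are made up for this example; the real path
// depends on the component being polled and on metrics.properties.
object MetricsPoller {

  // Fetch the raw metrics JSON once. Any failure (timeout, non-200 status,
  // connection refused) surfaces as a Failure, which a monitor can treat as
  // "unhealthy".
  def pollOnce(metricsUrl: String, timeoutMs: Int = 5000): Try[String] = Try {
    val conn = new URL(metricsUrl).openConnection().asInstanceOf[HttpURLConnection]
    conn.setConnectTimeout(timeoutMs)
    conn.setReadTimeout(timeoutMs)
    try {
      val code = conn.getResponseCode
      require(code == 200, s"unexpected HTTP status $code from $metricsUrl")
      Source.fromInputStream(conn.getInputStream).mkString
    } finally {
      conn.disconnect()
    }
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical endpoint: JSON output of the default MetricsServlet sink.
    pollOnce("http://localhost:4040/metrics/json") match {
      case Success(json) => println(s"looks healthy (${json.length} bytes of metrics)")
      case Failure(e)    => println(s"possibly down: ${e.getMessage}")
    }
  }
}
```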
---