[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

2018-08-05 Thread devaraj-kavali
Github user devaraj-kavali commented on a diff in the pull request:

https://github.com/apache/spark/pull/21996#discussion_r207763630
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
    * Kill an existing submission using the REST protocol. Standalone and
    * Mesos cluster mode only.
    */
   private def kill(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .killSubmission(args.submissionToKill)
+    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
   }
 
   /**
    * Request the status of an existing submission using the REST protocol.
    * Standalone and Mesos cluster mode only.
    */
   private def requestStatus(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .requestSubmissionStatus(args.submissionToRequestStatusFor)
+    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
+  }
+
+  /**
+   * Creates RestSubmissionClient with overridden logInfo()
+   */
+  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
+    new RestSubmissionClient(args.master) {
+      override protected def logInfo(msg: => String): Unit = printMessage(msg)
--- End diff --

I agree, the user can change this log level. But if the user configures the log
level as WARN or above (WARN is the default), they can't see any update/status
output from the status and kill commands. I don't think we can expect the user
to set the log level to INFO just to get output from the status and kill
commands. Please let me know if you have any thoughts on a better fix; I can
make the changes. Thanks


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

2018-08-04 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/21996#discussion_r207717051
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
    * Kill an existing submission using the REST protocol. Standalone and
    * Mesos cluster mode only.
    */
   private def kill(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .killSubmission(args.submissionToKill)
+    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
   }
 
   /**
    * Request the status of an existing submission using the REST protocol.
    * Standalone and Mesos cluster mode only.
    */
   private def requestStatus(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .requestSubmissionStatus(args.submissionToRequestStatusFor)
+    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
+  }
+
+  /**
+   * Creates RestSubmissionClient with overridden logInfo()
+   */
+  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
+    new RestSubmissionClient(args.master) {
+      override protected def logInfo(msg: => String): Unit = printMessage(msg)
--- End diff --

this is not necessarily always the case - the user can configure the log level easily?


---




[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

2018-08-04 Thread devaraj-kavali
Github user devaraj-kavali commented on a diff in the pull request:

https://github.com/apache/spark/pull/21996#discussion_r207702588
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
    * Kill an existing submission using the REST protocol. Standalone and
    * Mesos cluster mode only.
    */
   private def kill(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .killSubmission(args.submissionToKill)
+    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
   }
 
   /**
    * Request the status of an existing submission using the REST protocol.
    * Standalone and Mesos cluster mode only.
    */
   private def requestStatus(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .requestSubmissionStatus(args.submissionToRequestStatusFor)
+    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
+  }
+
+  /**
+   * Creates RestSubmissionClient with overridden logInfo()
+   */
+  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
+    new RestSubmissionClient(args.master) {
+      override protected def logInfo(msg: => String): Unit = printMessage(msg)
--- End diff --

When `isInterpreter = true` (repl shell), the log is initialized and working,
but the log level is set to WARN, so the `RestSubmissionClient` logInfo
messages that carry the response are not shown. This PR change takes effect
when `isInterpreter = true` and for the status/kill commands, and doesn't
change the other behavior.


---




[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

2018-08-04 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/21996#discussion_r207702200
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -98,17 +98,24 @@ private[spark] class SparkSubmit extends Logging {
    * Kill an existing submission using the REST protocol. Standalone and
    * Mesos cluster mode only.
    */
   private def kill(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .killSubmission(args.submissionToKill)
+    createRestSubmissionClient(args).killSubmission(args.submissionToKill)
   }
 
   /**
    * Request the status of an existing submission using the REST protocol.
    * Standalone and Mesos cluster mode only.
    */
   private def requestStatus(args: SparkSubmitArguments): Unit = {
-    new RestSubmissionClient(args.master)
-      .requestSubmissionStatus(args.submissionToRequestStatusFor)
+    createRestSubmissionClient(args).requestSubmissionStatus(args.submissionToRequestStatusFor)
+  }
+
+  /**
+   * Creates RestSubmissionClient with overridden logInfo()
+   */
+  private def createRestSubmissionClient(args: SparkSubmitArguments): RestSubmissionClient = {
+    new RestSubmissionClient(args.master) {
+      override protected def logInfo(msg: => String): Unit = printMessage(msg)
--- End diff --

doesn't this change the behavior even when the logger is 
initialized/working?




---




[GitHub] spark pull request #21996: [SPARK-24888][CORE] spark-submit --master spark:/...

2018-08-03 Thread devaraj-kavali
GitHub user devaraj-kavali opened a pull request:

https://github.com/apache/spark/pull/21996

[SPARK-24888][CORE] spark-submit --master spark://host:port --status 
driver-id does not work

## What changes were proposed in this pull request?

In `SparkSubmit.scala` (`val uninitLog = initializeLogIfNecessary(true,
silent = true)`) -> `Logging.scala` (`val replLevel =
Option(replLogger.getLevel()).getOrElse(Level.WARN)`), the log level for
`rootLogger` is overridden to `WARN`, but the driver status and kill driver
command results are logged at the `INFO` level, so nothing is printed on the
console for the status and kill commands. This PR overrides `logInfo()` for
`RestSubmissionClient` and redirects the messages to `printStream`.
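
The core of the change is an anonymous subclass that intercepts the protected
`logInfo()` hook. A minimal, self-contained sketch of the same pattern (the
`Client`, `Demo`, and `printMessage` names here are illustrative stand-ins,
not Spark's actual classes):

```scala
import scala.collection.mutable.ArrayBuffer

// Sketch of the pattern this PR uses: an anonymous subclass overrides the
// protected logInfo() hook so INFO-level messages are routed to a plain
// print function instead of the logger (which may be set to WARN and would
// swallow them).
class Client(master: String) {
  // By-name parameter, mirroring Logging.logInfo(msg: => String)
  protected def logInfo(msg: => String): Unit = ()  // normally goes to log4j

  def requestStatus(id: String): String = {
    logInfo(s"Submitting a request for the status of submission $id in $master.")
    "FINISHED"
  }
}

object Demo {
  // Captured messages stand in for SparkSubmit's printStream output
  val printed = ArrayBuffer[String]()
  private def printMessage(str: String): Unit = printed += str

  // Analogue of createRestSubmissionClient: same client, redirected logInfo
  def createClient(master: String): Client =
    new Client(master) {
      override protected def logInfo(msg: => String): Unit = printMessage(msg)
    }

  def main(args: Array[String]): Unit = {
    val client = createClient("spark://host:6066")
    println(client.requestStatus("driver-0001"))
    printed.foreach(println)
  }
}
```

Because `logInfo` takes a by-name argument, the override keeps the same lazy
evaluation semantics as the base method; only the destination changes.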


## How was this patch tested?

I verified it manually by running status and kill commands, these are the 
results with and without the PR change.

- Without the PR change

```
[user1@user1-work-pc bin]$ ./spark-submit --master 
spark://user1-work-pc:6066 --status driver-20180803165641-
[user1@user1-work-pc bin]$ ./spark-submit --master 
spark://user1-work-pc:6066 --kill driver-20180803165641-
```


- With the PR change

```
[user1@user1-work-pc bin]$ ./spark-submit --master 
spark://user1-work-pc:6066 --kill driver-20180803165641-
Submitting a request to kill submission driver-20180803165641- in 
spark://user1-work-pc:6066.
Server responded with KillSubmissionResponse:
{
  "action" : "KillSubmissionResponse",
  "message" : "Driver driver-20180803165641- has already finished or 
does not exist",
  "serverSparkVersion" : "2.4.0-SNAPSHOT",
  "submissionId" : "driver-20180803165641-",
  "success" : false
}

[user1@user1-work-pc bin]$ ./spark-submit --master 
spark://user1-work-pc:6066 --status driver-20180803165641-
Submitting a request for the status of submission 
driver-20180803165641- in spark://user1-work-pc:6066.
Server responded with SubmissionStatusResponse:
{
  "action" : "SubmissionStatusResponse",
  "driverState" : "FINISHED",
  "serverSparkVersion" : "2.4.0-SNAPSHOT",
  "submissionId" : "driver-20180803165641-",
  "success" : true,
  "workerHostPort" : "xx.x.xx.xxx:42040",
  "workerId" : "worker-20180803165615-10.3.66.149-42040"
}
```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/devaraj-kavali/spark SPARK-24888

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/21996.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #21996


commit 1e8000ffacabac742d16efee72ec1e421225a272
Author: Devaraj K 
Date:   2018-08-04T00:47:23Z

[SPARK-24888][CORE] spark-submit --master spark://host:port --status
driver-id does not work




---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org