GitHub user devaraj-kavali commented on the issue:
https://github.com/apache/spark/pull/13407
Thanks @vanzin and @andrewor14 for looking into this; sorry for the delay.
> If SparkSubmit can still process --kill and --status with those, then
> that's fine too (just use SparkLauncher.NO_RESOURCE).
I tried this, but it fails with the error below:
```
[devaraj@server2 spark-master]$ ./bin/spark-submit --kill driver-20160531171222-0000
Error: Cannot load main class from JAR spark-internal with URI null. Please specify a class through --class.
Run with --help for usage help or --verbose for debug output
```
I have renamed the `printInfo` flag to `isAppResourceReq` and now use it for the kill and status cases as well.
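For illustration only, here is a minimal sketch (not the actual patch) of how an `isAppResourceReq`-style flag can gate the primary-resource check so that `--kill` and `--status` skip it. The class and field names below are hypothetical; only `SparkLauncher.NO_RESOURCE` comes from the real launcher API.
```java
import org.apache.spark.launcher.SparkLauncher;

// Hypothetical sketch of argument handling; not the code in this PR.
class SubmitArgsSketch {
  String appResource;               // primary jar/py file, if any
  boolean isAppResourceReq = true;  // false for --kill / --status style actions

  void parse(String[] args) {
    for (int i = 0; i < args.length; i++) {
      switch (args[i]) {
        case "--kill":
        case "--status":
          isAppResourceReq = false;  // these actions do not launch an app
          i++;                       // skip the submission ID that follows
          break;
        default:
          appResource = args[i];     // treat anything else as the primary resource
      }
    }
  }

  void validate() {
    // Require a real primary resource only when the action actually submits an app.
    if (isAppResourceReq
        && (appResource == null || SparkLauncher.NO_RESOURCE.equals(appResource))) {
      throw new IllegalArgumentException(
          "Missing application resource; specify a jar/py file or use --class.");
    }
  }
}
```
With a guard like that, `spark-submit --kill <submissionId>` can pass validation even though the app resource stays at `NO_RESOURCE`.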
Please review and let me know your feedback.