Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1699#issuecomment-50854779
Hey @liancheng I just spoke with @mateiz offline about this for a while. He
had feedback on a few things, which I'll summarize here. There are a couple of
orthogonal issues going on. Here were his suggestions:
1. _Scripts like start-thriftserver.sh should expose one coherent set of
options rather than distinguishing between spark-submit options and other options._
The idea was to make this more convenient/less confusing for users. So this
would mean that the scripts would have to match on the more specific options
and make sure those are delivered separately to `spark-submit`. This makes the
internals more complicated, but for the users it's simpler. Also, because we
control the total set of options for internal tools, we can make sure there is
no collision in the naming.
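As a rough illustration of that splitting, here is a minimal Python sketch (not the actual start-thriftserver.sh logic) of how a wrapper could partition one coherent option list: flags it recognizes as its own are consumed together with their values, and everything else is forwarded to `spark-submit`. `TOOL_OPTS` and the `--hiveconf` flag in it are hypothetical placeholders for whatever tool-specific options the script controls.

```python
# Hypothetical set of flags the wrapper script handles itself; since we
# control the internal tools' options, we can keep this disjoint from
# spark-submit's option names.
TOOL_OPTS = {"--hiveconf"}

def split_options(args):
    """Partition a single combined option list into (spark-submit args,
    tool-specific args), preserving the original ordering within each."""
    tool_args, submit_args = [], []
    it = iter(args)
    for arg in it:
        if arg in TOOL_OPTS:
            # Tool-specific option: consume the flag plus its value.
            tool_args.extend([arg, next(it, "")])
        else:
            # Anything unrecognized is passed through to spark-submit.
            submit_args.append(arg)
    return submit_args, tool_args
```

For example, `split_options(["--master", "local", "--hiveconf", "x=y"])` keeps `--hiveconf x=y` for the tool and forwards `--master local` to `spark-submit`.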
2. _If we implement `--` as a divider, it should be fully backwards
compatible._ For instance, we need to support users that were doing this in
Spark 1.0:
```
./spark-submit --master local myJar.jar --userOpt a --userOpt b -- --userOpt c
```
i.e. user programs that used `--` in their own options. The way to fully
support this is to make the use of `--` mutually exclusive with specifying a
primary resource. So this means a user can _either_ do:
```
./spark-submit --master local --jars myJar.jar -- --userOpt a --userOpt b
```
or they can do
```
./spark-submit --master local myJar.jar -- --userOpt a --userOpt b
```
So basically, when the parser arrives at an unrecognized option (which we
assume to be the primary resource), we always treat the rest of the list as
user options, even if those user options happen to contain a `--` of their own.
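The rule above can be sketched in a few lines of Python. This is a simplified model, not the real `SparkSubmitArguments` parser: `KNOWN` stands in for the full set of spark-submit flags (here just an illustrative subset), and parsing stops at whichever comes first, a bare `--` or an unrecognized token taken as the primary resource, so the two forms stay mutually exclusive.

```python
# Illustrative subset of recognized spark-submit flags (each takes a value).
KNOWN = {"--master", "--jars"}

def parse(args):
    """Return (spark_args, primary_resource, user_args). Once we hit either
    a bare `--` or an unrecognized token (assumed to be the primary
    resource), the remainder is passed through untouched as user options,
    even if it contains another `--`."""
    spark_args, resource, user_args = [], None, []
    i = 0
    while i < len(args):
        arg = args[i]
        if arg == "--":
            # Explicit divider: no primary resource, rest is user options.
            user_args = args[i + 1:]
            break
        if arg in KNOWN:
            # Recognized spark-submit flag: consume flag and its value.
            spark_args.extend(args[i:i + 2])
            i += 2
            continue
        # Unrecognized token: treat as the primary resource and stop
        # interpreting anything after it.
        resource = arg
        user_args = args[i + 1:]
        break
    return spark_args, resource, user_args
```

With this rule, the Spark 1.0 style `./spark-submit --master local myJar.jar -- --userOpt c` keeps the trailing `--` inside the user options, while the `--jars myJar.jar -- ...` form uses `--` as the divider and has no primary resource.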