Github user liancheng commented on the pull request:
https://github.com/apache/spark/pull/1699#issuecomment-50829383
@vanzin @pwendell Are you suggesting something like this:
```
if no "--" exists in CLI arguments
treat all those options after primary resource as application options
else:
split them at "--"
take those before "--" as spark-submit options
take those after "--" as application options
```
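For concreteness, here is a minimal Scala sketch of that rule; the `splitAtSeparator` helper and the pre-computed `primaryResourceIndex` are hypothetical, not the actual `SparkSubmitArguments` code:
```scala
object SplitArgsSketch {
  /** Returns (spark-submit options, application options). */
  def splitAtSeparator(
      args: Seq[String],
      primaryResourceIndex: Int): (Seq[String], Seq[String]) = {
    val sep = args.indexOf("--")
    if (sep == -1) {
      // No "--": everything after the primary resource goes to the application.
      args.splitAt(primaryResourceIndex + 1)
    } else {
      // "--" present: split there and drop the separator itself.
      val (before, after) = args.splitAt(sep)
      (before, after.drop(1))
    }
  }

  def main(argv: Array[String]): Unit = {
    // bin/spark-submit --class Foo user.jar --arg1
    println(splitAtSeparator(Seq("--class", "Foo", "user.jar", "--arg1"), 2))
    // bin/spark-submit --class Foo user.jar --master local -- --arg1
    println(splitAtSeparator(
      Seq("--class", "Foo", "user.jar", "--master", "local", "--", "--arg1"), 2))
  }
}
```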
This logic happily accepts the following invocation:
```
bin/spark-submit --class Foo user.jar --arg1
```
But we would still need to rewrite the following one (a pattern commonly
used in scripts that delegate to `spark-submit`)
```
bin/spark-submit --class Foo user.jar --master local --arg1
```
into
```
bin/spark-submit --class Foo user.jar --master local -- --arg1
```
Take `sbin/start-thriftserver.sh` as an example; the core lines of this
script are:
```bash
CLASS=org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
exec "FWDIR"/bin/spark-submit --class $CLASS spark-internal "$@"
```
Note that `$@` may contain both `spark-submit` options and application
options by design. We would still require the *user* to add `--` after
`--master local`. Although a little better, this is still a
backward-incompatible change.
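Concretely, a Thrift server user would then have to invoke the wrapper like this (`--app-arg1` stands in for an arbitrary application option):
```
sbin/start-thriftserver.sh --master local -- --app-arg1
```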
(I haven't updated the documentation yet; I will do that once we reach a
final conclusion.)