Github user tsudukim commented on the pull request:
https://github.com/apache/spark/pull/10789#issuecomment-173270284
Thank you @tritab .
Actually I haven't tried your new PR yet (my new PC is not set up yet),
but I looked into your code and I have two concerns about using `cd` or `pushd`
in the scripts.
The first is that, as @tritab already mentioned, if we press Ctrl-C, our
terminal might be left in the SPARK_HOME folder.
The second is that if we change the current directory, I'm worried that
commands which specify a relative path may not work properly. For example, when
we execute spark-submit on YARN, we specify the application JAR file like this:
```
bin/spark-submit.cmd --master yarn ...(snip)... lib\spark-examples*.jar
```
If we change the current directory, that relative path no longer resolves.
The same problem might occur in other situations, such as shipping JARs with
`spark-submit`, or loading a script with `spark-shell` or `pyspark`, etc.
Did you run into any problems using double quotes, like
```
cmd /V /E /C "%~dp0spark-shell2.cmd" %*
```
instead of `pushd`?
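For contrast, a sketch of why the quoted form avoids the issue (this only restates the `%~dp0` expansion from the snippet above, as an assumption about the intent):

```
@echo off
rem %~dp0 expands to the drive and directory of this script, so the sibling
rem script is invoked by its absolute path without any cd/pushd. The caller's
rem working directory, and therefore any relative arguments, stay untouched,
rem and Ctrl-C cannot leave the terminal in SPARK_HOME.
cmd /V /E /C "%~dp0spark-shell2.cmd" %*
```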