Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/1969#issuecomment-53764382
Hi @iven, `spark-shell` actually goes through `spark-submit`. As @liancheng
mentioned, you can set `spark.home` to control the executor-side Spark
location. This is not super intuitive, however, and there is an open PR
(#2166) that adds a more specific way to do this.
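For reference, a minimal sketch of what setting `spark.home` as an application-specific config looks like (the app name and path below are placeholders, not from this thread):

```scala
// Sketch: set spark.home on the application's SparkConf instead of exporting
// SPARK_HOME on every node. The app name and executor-side path are hypothetical.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("my-app")                          // hypothetical app name
  .set("spark.home", "/opt/spark-on-executors")  // hypothetical executor-side Spark location

val sc = new SparkContext(conf)
```

The same property can also be passed on the command line when launching `spark-submit` or `spark-shell`, e.g. `--conf spark.home=/opt/spark-on-executors`.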
At least with the existing code, the user should not set `SPARK_HOME`
themselves, because the code depends on it in many places downstream. A better
solution is to set an application-specific config. Would you mind closing this PR?