Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/20809
> This is on travis and no SPARK_HOME is set.
That sounds a little odd. If that is true, then your proposed code wouldn't
work either, since it requires SPARK_HOME to be known.
In any case, there are two calls to `getScalaVersion()`.
The first is:
```
boolean prependClasses = !isEmpty(getenv("SPARK_PREPEND_CLASSES"));
boolean isTesting = "1".equals(getenv("SPARK_TESTING"));
if (prependClasses || isTesting) {
  String scala = getScalaVersion();
```
And your code shouldn't be triggering that, since both environment variables are meant
for Spark development and other applications shouldn't be setting them.
The second call comes a little later:
```
String jarsDir = findJarsDir(getSparkHome(), getScalaVersion(),
    !isTesting && !isTestingSql);
```
Here `getScalaVersion()` is only needed when running Spark from a git
clone, not from the distribution package. So the right thing would be to move
`getScalaVersion()` to `CommandBuilderUtils`, and call it from `findJarsDir`
only if needed.
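For concreteness, here's a rough sketch of what that could look like, assuming `findJarsDir`
keeps its current fallback to the `assembly/target/scala-<version>/jars` layout and that the
moved `getScalaVersion` takes the Spark home as an argument (the signature change and method
body below are just illustrative, not the actual launcher code):
```
// Sketch only: assumes this lives in CommandBuilderUtils, where checkState()
// is defined and java.io.File is already imported.
static String findJarsDir(String sparkHome, boolean failIfNotFound) {
  File libdir = new File(sparkHome, "jars");
  if (!libdir.isDirectory()) {
    // Running from a git clone: only now do we actually need the Scala version.
    String scala = getScalaVersion(sparkHome);
    libdir = new File(sparkHome,
      String.format("assembly/target/scala-%s/jars", scala));
  }
  checkState(!failIfNotFound || libdir.isDirectory(),
    "Library directory '%s' does not exist.", libdir.getAbsolutePath());
  return libdir.getAbsolutePath();
}
```
That way the distribution package path never touches the Scala version at all; only the
git-clone fallback pays for the lookup.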