Github user raschild commented on the pull request:
https://github.com/apache/spark/pull/5261#issuecomment-90995492
@srowen - Title Updated
I applied Marcello's comment, but it still didn't work. The whole point
of `$FWDIR` is these two lines:
`ASSEMBLY_DIR2="$FWDIR/assembly/target/scala-2.11"`
`ASSEMBLY_DIR1="$FWDIR/assembly/target/scala-2.10"`
If the directory is not set up properly, the if condition will fall
through to scala-2.10, and that is when the error occurs:
```
if [ -d "$ASSEMBLY_DIR2" ]; then
  export SPARK_SCALA_VERSION="2.11"
else
  export SPARK_SCALA_VERSION="2.10"
fi
```
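For illustration, here is a minimal sketch of setting `FWDIR` from the script's own location rather than the caller's working directory, assuming the script lives one level below the Spark root (e.g. in `bin/`); this is one way to make the check above behave consistently:

```shell
# Sketch (assumption: this script sits in a subdirectory such as bin/
# directly under the Spark root). Derive FWDIR from the script path so
# the assembly-dir check does not depend on where the script is invoked.
FWDIR="$(cd "$(dirname "$0")/.." && pwd)"

ASSEMBLY_DIR2="$FWDIR/assembly/target/scala-2.11"

# Same check as above: prefer the 2.11 build if its assembly dir exists.
if [ -d "$ASSEMBLY_DIR2" ]; then
  export SPARK_SCALA_VERSION="2.11"
else
  export SPARK_SCALA_VERSION="2.10"
fi
```

With this, the check picks the Scala version based on the actual build layout instead of silently defaulting because `$FWDIR` pointed at the wrong directory.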
Overall, from what I have seen, this applies to most of the Spark development
scripts. And yes, as you stated before, the net change in previous branches
is just to set `FWDIR` differently. Alternatively, another way to work around
it could be to source a script with all the global variables (e.g.
`$SPARK_HOME`).
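The sourcing alternative could look like this sketch; the file name `spark-env-common.sh` and the `/opt/spark` default are hypothetical, not actual Spark conventions:

```shell
# Hypothetical shared env file that defines the globals once; every
# launcher script would source it instead of recomputing FWDIR itself.
cat > /tmp/spark-env-common.sh <<'EOF'
export SPARK_HOME="${SPARK_HOME:-/opt/spark}"
export FWDIR="$SPARK_HOME"
EOF

unset SPARK_HOME            # for a deterministic demo, use the default
. /tmp/spark-env-common.sh  # source it: FWDIR is now set for this shell
echo "$FWDIR"
```

The benefit is that a fix to the directory logic lands in one place instead of being repeated across every script.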