Github user raschild commented on the pull request:
https://github.com/apache/spark/pull/5261#issuecomment-87771770
To be more precise, I am running Spark on HPC and I built the assembly with Scala 2.11; however, when I started the master, the errors I got were
`echo "Failed to find Spark assembly in $assembly_folder" 1>&2`
`echo "You need to build Spark before running this program." 1>&2`
which come from the compute-classpath.sh script.
So I figured the Scala version was not being picked up correctly: when compute-classpath.sh sources
`. "$FWDIR"/bin/load-spark-env.sh`, the Scala version ended up set to 2.10, so the script obviously could not find the Scala 2.11 assembly in assembly/target/scala-2.11. On closer investigation I saw that when load-spark-env.sh checks whether a scala-2.11 directory exists, `$FWDIR` did not have a valid value, so it could not find the correct path and mistakenly fell back to Scala version 2.10; that is why I suggested this change.
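
For context, the detection in load-spark-env.sh looks roughly like the sketch below (paraphrased from memory, not the exact upstream script): the Scala version is inferred from which assembly/target/scala-* directory exists under `$FWDIR`, so an unset or wrong `$FWDIR` makes the 2.11 check fail and the script silently defaults to 2.10.

```bash
# Sketch of the Scala-version detection in bin/load-spark-env.sh
# (paraphrased, not the exact upstream script). FWDIR is expected to
# point at the Spark home directory; if it is unset or wrong, the
# 2.11 check below fails and the script falls back to 2.10.
if [ -z "$SPARK_SCALA_VERSION" ]; then
  ASSEMBLY_DIR_211="$FWDIR/assembly/target/scala-2.11"

  if [ -d "$ASSEMBLY_DIR_211" ]; then
    export SPARK_SCALA_VERSION="2.11"
  else
    # Reached in my case because $FWDIR was not valid, so the
    # scala-2.11 assembly directory was never found.
    export SPARK_SCALA_VERSION="2.10"
  fi
fi
```

Note that the whole block is skipped if SPARK_SCALA_VERSION is already exported (e.g. in spark-env.sh), which is why setting it explicitly also works around the problem.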