[ https://issues.apache.org/jira/browse/SPARK-6673?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14459566#comment-14459566 ]

Masayoshi TSUZUKI commented on SPARK-6673:
------------------------------------------

This looks similar, but it might be a different problem, because the original 
problem can be avoided just by setting the SPARK_SCALA_VERSION environment 
variable.
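
For example (assuming Spark was built against Scala 2.10):
{code}
set SPARK_SCALA_VERSION=2.10
bin\spark-shell.cmd --master local
{code}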

Could you show the commands you used to build Spark and to run it?
Also, please rebuild Spark after a "clean", try again, and paste the full 
output.

> spark-shell.cmd can't start even when spark was built in Windows
> ----------------------------------------------------------------
>
>                 Key: SPARK-6673
>                 URL: https://issues.apache.org/jira/browse/SPARK-6673
>             Project: Spark
>          Issue Type: Bug
>          Components: Windows
>    Affects Versions: 1.3.0
>            Reporter: Masayoshi TSUZUKI
>            Assignee: Masayoshi TSUZUKI
>            Priority: Blocker
>
> spark-shell.cmd can't start.
> {code}
> bin\spark-shell.cmd --master local
> {code}
> fails with
> {code}
> Failed to find Spark assembly JAR.
> You need to build Spark before running this program.
> {code}
> even though Spark has been built.
> This is caused by the missing environment variable {{SPARK_SCALA_VERSION}}, 
> which is used in {{spark-class2.cmd}}.
> In the Linux scripts, this value is set to {{2.10}} or {{2.11}} by default in 
> {{load-spark-env.sh}}, but there is no equivalent script on Windows.
> As a workaround, executing
> {code}
> set SPARK_SCALA_VERSION=2.10
> {code}
> before running spark-shell.cmd lets it start successfully.
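> A minimal sketch of what a Windows equivalent of {{load-spark-env.sh}} might 
> look like (a hypothetical {{load-spark-env.cmd}}; the layout 
> {{assembly\target\scala-<version>}} is assumed to mirror the Linux script):
> {code}
> @echo off
> rem Hypothetical sketch: derive SPARK_SCALA_VERSION from the assembly build
> rem directory, mirroring the default logic of load-spark-env.sh.
> if not "%SPARK_SCALA_VERSION%" == "" goto :done
>
> set ASSEMBLY_DIR1=%SPARK_HOME%\assembly\target\scala-2.10
> set ASSEMBLY_DIR2=%SPARK_HOME%\assembly\target\scala-2.11
>
> rem If builds for both Scala versions exist, ask the user to choose.
> if exist "%ASSEMBLY_DIR1%" if exist "%ASSEMBLY_DIR2%" (
>   echo Builds for both Scala versions detected.
>   echo Remove one, or set SPARK_SCALA_VERSION explicitly.
>   exit /b 1
> )
>
> if exist "%ASSEMBLY_DIR2%" (
>   set SPARK_SCALA_VERSION=2.11
> ) else (
>   set SPARK_SCALA_VERSION=2.10
> )
> :done
> {code}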


