[ https://issues.apache.org/jira/browse/MAHOUT-1546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13989819#comment-13989819 ]

Pat Ferrel commented on MAHOUT-1546:
------------------------------------

Scala works fine without SCALA_HOME, so that doesn't apply here.
MAHOUT_HOME is set; otherwise the fix wouldn't have worked.

Running "mahout -spark classpath" from bash returns nothing. In the bash script 
is looks like SPARK_HOME is checked but I haven't set it and Spark itself seems 
to run fine without it. In fact the Spark install doesn't ask you to set it. In 
any case setting that seems to fix the problem.
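
For reference, a minimal sketch (not the actual Mahout code; mahoutBin is a
stand-in for the resolved bin/mahout path) of how the classpath query could
fall back to the plain "classpath" call when "-spark classpath" comes back
empty:

{code}
import scala.io.Source

// Sketch only: ask the mahout script for the classpath, falling back to
// the plain "classpath" call when "-spark classpath" returns nothing,
// e.g. because SPARK_HOME is unset in the bin/mahout script.
def queryClasspath(mahoutBin: String): String = {
  def run(args: String*): String = {
    val p = Runtime.getRuntime.exec((mahoutBin +: args).toArray)
    val out = Source.fromInputStream(p.getInputStream).mkString.trim
    p.waitFor()
    out
  }
  val sparkCp = run("-spark", "classpath")
  if (sparkCp.nonEmpty) sparkCp else run("classpath")
}
{code}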

Do we have a wiki page that describes setup for Spark? Maybe I'd be a good
guinea pig to write or edit it.

> building spark context fails due to incorrect classpath query
> -------------------------------------------------------------
>
>                 Key: MAHOUT-1546
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1546
>             Project: Mahout
>          Issue Type: Bug
>         Environment: Spark running locally
>            Reporter: Pat Ferrel
>            Assignee: Dmitriy Lyubimov
>            Priority: Critical
>
> The classpath retrieval is using a "-spark" flag that returns nothing, using 
> the default "mahout classpath" seems to get all needed jar paths so 
> commenting out the "-spark" makes it work for me. Not sure this is the best 
> fix though.
> This is in def mahoutSparkContext(...)
> {code}
>         //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
>         val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
> {code}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
