[ https://issues.apache.org/jira/browse/MAHOUT-1546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13990069#comment-13990069 ]

Dmitriy Lyubimov commented on MAHOUT-1546:
------------------------------------------

Sorry, I meant "SPARK_HOME" instead of "SCALA_HOME".

(I think Spark workers do require SCALA_HOME in some situations, but that's an
unrelated Spark thing; we don't need SCALA_HOME.)

> building spark context fails due to incorrect classpath query
> -------------------------------------------------------------
>
>                 Key: MAHOUT-1546
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1546
>             Project: Mahout
>          Issue Type: Bug
>         Environment: Spark running locally
>            Reporter: Pat Ferrel
>            Assignee: Dmitriy Lyubimov
>            Priority: Critical
>
> The classpath retrieval uses a "-spark" flag that returns nothing; the 
> default "mahout classpath" seems to get all the needed jar paths, so 
> commenting out the "-spark" flag makes it work for me. Not sure this is 
> the best fix, though.
> This is in def mahoutSparkContext(...)
> {code}
>         //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
>         val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
> {code}
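For context on what that exec call feeds into: the `mahout classpath` command prints a path-separator-delimited classpath string, from which the jar entries are extracted to hand to the Spark context. A minimal sketch of that parsing step (the helper name `parseClasspath` and the sample paths are mine, not from the Mahout source):

```scala
// Hypothetical illustration: split a classpath string (as printed by
// `mahout classpath`) into its jar entries, dropping bare directories.
import java.io.File

def parseClasspath(cp: String): Seq[String] =
  cp.trim
    .split(File.pathSeparator)   // ":" on Unix, ";" on Windows
    .filter(_.endsWith(".jar"))  // keep only jar files
    .toSeq

// Example (Unix-style separator shown):
val sample = "/opt/mahout/math.jar:/opt/mahout/spark.jar:/opt/mahout/conf"
println(parseClasspath(sample))
```

If the `-spark` variant of the command prints an empty string, a split like this yields no jars at all, which would explain the context failing to build.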



--
This message was sent by Atlassian JIRA
(v6.2#6252)
