[ https://issues.apache.org/jira/browse/MAHOUT-1489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13983533#comment-13983533 ]
Dmitriy Lyubimov commented on MAHOUT-1489:
------------------------------------------
Let me recap what I said before: no, you don't have to configure Spark in any
special way.
(1) Install Spark *0.9.1* and make sure the assembly is built (*sbt/sbt assembly*
in SPARK_HOME). Set up SPARK_HOME and make sure $SPARK_HOME/bin/compute-classpath.sh
produces no errors.
(2) Compile Mahout and set up MAHOUT_HOME (a rough sketch of steps (1) and (2) is
below, after the launch command).
(3) Try it in local mode:
{code}
MASTER="local" bin/mahout spark-shell
{code}
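For reference, a rough end-to-end sketch of steps (1) and (2); the install paths
and the exact Maven invocation are assumptions about a typical setup, not
prescriptions:
{code}
# (1) Spark 0.9.1: build the assembly and point SPARK_HOME at the install
cd ~/spark-0.9.1                        # assumed install location
sbt/sbt assembly                        # builds the Spark assembly jar
export SPARK_HOME=~/spark-0.9.1
$SPARK_HOME/bin/compute-classpath.sh    # sanity check: should print a classpath, no errors

# (2) Mahout: build from source and point MAHOUT_HOME at the checkout
cd ~/mahout                             # assumed checkout location
mvn -DskipTests clean install
export MAHOUT_HOME=~/mahout
{code}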
That should be enough. LMK what trouble you run into after that.
> Interactive Scala & Spark Bindings Shell & Script processor
> -----------------------------------------------------------
>
> Key: MAHOUT-1489
> URL: https://issues.apache.org/jira/browse/MAHOUT-1489
> Project: Mahout
> Issue Type: New Feature
> Affects Versions: 1.0
> Reporter: Saikat Kanjilal
> Assignee: Dmitriy Lyubimov
> Fix For: 1.0
>
> Attachments: MAHOUT-1489.patch, MAHOUT-1489.patch.1,
> mahout-spark-shell-running-standalone.png
>
>
> Build an interactive shell / script processor (just like the Spark shell),
> something very similar to R's interactive / script-runner mode.