[
https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14615685#comment-14615685
]
ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------
Github user dlyubimov commented on a diff in the pull request:
https://github.com/apache/mahout/pull/146#discussion_r33983540
--- Diff: spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala ---
@@ -48,55 +77,63 @@ class MahoutSparkILoop extends SparkILoop {
conf.set("spark.executor.memory", "1g")
- sparkContext = mahoutSparkContext(
+ _interp.sparkContext = mahoutSparkContext(
masterUrl = master,
appName = "Mahout Spark Shell",
customJars = jars,
sparkConf = conf
)
- echo("Created spark context..")
+ echoToShell("Created spark context..")
sparkContext
}
+ // need to change our SparkDistributedContext name to 'sc' since we cannot override the
+ // private sparkCleanUp() method.
+ // this is technically not part of Spark's explicitly defined Developer API, though
+ // nothing in the SparkILoopInit.scala file is marked as such.
override def initializeSpark() {
- intp.beQuietDuring {
- command("""
+ _interp.beQuietDuring {
+ _interp.interpret("""
- @transient implicit val sdc: org.apache.mahout.math.drm.DistributedContext =
+ @transient implicit val sc: org.apache.mahout.math.drm.DistributedContext =
--- End diff --
Yes.
Except we are already calling "createMahoutContext" during
createSparkContext, so we just need to stash its output in a private
Mahout-context attribute and pass it in here as the implicit (perhaps via a
public getter or val).
As for deinitialization, you are right -- all it does, I think, is stop the
Spark context.
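A minimal sketch of that stash-and-getter idea in Scala (the wrapper object
MahoutContextStash, the names `_sdc`/`sdc`, and the simplified
`createContext` are illustrative assumptions, not the actual patch; the REPL
plumbing shown in the diff is elided):

import org.apache.mahout.math.drm.DistributedContext
import org.apache.mahout.sparkbindings._
import org.apache.spark.SparkConf

object MahoutContextStash {

  // createSparkContext() already builds the Mahout context via
  // mahoutSparkContext(); keep the reference instead of discarding it.
  private var _sdc: SparkDistributedContext = _

  // public getter, per the suggestion ("perhaps via a public getter or val")
  def sdc: DistributedContext = _sdc

  def createContext(master: String): SparkDistributedContext = {
    val conf = new SparkConf().set("spark.executor.memory", "1g")
    _sdc = mahoutSparkContext(
      masterUrl = master,
      appName = "Mahout Spark Shell",
      customJars = Nil,
      sparkConf = conf)
    _sdc
  }
}

// initializeSpark() would then interpret a line such as
//   @transient implicit val sc: org.apache.mahout.math.drm.DistributedContext =
//     MahoutContextStash.sdc
// so the REPL's implicit `sc` is the same instance, not a second context.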
> Spark 1.3
> ---------
>
> Key: MAHOUT-1653
> URL: https://issues.apache.org/jira/browse/MAHOUT-1653
> Project: Mahout
> Issue Type: Dependency upgrade
> Reporter: Andrew Musselman
> Assignee: Andrew Palumbo
> Fix For: 0.11.0
>
>
> Support Spark 1.3