[
https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14617428#comment-14617428
]
ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------
Github user andrewpalumbo commented on a diff in the pull request:
https://github.com/apache/mahout/pull/146#discussion_r34091459
--- Diff:
spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala
---
@@ -48,55 +77,81 @@ class MahoutSparkILoop extends SparkILoop {
conf.set("spark.executor.memory", "1g")
- sparkContext = mahoutSparkContext(
+ _interp.sparkContext = mahoutSparkContext(
masterUrl = master,
appName = "Mahout Spark Shell",
customJars = jars,
sparkConf = conf
)
--- End diff ---
uggh.. of course, I was doing it backwards. The SparkContext (along with
the SparkDistributedContext) has to be created by a call to
```createSparkContext()``` before we can access sdc (hence the NPE above).
```
// the SparkContext (and with it the SparkDistributedContext) must be
// created first
_interp.interpret("""
  @transient val sc = {
    val _sc = org.apache.spark.repl.Main.interp.createSparkContext()
    println("Spark context available as sc.")
    _sc
  }
""")

// only now can sdc be read off the ILoop without an NPE
_interp.interpret("""
  @transient implicit val sdc: org.apache.mahout.sparkbindings.SparkDistributedContext =
    org.apache.spark.repl.Main.interp.asInstanceOf[org.apache.mahout.sparkbindings.shell.MahoutSparkILoop].sdc
""")

echoToShell("Mahout distributed context is available as \"implicit val sdc\".")
```
That works fine. I had it in my head that two SparkContexts were being
created rather than two MahoutSparkContexts...
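For reference, a minimal sketch of the ordering described above. The class
name, the `local[*]` master URL, and the internals are assumptions for
illustration, not the actual MahoutSparkILoop code; it assumes
mahoutSparkContext returns the SparkDistributedContext wrapper (which the
diff above assigns, via implicit conversion, to `_interp.sparkContext`). It
just shows why sdc stays null, hence the NPE, until createSparkContext()
runs:
```
// Sketch only (assumed shape), not the actual MahoutSparkILoop source.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.mahout.sparkbindings._

class SparkContextOrderingSketch {

  // stays null until createSparkContext() runs; reading sdc before that
  // call is exactly the NPE described above
  private var _sdc: SparkDistributedContext = _

  def sdc: SparkDistributedContext = _sdc

  def createSparkContext(): SparkContext = {
    val conf = new SparkConf().set("spark.executor.memory", "1g")

    // mahoutSparkContext() builds the SparkDistributedContext wrapping
    // the underlying SparkContext
    _sdc = mahoutSparkContext(
      masterUrl = "local[*]",          // placeholder master URL for the sketch
      appName   = "Mahout Spark Shell",
      sparkConf = conf)

    _sdc.sc                            // unwrap the raw SparkContext for the REPL
  }
}
```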
> Spark 1.3
> ---------
>
> Key: MAHOUT-1653
> URL: https://issues.apache.org/jira/browse/MAHOUT-1653
> Project: Mahout
> Issue Type: Dependency upgrade
> Reporter: Andrew Musselman
> Assignee: Andrew Palumbo
> Fix For: 0.11.0
>
>
> Support Spark 1.3