[ https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14616960#comment-14616960 ]
ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------
Github user andrewpalumbo commented on a diff in the pull request:
https://github.com/apache/mahout/pull/146#discussion_r34061617
--- Diff: spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala ---
```
@@ -48,55 +77,81 @@ class MahoutSparkILoop extends SparkILoop {
     conf.set("spark.executor.memory", "1g")
-    sparkContext = mahoutSparkContext(
+    _interp.sparkContext = mahoutSparkContext(
       masterUrl = master,
       appName = "Mahout Spark Shell",
       customJars = jars,
       sparkConf = conf
     )
```
--- End diff ---
Ok, I see what you are saying. I don't think we can access that field
from the interpreter, though. I.e., something like:
```
// create the context in the ILoop...
val sdc = mahoutSparkContext(...) // or something like this...
// ...then try to re-bind it inside the interpreter via `this`
_interp.interpret("@transient implicit val sdc = this.sdc")
```
will give an error like:
```
error: value sdc is not a member of iwC$iwC$iwC$iwC.....
```
I believe this is because the interpreter converts each line into a generated
wrapper object, which is not of type MahoutSparkILoop, so `this` inside the
interpreted line never refers to the ILoop.
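For illustration, a rough sketch (hypothetical and heavily simplified; the
real wrapper classes are named $iwC) of what the 2.10 REPL generates for an
interpreted line, which is where the iwC$iwC$... in the error comes from:
```
// Hypothetical, heavily simplified sketch of the Scala 2.10 REPL's line
// wrapping. The line body lands inside nested wrapper objects, so `this`
// refers to the innermost wrapper, never to the enclosing MahoutSparkILoop.
object $read {
  object $iwC {
    object $iwC {
      // the user's line is compiled here; a reference like `this.sdc`
      // looks for `sdc` on this inner wrapper and fails with
      // "value sdc is not a member of iwC$iwC..."
      val res0 = 42
    }
  }
}
```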
What we are currently doing actually follows the Spark REPL blueprint:
// Create a SparkContext in SparkILoop:
https://github.com/apache/spark/blob/branch-1.3/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1005
// Create another SparkContext in the Interpreter:
https://github.com/apache/spark/blob/branch-1.3/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoopInit.scala#L121
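As a minimal sketch of that blueprint (hypothetical names, no Spark
dependency): the interpreted line reaches the loop through a stable,
fully-qualified path (Spark's is org.apache.spark.repl.Main.interp) rather
than through `this`:
```
// Minimal sketch of the Spark REPL pattern (hypothetical names). The loop
// registers itself on a globally reachable object, and the interpreted
// line goes through that stable path instead of `this`.
object Main {                     // stands in for org.apache.spark.repl.Main
  var interp: MyILoop = _         // set before any line is interpreted
}

class MyILoop {                   // stands in for SparkILoop / MahoutSparkILoop
  def createContext(): String =   // stands in for createSparkContext()
    "the one shared context"

  // stands in for the real interpret(); here it just echoes the line
  def interpret(line: String): Unit =
    println(s"interpreting: $line")

  def initialize(): Unit = {
    Main.interp = this
    // the line compiles inside the REPL wrappers, but Main.interp is a
    // fully-qualified path, so it resolves regardless of the wrapping
    interpret("@transient val sdc = Main.interp.createContext()")
  }
}
```
Running `(new MyILoop).initialize()` prints the line that would be handed to
the interpreter; in the real REPL that line compiles inside the generated
wrappers, but `Main.interp` still resolves because it is a top-level,
fully-qualified path, which sidesteps the wrapper-object problem.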
Let me know if you think there's a better way around it.
Oops, this should have been in a line note.
> Spark 1.3
> ---------
>
> Key: MAHOUT-1653
> URL: https://issues.apache.org/jira/browse/MAHOUT-1653
> Project: Mahout
> Issue Type: Dependency upgrade
> Reporter: Andrew Musselman
> Assignee: Andrew Palumbo
> Fix For: 0.11.0
>
>
> Support Spark 1.3
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)