[ https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14616941#comment-14616941 ]

ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------

Github user andrewpalumbo commented on the pull request:

    https://github.com/apache/mahout/pull/146#issuecomment-119261749
  
    Ok, I see what you are saying. I don't think that we can access that
    field from the interpreter, though. I.e.:
    ```
    val sdc = mahoutSparkContext(...)  // or something like this...
    _interp.interpret("@transient implicit val sdc = this.sdc")
    ``` 
    will give an error like:
    ```
    error: value sdc is not a member of iwC$iwC$iwC$iwC.....
    ```
    I believe that's because the interpreter converts the line into a wrapper
    object, which is not of type MahoutSparkILoop.
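
    For context, here is a tiny standalone demo of that wrapping behaviour (a
    sketch only; `ReplWrapperDemo` and `Probe` are throwaway names, and the
    exact wrapper names vary by Scala version):
    ```
    import scala.tools.nsc.Settings
    import scala.tools.nsc.interpreter.IMain

    object ReplWrapperDemo extends App {
      val settings = new Settings
      settings.usejavacp.value = true
      val repl = new IMain(settings)

      // Each interpreted line is compiled into nested wrapper objects
      // (e.g. $line1.$read$$iwC$$iwC...), so `this` inside the line refers to
      // the innermost wrapper, not to whatever object drives the interpreter.
      repl.interpret("class Probe { override def toString = getClass.getName }")
      repl.interpret("println(new Probe)")  // prints an iwC-prefixed class name
    }
    ```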
    
    What we are currently doing is actually following the Spark REPL blueprint:
    
    // Create a SparkContext in SparkILoop:
    https://github.com/apache/spark/blob/branch-1.3/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1005

    // Create another SparkContext in the interpreter:
    https://github.com/apache/spark/blob/branch-1.3/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoopInit.scala#L121
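
    Roughly, that blueprint adapted to Mahout would look something like the
    sketch below (just a sketch: `createMahoutSparkContext`, `initMahout`, the
    companion-object handle and the `mahoutSparkContext(...)` arguments are
    illustrative assumptions, not the actual PR code):
    ```
    package org.apache.mahout.sparkbindings.shell

    import org.apache.mahout.sparkbindings._
    import org.apache.spark.repl.SparkILoop

    object MahoutSparkILoop {
      // The running loop registers itself here so interpreted code can reach
      // it statically, the way org.apache.spark.repl.Main.interp is used.
      @volatile var interp: MahoutSparkILoop = _
    }

    class MahoutSparkILoop extends SparkILoop {

      // Step 1: create the Mahout context on the ILoop instance
      // (the analogue of SparkILoop.createSparkContext()).
      def createMahoutSparkContext(): SparkDistributedContext =
        mahoutSparkContext("local[*]", "mahout-shell")

      // Step 2: bind it inside the interpreter (the analogue of
      // SparkILoopInit.initializeSpark()). The interpreted line reaches the
      // loop through the statically visible companion object rather than
      // `this`, so the iwC wrappers never get involved.
      def initMahout(): Unit = {
        MahoutSparkILoop.interp = this  // register before the line is interpreted
        intp.beQuietDuring {
          command(
            """@transient implicit val sdc =
              |  org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.interp.createMahoutSparkContext()
              |""".stripMargin)
        }
      }
    }
    ```
    The key point is the same as upstream: the context is created on the loop
    and then pulled into the interpreter through a static path, never through
    `this`.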
    
    Let me know if you think there's a better way around it.

> Spark 1.3
> ---------
>
>                 Key: MAHOUT-1653
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1653
>             Project: Mahout
>          Issue Type: Dependency upgrade
>            Reporter: Andrew Musselman
>            Assignee: Andrew Palumbo
>             Fix For: 0.11.0
>
>
> Support Spark 1.3



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
