[ https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619311#comment-14619311 ]

ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------

Github user andrewpalumbo commented on a diff in the pull request:

    https://github.com/apache/mahout/pull/146#discussion_r34195169
  
    --- Diff: spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala ---
    @@ -48,55 +83,87 @@ class MahoutSparkILoop extends SparkILoop {
     
         conf.set("spark.executor.memory", "1g")
     
    -    sparkContext = mahoutSparkContext(
    +    sdc = mahoutSparkContext(
           masterUrl = master,
           appName = "Mahout Spark Shell",
           customJars = jars,
           sparkConf = conf
         )
     
    -    echo("Created spark context..")
    +    _interp.sparkContext = sdc
    +
    +    echoToShell("Created spark context..")
         sparkContext
       }
     
    +  // this is technically not part of Spark's explicitly defined Developer API though
    +  // nothing in the SparkILoopInit.scala file is marked as such.
       override def initializeSpark() {
    -    intp.beQuietDuring {
    -      command("""
     
    -         @transient implicit val sdc: org.apache.mahout.math.drm.DistributedContext =
    -            new org.apache.mahout.sparkbindings.SparkDistributedContext(
    -            org.apache.spark.repl.Main.interp.createSparkContext())
    +    _interp.beQuietDuring {
    +
    +      // get the spark context, at the same time create and store a mahout distributed context.
    +      _interp.interpret("""
    +         @transient val sc = {
    --- End diff --
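
    A minimal, hedged sketch of what the new code wires up, using only the
    mahoutSparkContext parameters visible in the diff (the master URL and the
    customJars value below are hypothetical):

        import org.apache.mahout.math.drm.DistributedContext
        import org.apache.mahout.sparkbindings._
        import org.apache.spark.SparkConf

        // mahoutSparkContext returns a SparkDistributedContext, which already
        // wraps the SparkContext it creates, so constructing a second
        // SparkDistributedContext around the REPL's context (as the removed
        // lines did) duplicated that wrapper.
        implicit val sdc: DistributedContext = mahoutSparkContext(
          masterUrl = "local[2]",              // hypothetical master
          appName = "Mahout Spark Shell",
          customJars = Nil,                    // no extra jars in this sketch
          sparkConf = new SparkConf().set("spark.executor.memory", "1g")
        )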
    
    @dlyubimov I've made the necessary changes to clean up the redundant creation of a SparkDistributedContext. Thanks a lot for the input.
    
    You mentioned that we could declare the SparkContext as `implicit val sc = ...`. Is there a reason that it should be implicit?
    
    I've left it as `val sc = ...` for now since that is the way Spark declares it in this method. Thx.
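
    On the implicit question: Mahout's DRM DSL resolves the distributed context
    through an implicit parameter, which is why the removed `sdc` declaration was
    marked implicit; `sc` itself need not be. A hedged sketch, assuming the
    math-scala `drmParallelize` signature of this release line:

        import org.apache.mahout.math.scalabindings._
        import org.apache.mahout.math.drm._

        // drmParallelize declares (implicit dc: DistributedContext), so with an
        // implicit Mahout context in scope the shell user can simply write:
        val drmA = drmParallelize(dense((1, 2), (3, 4)))
        // instead of passing the context explicitly:
        // val drmA = drmParallelize(dense((1, 2), (3, 4)))(sdc)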


> Spark 1.3
> ---------
>
>                 Key: MAHOUT-1653
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1653
>             Project: Mahout
>          Issue Type: Dependency upgrade
>            Reporter: Andrew Musselman
>            Assignee: Andrew Palumbo
>             Fix For: 0.11.0
>
>
> Support Spark 1.3



