[ https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14606330#comment-14606330 ]

ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------

Github user pferrel commented on a diff in the pull request:

    https://github.com/apache/mahout/pull/136#discussion_r33508872
  
    --- Diff: spark/src/main/scala/org/apache/mahout/sparkbindings/drm/CheckpointedDrmSpark.scala ---
    @@ -165,7 +168,14 @@ class CheckpointedDrmSpark[K: ClassTag](
           else if (classOf[Writable].isAssignableFrom(ktag.runtimeClass)) (x: K) => x.asInstanceOf[Writable]
           else throw new IllegalArgumentException("Do not know how to convert class tag %s to Writable.".format(ktag))
     
    -    rdd.saveAsSequenceFile(path)
    --- End diff --
    
    So are you suggesting that we deprecate dfsWrite? Meaning leave it as is with a deprecated Spark function for now and replace it wherever it's called? Then when Spark removes the deprecated function we remove dfsWrite?
    
    Can we handle your code with an implicit conversion that will get called before dfsWrite?
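    For reference, the "implicit conversion that gets called before dfsWrite" idea follows Scala's standard enrichment pattern. Below is a minimal, Spark-free sketch of that pattern; `FakeRdd`, `saveConverted`, and the key-conversion function are illustrative stand-ins, not Mahout or Spark APIs.

    ```scala
    object ImplicitConversionSketch {
      // Stand-in for an RDD-like collection of keyed records (hypothetical).
      final case class FakeRdd[K](records: Seq[(K, String)])

      // Enrichment class: when saveConverted is called on a FakeRdd, the
      // compiler inserts this wrapper implicitly, so the key conversion
      // runs before the underlying "write" - analogous to converting keys
      // to Writable before delegating to a save routine.
      implicit class SaveOps[K](private val rdd: FakeRdd[K]) extends AnyVal {
        def saveConverted(convert: K => String): Seq[(String, String)] =
          rdd.records.map { case (k, v) => (convert(k), v) }
      }

      def main(args: Array[String]): Unit = {
        val rdd = FakeRdd(Seq((1, "a"), (2, "b")))
        // The implicit wrapper is applied here without any explicit call.
        println(rdd.saveConverted(k => s"key-$k"))
      }
    }
    ```

    The point of the pattern is that callers keep writing the same method call on the original type; only the in-scope implicit decides what conversion happens first.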


> Spark 1.3
> ---------
>
>                 Key: MAHOUT-1653
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1653
>             Project: Mahout
>          Issue Type: Dependency upgrade
>            Reporter: Andrew Musselman
>            Assignee: Andrew Palumbo
>             Fix For: 0.11.0
>
>
> Support Spark 1.3



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
