Marcelo Vanzin created SPARK-6399:
-------------------------------------

             Summary: Code compiled against 1.3.0 may not run against older Spark versions
                 Key: SPARK-6399
                 URL: https://issues.apache.org/jira/browse/SPARK-6399
             Project: Spark
          Issue Type: Bug
          Components: Documentation, Spark Core
    Affects Versions: 1.3.0
            Reporter: Marcelo Vanzin


Commit 65b987c3 re-organized the implicit RDD conversions to make them easier 
to use. The problem is that, when those conversions are used, scalac now 
generates code that will not run on older Spark versions.

Basically, even if you explicitly import {{SparkContext._}}, scalac resolves 
the conversions to the new methods on the {{RDD}} companion object instead, so 
the compiled code references methods that don't exist in older versions of Spark.
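
A minimal sketch of the failure mode (the app setup and data are illustrative; 
{{rddToPairRDDFunctions}} is one of the conversions that moved to the {{RDD}} 
object):

{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._  // old-style import; still compiles under 1.3.0

object Repro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("repro").setMaster("local"))

    // reduceByKey is only reachable through the rddToPairRDDFunctions
    // implicit conversion. Compiled against 1.3.0, this call site refers to
    // the conversion on the RDD companion object; running the same jar on
    // 1.2.x then fails with NoSuchMethodError, since that object method
    // does not exist there.
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
{code}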

You can work around this by calling the conversion methods on the 
{{SparkContext}} object explicitly, although that's a little ugly.
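
For example, a sketch of that workaround (same illustrative data; note the 
{{SparkContext}} conversions are deprecated in 1.3.0, so expect a deprecation 
warning when compiling against it):

{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

val sc = new SparkContext(new SparkConf().setAppName("workaround").setMaster("local"))
val pairs: RDD[(String, Int)] = sc.parallelize(Seq("a", "b", "a")).map(word => (word, 1))

// Calling the conversion explicitly pins the compiled reference to
// SparkContext.rddToPairRDDFunctions, which also exists in pre-1.3 releases.
val counts = SparkContext.rddToPairRDDFunctions(pairs).reduceByKey(_ + _)
counts.collect().foreach(println)
sc.stop()
{code}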

We should at least document this limitation (if there's no way to fix it), 
since I believe forward compatibility of the API was also a goal.


