To be clear, is it 'compiled' against 1.0.2, or is it packaged with it?
On Thu, Aug 14, 2014 at 6:39 PM, Mingyu Kim <m...@palantir.com> wrote:
> I ran a really simple code that runs with the Spark 1.0.2 jar and connects
> to a Spark 1.0.1 cluster, but it fails with java.io.InvalidClassException.
> I filed the bug at https://issues.apache.org/jira/browse/SPARK-3050.
>
> I assumed the minor and patch releases shouldn't break compatibility. Is
> that correct?
>
> Thanks,
> Mingyu
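For context, the `java.io.InvalidClassException` reported here is the standard JVM failure when the `serialVersionUID` of a class in the serialized stream does not match the `serialVersionUID` of the class loaded on the receiving side — exactly what happens when the driver and cluster run different Spark builds of the same class. The sketch below (a hypothetical `Payload` class, not Spark code) reproduces the exception in a single process by corrupting the `serialVersionUID` embedded in a serialized stream before deserializing it:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SuidMismatchDemo {
    // Hypothetical payload class with an explicit serialVersionUID,
    // standing in for a class whose definition differs across versions.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        int value = 42;
    }

    // Returns true if deserializing a stream whose serialVersionUID differs
    // from the local class throws InvalidClassException.
    static boolean demonstratesMismatch() throws Exception {
        // Serialize with the "sender's" class definition.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Payload());
        }
        byte[] bytes = bos.toByteArray();

        // Java serialization stream layout: 2-byte magic + 2-byte version,
        // then TC_OBJECT (0x73), TC_CLASSDESC (0x72), a 2-byte class-name
        // length, the name bytes, and finally the 8-byte serialVersionUID.
        int nameLen = ((bytes[6] & 0xFF) << 8) | (bytes[7] & 0xFF);
        int suidOffset = 8 + nameLen; // first byte of the 8-byte SUID
        // Simulate the receiver having a different class version by
        // flipping the low bit of the stream's serialVersionUID.
        bytes[suidOffset + 7] ^= 0x01;

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            ois.readObject();
            return false; // unexpectedly deserialized fine
        } catch (InvalidClassException e) {
            // e.g. "local class incompatible: stream classdesc serialVersionUID = ..."
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("mismatch detected: " + demonstratesMismatch());
    }
}
```

Classes without an explicit `serialVersionUID` get one computed from the class layout, so even an added private field between patch releases can change it and break wire compatibility this way.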