Is this a standalone mode cluster? We don't currently make this guarantee, 
though it will likely work across 1.0.0 to 1.0.2. The problem is that in 
standalone mode, the executors run the version of Spark that is installed on 
the cluster, while your driver might be built against a different version. On 
YARN and Mesos, you can more easily mix different versions of Spark, since 
each application ships its own Spark JAR (or references one from a URL), and 
that JAR is used for both the driver and executors.
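For what it's worth, the YARN approach described above can be sketched roughly as follows. This is an illustrative example only: the paths, class name, and the `spark.yarn.jar` setting are assumptions for the sketch, not details from this thread.

```shell
# Illustrative sketch (assumed paths/names): on YARN, point the application
# at a specific Spark assembly so the driver and executors both use that
# version, regardless of what is installed on the cluster nodes.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.jar=hdfs:///user/me/spark-assembly-1.0.2.jar \
  --class com.example.MyApp \
  my-app.jar
```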

Matei

On August 26, 2014 at 6:10:57 PM, Victor Tso-Guillen (v...@paxata.com) wrote:

I wanted to make sure that there's full compatibility between minor releases. I 
have a project that depends on spark-core so that it can act as a driver 
program and be tested locally. However, when connecting to a cluster, you 
don't necessarily know what version you're connecting to. Is a 1.0.0 cluster 
binary compatible with a 1.0.2 driver program? Is a 1.0.0 driver program binary 
compatible with a 1.0.2 cluster?
