Ah, thanks.
On Tue, Aug 26, 2014 at 7:32 PM, Nan Zhu zhunanmcg...@gmail.com wrote:
Hi, Victor,
the issue with having different versions in the driver and on the cluster is
that the master will shut down your application due to an inconsistent
serialVersionUID in ExecutorState
Best,
--
Nan
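Nan's point can be illustrated with a small plain-Java sketch (no Spark required). Java serialization embeds an 8-byte serialVersionUID in the stream; a reader whose local class carries a different UID rejects the object with InvalidClassException, which is the failure mode the master hits on ExecutorState. The `Payload` class, the byte offsets, and the tampering trick below are purely illustrative, not Spark code:

```java
import java.io.*;

public class Demo {
    // A serializable message class, standing in for something like ExecutorState.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        int state = 42;
    }

    // Serialize a Payload, corrupt the 8-byte serialVersionUID in the byte
    // stream (simulating a peer built against a different version of the
    // class), then attempt to deserialize it.
    static boolean mismatchDetected() throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Payload());
        }
        byte[] bytes = bos.toByteArray();
        // Stream layout: 4-byte magic/version, TC_OBJECT, TC_CLASSDESC,
        // 2-byte class-name length, class name, then the 8-byte UID.
        int nameLen = ((bytes[6] & 0xFF) << 8) | (bytes[7] & 0xFF);
        int uidOffset = 8 + nameLen;
        bytes[uidOffset + 7] ^= 0x01;  // flip the low bit of the UID
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            ois.readObject();
            return false;              // unexpectedly still compatible
        } catch (InvalidClassException e) {
            return true;               // "local class incompatible: ...serialVersionUID..."
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("mismatch detected: " + mismatchDetected());
    }
}
```

Pinning an explicit serialVersionUID (as above) keeps a class wire-compatible across benign changes, but only if both sides agree on the value; when they disagree, deserialization fails fast exactly as shown.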
I wanted to make sure that there's full compatibility between minor
releases. I have a project that has a dependency on spark-core so that it
can be a driver program and that I can test locally. However, when
connecting to a cluster you don't necessarily know what version you're
connecting to.
Is this a standalone mode cluster? We don't currently make this guarantee,
though it will likely work in 1.0.0 to 1.0.2. The problem though is that the
standalone mode grabs the executors' version of Spark code from what's
installed on the cluster, while your driver might be built against a different
version.
Yes, we are standalone right now. Do you have any literature on why one would
want to consider Mesos or YARN for Spark deployments?
Sounds like I should try upgrading my project and seeing if everything
compiles without modification. Then I can connect to an existing 1.0.0
cluster and see what happens.
Things will definitely compile, and apps compiled on 1.0.0 should even be able
to link against 1.0.2 without recompiling. The only problem is if you run your
driver with 1.0.0 on its classpath, but the cluster has 1.0.2 in executors.
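Since the mismatch only surfaces at connect time, one pragmatic guard is for the driver to compare the version it was built against with the cluster's version at startup and fail fast. A hedged sketch in plain Java (the version strings are illustrative; in a real driver the cluster-side string would come from the running Spark deployment, and as Matei notes even patch-level differences can bite in standalone mode, so a stricter check might require exact equality):

```java
// Sketch: fail fast when driver and cluster Spark versions are not on the
// same major.minor line. Purely illustrative; version strings are examples.
public class VersionGuard {
    static boolean sameMinorLine(String a, String b) {
        String[] pa = a.split("\\.");
        String[] pb = b.split("\\.");
        // Compare only the major and minor components (e.g. "1.0").
        return pa.length >= 2 && pb.length >= 2
            && pa[0].equals(pb[0]) && pa[1].equals(pb[1]);
    }

    public static void main(String[] args) {
        System.out.println(sameMinorLine("1.0.0", "1.0.2")); // same 1.0.x line
        System.out.println(sameMinorLine("1.0.0", "1.1.0")); // different line
    }
}
```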
For Mesos and YARN vs standalone, the difference is that they
On Tuesday, August 26, 2014 at 10:10 PM, Matei Zaharia wrote: