This should be okay, but make sure that your cluster also has the right code 
deployed; it may still be running the old 1.0 build.
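
A quick way to double-check which build a deployment is actually running is to 
open a spark-shell against it and print the version (sc is the SparkContext the 
shell creates for you; sc.version should be available in 1.0.x releases):

    // Prints the version of the Spark build the shell is running;
    // after the upgrade this should print 1.0.1.
    println(sc.version)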

If you built Spark from source multiple times, you may also want to try sbt 
clean before sbt assembly.
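
For example, from SPARK_HOME:

    sbt clean assembly

The clean step deletes the previous build output, so nothing left over from the 
earlier build can end up in the new assembly jars.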

Matei

On August 1, 2014 at 12:00:07 PM, SK (skrishna...@gmail.com) wrote:


Hi, 

I upgraded from 1.0 to 1.0.1 a couple of weeks ago and have been able to use 
some of the features advertised in 1.0.1. However, in some cases I still get 
compilation errors that, according to responses from other users, were fixed in 
1.0.1, so I should not be seeing them. I want to make sure I followed the 
correct upgrade process, outlined below (I am running Spark on a single machine 
in standalone mode, so there is no cluster deployment): 

- set SPARK_HOME to point to the new version 

- run "sbt assembly" in SPARK_HOME to build the new Spark jars 

- in the project's sbt build file, point the libraryDependencies for spark-core 
and the other Spark libraries at the 1.0.1 version, and run "sbt assembly" to 
build the project jar (see the sketch after this list). 
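
For concreteness, the relevant part of such a build.sbt might look roughly like 
the sketch below. The project name, version, and exact list of Spark modules 
are placeholders, marking the Spark dependencies as "provided" is only a 
suggestion to keep them out of the project's assembly jar, and the sbt-assembly 
plugin is assumed to be configured already:

    // Spark 1.0.x is built against Scala 2.10.
    name := "my-spark-app"

    version := "0.1.0"

    scalaVersion := "2.10.4"

    // Point every Spark module the project uses at the 1.0.1 release.
    // "provided" keeps Spark itself out of the project's assembly jar, so the
    // jars built with "sbt assembly" in SPARK_HOME are the ones used at runtime.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.0.1" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.0.1" % "provided"
    )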

Is there anything else I need to do to ensure that no old jars are being 
used? For example, do I need to manually delete any old jars? 

thanks 



-- 
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/correct-upgrade-process-tp11194.html
 
Sent from the Apache Spark User List mailing list archive at Nabble.com. 
