Make sure you use the EC2 script that ships with Spark 0.8 instead of the one 
that ships with 0.7. Each script defaults to launching its own version of Spark.

Matei

On Nov 26, 2013, at 1:27 PM, Walrus theCat <[email protected]> wrote:

> Hey Ashish,
> 
> Thanks a lot for your update.  I am probably using the old script, and was 
> unaware of a version option.  Much props.
> 
> 
> On Mon, Nov 25, 2013 at 9:41 PM, Ashish Rangole <[email protected]> wrote:
> Hi Walrus theCat,
> 
> We have been successfully using Spark 0.8 on EC2 ever since it was released 
> and we do this
> several times a day.
> 
> We use spark-ec2.py with the new version option (--spark-version=0.8.0) to 
> spin up the Spark 0.8 cluster on EC2.
> The key is to use the new spark-ec2.py and not the old one.
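> For reference, a launch command along these lines (the key pair, identity 
> file, slave count, and cluster name below are placeholders, not values from 
> this thread):

```shell
# Sketch: launch a Spark 0.8.0 cluster using the spark-ec2 script that ships
# with the 0.8 release. my-keypair, my-key.pem, and my-cluster are placeholders.
./spark-ec2 -k my-keypair -i my-key.pem -s 2 \
    --spark-version=0.8.0 launch my-cluster
```

> (This assumes AWS credentials are set in the environment, as the script 
> requires.)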
> 
> The only change we had to make was to update our imports to point to the new 
> Apache
> package names, again as indicated in the Spark 0.8 release notes.
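> As a sketch, the import change looks like this for SparkContext (the exact 
> set of imports depends on your application; Spark 0.8 moved everything under 
> the org.apache.spark namespace):

```scala
// Spark 0.7.x package names:
// import spark.SparkContext
// import spark.SparkContext._

// Spark 0.8.x equivalents under the new Apache namespace:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
```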
> 
> What exactly are the errors you are seeing?
> 
> 
> On Mon, Nov 25, 2013 at 9:52 PM, Walrus theCat <[email protected]> wrote:
> Thanks Paco, but I have no problems running my application on EC2 when it's 
> using Spark 0.7.3.  What I'm looking for is a way to use Spark 0.8 on EC2.
> 
> Cheers,
> 
> 
> On Mon, Nov 25, 2013 at 5:01 PM, Paco Nathan <[email protected]> wrote:
> Not the answer to your specific question (an official solution), but 
> https://elastic.mesosphere.io/ is a three-step service for running Apache Mesos 
> atop EC2. There is also a quick tutorial on running Apache Spark on that 
> cluster: http://mesosphere.io/learn/run-spark-on-mesos/
> 
> 
> On Mon, Nov 25, 2013 at 4:57 PM, Walrus theCat <[email protected]> wrote:
> Hi,
> 
> I just updated my imports and tried to run my app using Spark 0.8, but it 
> breaks.  The AMI's spark-shell says it's 0.7.3 or thereabouts, which is what 
> my app previously used.  What is the official, step-by-step solution to using 
> Spark 0.8 on EC2?
> 
> Thanks
> 
