Hi,

With some minor changes to spark-ec2/spark/init.sh and a small
"upgrade-spark.sh" script of your own, you can upgrade Spark in place.

(Make sure to call not only spark/init.sh but also spark/setup.sh, because
the latter uses copy-dir to push the new version of Spark out to the slaves.)

I wrote one so we could upgrade to a specific version of Spark (via commit
hash) and used it to upgrade from 1.4.1 to 1.5.0.
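
For reference, here is a rough sketch of what such an upgrade-spark.sh might
look like, assuming the standard spark-ec2 layout (the deploy scripts in
/root/spark-ec2 on the master, Spark itself in /root/spark); how the target
version or commit hash actually gets passed to spark/init.sh depends on the
minor changes you make there, so treat the SPARK_VERSION variable below as an
assumption:

    #!/bin/bash
    # Hypothetical upgrade-spark.sh; run on the master node as root.
    # Assumes the usual spark-ec2 layout: deploy scripts in /root/spark-ec2,
    # Spark installed in /root/spark.

    # Version (or commit hash) to upgrade to. Here we assume your modified
    # spark/init.sh reads it from the SPARK_VERSION environment variable.
    export SPARK_VERSION="1.5.2"

    # Stop the running standalone cluster before swapping the binaries.
    /root/spark/sbin/stop-all.sh

    cd /root/spark-ec2

    # Fetch/build the new version into /root/spark on the master ...
    spark/init.sh

    # ... then run setup.sh, which uses copy-dir to rsync /root/spark out
    # to all the slaves and redoes the per-node configuration.
    spark/setup.sh

    # Bring the cluster back up on the new version.
    /root/spark/sbin/start-all.sh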

best,
Jason


On Thu, Nov 12, 2015 at 9:49 AM, Nicholas Chammas <
nicholas.cham...@gmail.com> wrote:

> spark-ec2 does not offer a way to upgrade an existing cluster, and from
> what I gather, it wasn't intended to be used to manage long-lasting
> infrastructure. The recommended approach really is to just destroy your
> existing cluster and launch a new one with the desired configuration.
>
> If you want to upgrade the cluster in place, you'll probably have to do
> that manually. Otherwise, perhaps spark-ec2 is not the right tool, and
> instead you want one of those "grown-up" management tools like Ansible
> which can be set up to allow in-place upgrades. That'll take a bit of work,
> though.
>
> Nick
>
> On Wed, Nov 11, 2015 at 6:01 PM Augustus Hong <augus...@branchmetrics.io>
> wrote:
>
>> Hey All,
>>
>> I have a Spark cluster (running version 1.5.0) on EC2 launched with the
>> provided spark-ec2 scripts. If I want to upgrade Spark to 1.5.2 in the same
>> cluster, what's the safest / recommended way to do that?
>>
>>
>> I know I can spin up a new cluster running 1.5.2, but it doesn't seem
>> efficient to spin up a new cluster every time we need to upgrade.
>>
>>
>> Thanks,
>> Augustus
>>
>> --
>> Augustus Hong
>>  Data Analytics | Branch Metrics
>>  m 650-391-3369 | e augus...@branch.io
>>
>
