sbt assembly/assembly

or, if you are developing Spark itself (i.e. working on the Spark code base
rather than rolling a distribution):
sbt assemble-deps
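
Both are run from the root of your Spark checkout, for example (a rough
sketch; substitute sbt/sbt if you use the launcher script bundled with the
repo):

  cd spark
  # builds the full assembly jar (Spark plus all of its dependencies)
  sbt assembly/assembly
  # packages only the dependencies; faster rebuilds while hacking on Spark
  sbt assemble-deps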

Also, there is a convenience script called make-distribution.sh.
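
For example (a sketch; the flags vary between Spark versions, so check the
usage header at the top of the script, and treat the Hadoop version below
as a placeholder):

  ./make-distribution.sh --hadoop 1.0.4 --tgz

This produces a stripped-down dist/ directory (plus a tarball with --tgz)
that you can copy out to a cluster without the sbt build machinery.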

HTH


On Sun, Jan 5, 2014 at 7:52 PM, Aureliano Buendia <[email protected]> wrote:

>
>
>
> On Sun, Jan 5, 2014 at 5:42 AM, Patrick Wendell <[email protected]> wrote:
>
>> I usually just use the existing launch scripts to create a correctly
>> sized cluster then just:
>>
>> 1. Copy spark/conf/* to /tmp
>> 2. rm -rf spark/
>> 3. Check out and build the new Spark from GitHub (you'll need to rename
>> incubator-spark to spark)
>>
>
> I called 'sbt assembly' at this point, and it took ages to build the
> examples. What would be the minimal build command, just to build enough
> stuff for production?
>
>
>
>> 4. Copy conf files back
>> 5. /root/spark-ec2/copy-dir --delete /root/spark
>>
>>
>> - Patrick
>>
>> On Sat, Jan 4, 2014 at 9:14 PM, Aureliano Buendia <[email protected]>
>> wrote:
>> > So I should just launch an AMI from one of
>> > https://github.com/mesos/spark-ec2/tree/v2/ami-list and build the
>> > development version on it? Is it just a simple matter of copying the
>> > target to a directory, or does it need a lot of configuration?
>> >
>> >
>> >
>> > On Sun, Jan 5, 2014 at 5:03 AM, Patrick Wendell <[email protected]>
>> wrote:
>> >>
>> >> I meant you'll need to build your own version of Spark. Typically we
>> >> do this by launching an existing AMI and then just building a new
>> >> version of Spark and copying it to the slaves...
>> >>
>> >> - Patrick
>> >>
>> >> On Sat, Jan 4, 2014 at 8:44 PM, Patrick Wendell <[email protected]>
>> >> wrote:
>> >> > You'll have to build your own. Also there are some packaging
>> >> > differences in master (some bin/ scripts moved to sbin/) just to give
>> >> > you a heads up.
>> >> >
>> >> > On Sat, Jan 4, 2014 at 8:14 PM, Aureliano Buendia <[email protected]>
>> >> > wrote:
>> >> >> Good to know the next release will be on Scala 2.10.
>> >> >>
>> >> >> Meanwhile, does using the master branch mean that I will have to
>> >> >> build my own AMI when launching an EC2 cluster? Also, is there a
>> >> >> nightly binary build Maven repo available for Spark?
>> >> >>
>> >> >>
>> >> >> On Sun, Jan 5, 2014 at 3:56 AM, Aaron Davidson <[email protected]>
>> >> >> wrote:
>> >> >>>
>> >> >>> Scala 2.10.3 support was recently merged into master (#259). The
>> >> >>> branch is probably not as stable as 0.8.1, but things "should" work.
>> >> >>>
>> >> >>> The 2.10 branch should be deleted; the only issue is that there are
>> >> >>> some outstanding PRs against that branch that haven't been moved to
>> >> >>> master.
>> >> >>>
>> >> >>>
>> >> >>> On Sat, Jan 4, 2014 at 7:11 PM, Aureliano Buendia
>> >> >>> <[email protected]>
>> >> >>> wrote:
>> >> >>>>
>> >> >>>> Hi,
>> >> >>>>
>> >> >>>> I was going to give https://github.com/scala/pickling a try on
>> >> >>>> Spark to see how it would compare with Kryo. Unfortunately, it
>> >> >>>> only works with Scala 2.10.3.
>> >> >>>>
>> >> >>>> - Is there a timeline for Spark to work with Scala 2.10?
>> >> >>>>
>> >> >>>> - Is the 2.10 branch as stable as 2.9?
>> >> >>>>
>> >> >>>> - What's blocking Spark from working with 2.10?
>> >> >>>
>> >> >>>
>> >> >>
>> >
>> >
>>
>
>


-- 
Prashant
