+1. make-distribution works, and I also tested simple Spark jobs on Spark
on Mesos on an 8-node Mesos cluster.
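
For reference, a minimal sketch of the kind of smoke-test job run here (the
Mesos master URL and the job itself are illustrative assumptions, not the
exact commands from this test):

  import org.apache.spark.{SparkConf, SparkContext}

  object SmokeTest {
    def main(args: Array[String]): Unit = {
      // Placeholder Mesos master URL; point this at the real cluster.
      val conf = new SparkConf()
        .setAppName("spark-1.1.0-rc2-smoke-test")
        .setMaster("mesos://zk://zk-host:2181/mesos")
      val sc = new SparkContext(conf)
      // Trivial parallelize/map/reduce job to confirm tasks run on the cluster.
      val sum = sc.parallelize(1 to 1000000, 64).map(_.toLong * 2).reduce(_ + _)
      println("sum = " + sum)
      sc.stop()
    }
  }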

Tim

On Thu, Aug 28, 2014 at 8:53 PM, Burak Yavuz <bya...@stanford.edu> wrote:
> +1. Tested MLlib algorithms on Amazon EC2; the algorithms show speed-ups of
> 1.5-5x compared to the 1.0.2 release.
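>
> As a rough illustration of the shape of such a benchmark (the data set, k,
> and iteration count below are made-up placeholders, not the actual test
> setup):
>
>   import org.apache.spark.{SparkConf, SparkContext}
>   import org.apache.spark.mllib.clustering.KMeans
>   import org.apache.spark.mllib.linalg.Vectors
>
>   // Hypothetical timing harness: train k-means on random dense vectors.
>   val conf = new SparkConf().setAppName("mllib-1.1.0-rc2-bench")
>   val sc = new SparkContext(conf)
>   val data = sc.parallelize(0 until 100000, 8)
>     .map(_ => Vectors.dense(Array.fill(20)(scala.util.Random.nextDouble())))
>     .cache()
>   val start = System.nanoTime()
>   val model = KMeans.train(data, 10, 20)  // k = 10, maxIterations = 20
>   println("k-means took " + ((System.nanoTime() - start) / 1e9) + " s")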
>
> ----- Original Message -----
> From: "Patrick Wendell" <pwend...@gmail.com>
> To: dev@spark.apache.org
> Sent: Thursday, August 28, 2014 8:32:11 PM
> Subject: Re: [VOTE] Release Apache Spark 1.1.0 (RC2)
>
> I'll kick off the vote with a +1.
>
> On Thu, Aug 28, 2014 at 7:14 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>> Please vote on releasing the following candidate as Apache Spark version 
>> 1.1.0!
>>
>> The tag to be voted on is v1.1.0-rc2 (commit 711aebb3):
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=711aebb329ca28046396af1e34395a0df92b5327
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-1.1.0-rc2/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1029/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-1.1.0-rc2-docs/
>>
>> Please vote on releasing this package as Apache Spark 1.1.0!
>>
>> The vote is open until Monday, September 01, at 03:11 UTC and passes if
>> a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 1.1.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>>
>> == Regressions fixed since RC1 ==
>> LZ4 compression issue: https://issues.apache.org/jira/browse/SPARK-3277
>>
>> == What justifies a -1 vote for this release? ==
>> This vote is happening very late into the QA period compared with
>> previous votes, so -1 votes should only occur for significant
>> regressions from 1.0.2. Bugs already present in 1.0.X will not block
>> this release.
>>
>> == What default changes should I be aware of? ==
>> 1. The default value of "spark.io.compression.codec" is now "snappy"
>> --> Old behavior can be restored by switching to "lzf"
>>
>> 2. PySpark now performs external spilling during aggregations.
>> --> Old behavior can be restored by setting "spark.shuffle.spill" to "false".
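>>
>> A minimal sketch of restoring the old defaults programmatically (the app
>> name is a placeholder; both properties can also be set in
>> conf/spark-defaults.conf):
>>
>>   import org.apache.spark.SparkConf
>>
>>   // Restore the pre-1.1.0 behavior described above.
>>   val conf = new SparkConf()
>>     .setAppName("legacy-defaults")             // placeholder app name
>>     .set("spark.io.compression.codec", "lzf")  // 1.1.0 default is "snappy"
>>     .set("spark.shuffle.spill", "false")       // re-disable external spilling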
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
