Okay, I'm cancelling this vote in favor of RC2.

On Thu, Aug 28, 2014 at 3:27 PM, Mridul Muralidharan <mri...@gmail.com> wrote:
> Thanks for being on top of this, Patrick! And apologies for not being able
> to help more.
>
> Regards,
> Mridul
>
> On Aug 29, 2014 1:30 AM, "Patrick Wendell" <pwend...@gmail.com> wrote:
>>
>> Mridul - thanks for sending this along and for the debugging comments
>> on the JIRA. I think we have a handle on the issue and we'll patch it
>> and spin a new RC. We can also update the test coverage to cover LZ4.
>>
>> - Patrick
>>
>> On Thu, Aug 28, 2014 at 9:27 AM, Mridul Muralidharan <mri...@gmail.com>
>> wrote:
>> > Is SPARK-3277 applicable to 1.1?
>> > If yes, until it is fixed, I am -1 on the release (I am on break, so
>> > can't verify or help fix, sorry).
>> >
>> > Regards
>> > Mridul
>> >
>> > On 28-Aug-2014 9:33 pm, "Patrick Wendell" <pwend...@gmail.com> wrote:
>> >>
>> >> Please vote on releasing the following candidate as Apache Spark
>> >> version
>> >> 1.1.0!
>> >>
>> >> The tag to be voted on is v1.1.0-rc1 (commit f0718324):
>> >>
>> >> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=f07183249b74dd857069028bf7d570b35f265585
>> >>
>> >> The release files, including signatures, digests, etc., can be found at:
>> >> http://people.apache.org/~pwendell/spark-1.1.0-rc1/
>> >>
>> >> Release artifacts are signed with the following key:
>> >> https://people.apache.org/keys/committer/pwendell.asc
>> >>
>> >> The staging repository for this release can be found at:
>> >> https://repository.apache.org/content/repositories/orgapachespark-1028/
>> >>
>> >> The documentation corresponding to this release can be found at:
>> >> http://people.apache.org/~pwendell/spark-1.1.0-rc1-docs/
>> >>
>> >> Please vote on releasing this package as Apache Spark 1.1.0!
>> >>
>> >> The vote is open until Sunday, August 31, at 17:00 UTC and passes if
>> >> a majority of at least 3 +1 PMC votes are cast.
>> >>
>> >> [ ] +1 Release this package as Apache Spark 1.1.0
>> >> [ ] -1 Do not release this package because ...
>> >>
>> >> To learn more about Apache Spark, please see
>> >> http://spark.apache.org/
>> >>
>> >> == What justifies a -1 vote for this release? ==
>> >> This vote is happening very late into the QA period compared with
>> >> previous votes, so -1 votes should only occur for significant
>> >> regressions from 1.0.2. Bugs already present in 1.0.X will not block
>> >> this release.
>> >>
>> >> == What default changes should I be aware of? ==
>> >> 1. The default value of "spark.io.compression.codec" is now "snappy"
>> >> --> Old behavior can be restored by switching to "lzf"
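>> >>
>> >> For illustration only (not part of the original announcement), a minimal
>> >> sketch of restoring the old codec from application code, assuming a
>> >> standard PySpark setup:
>> >>
>> >>     from pyspark import SparkConf, SparkContext
>> >>
>> >>     # Switch the I/O compression codec back to "lzf" (the pre-1.1
>> >>     # default); the new default in 1.1.0 is "snappy".
>> >>     conf = SparkConf().set("spark.io.compression.codec", "lzf")
>> >>     sc = SparkContext(conf=conf)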
>> >>
>> >> 2. PySpark now performs external spilling during aggregations.
>> >> --> Old behavior can be restored by setting "spark.shuffle.spill" to
>> >> "false".
>> >>
>> >> I'll send a bit more later today with feature information for the
>> >> release. In the meantime, I want to put this out there for
>> >> consideration.
>> >>
>> >> - Patrick
>> >>
>> >> ---------------------------------------------------------------------
>> >> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> >> For additional commands, e-mail: dev-h...@spark.apache.org
>> >>
>> >

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
