+1 from me as well.

On Tue, Apr 7, 2015 at 4:36 AM, Sean Owen <so...@cloudera.com> wrote:
> I think that's close enough for a +1:
>
> Signatures and hashes are good.
> LICENSE, NOTICE still check out.
> Compiles for a Hadoop 2.6 + YARN + Hive profile.
>
> JIRAs with target version = 1.2.x look legitimate; no blockers.
>
> I still observe several Hive test failures with:
>
>   mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 \
>     -Dhadoop.version=2.6.0 -DskipTests clean package
>   mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 \
>     -Dhadoop.version=2.6.0 test
>
> ... though again I think these are not regressions but known issues
> in older branches.
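>
> To chase one of these down in isolation, something like the following
> ought to work, assuming this branch wires up the scalatest plugin's
> wildcardSuites property the way later branches document (the suite
> name below is only an example):
>
>   mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 \
>     -Dhadoop.version=2.6.0 \
>     -DwildcardSuites=org.apache.spark.sql.hive.execution.HiveQuerySuite test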
>
> FYI there are 19 Critical issues still open for 1.2.x:
>
> (format: ID: summary [assignee, status, last updated])
>
> SPARK-6209: ExecutorClassLoader can leak connections after failing to
>   load classes from the REPL class server [Josh Rosen, In Progress, 4/5/15]
> SPARK-5098: Number of running tasks become negative after tasks lost
>   [unassigned, Open, 1/14/15]
> SPARK-4888: Spark EC2 doesn't mount local disks for i2.8xlarge
>   instances [unassigned, Open, 1/27/15]
> SPARK-4879: Missing output partitions after job completes with
>   speculative execution [Josh Rosen, Open, 3/5/15]
> SPARK-4568: Publish release candidates under $VERSION-RCX instead of
>   $VERSION [Patrick Wendell, Open, 11/24/14]
> SPARK-4520: SparkSQL exception when reading certain columns from a
>   parquet file [sadhan sood, Open, 1/21/15]
> SPARK-4514: SparkContext localProperties does not inherit property
>   updates across thread reuse [Josh Rosen, Open, 3/31/15]
> SPARK-4454: Race condition in DAGScheduler [Josh Rosen, Reopened, 2/18/15]
> SPARK-4452: Shuffle data structures can starve others on the same
>   thread for memory [Tianshuo Deng, Open, 1/24/15]
> SPARK-4356: Test Scala 2.11 on Jenkins [Patrick Wendell, Open, 11/12/14]
> SPARK-4258: NPE with new Parquet Filters [Cheng Lian, Reopened, 4/3/15]
> SPARK-4194: Exceptions thrown during SparkContext or SparkEnv
>   construction might lead to resource leaks or corrupted global state
>   [unassigned, In Progress, 4/2/15]
> SPARK-4159: Maven build doesn't run JUnit test suites [Sean Owen, Open, 1/11/15]
> SPARK-4106: Shuffle write and spill to disk metrics are incorrect
>   [unassigned, Open, 10/28/14]
> SPARK-3492: Clean up Yarn integration code [Andrew Or, Open, 9/12/14]
> SPARK-3461: Support external groupByKey using
>   repartitionAndSortWithinPartitions [Sandy Ryza, Open, 11/10/14]
> SPARK-2984: FileNotFoundException on _temporary directory
>   [unassigned, Open, 12/11/14]
> SPARK-2532: Fix issues with consolidated shuffle [unassigned, Open, 3/26/15]
> SPARK-1312: Batch should read based on the batch interval provided in
>   the StreamingContext [Tathagata Das, Open, 12/24/14]
>
> On Sun, Apr 5, 2015 at 7:24 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>> Please vote on releasing the following candidate as Apache Spark version 
>> 1.2.2!
>>
>> The tag to be voted on is v1.2.2-rc1 (commit 7531b50):
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=7531b50e406ee2e3301b009ceea7c684272b2e27
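>>
>> A minimal way to check out exactly this commit with plain git (the
>> clone URL is assumed from the repo link above):
>>
>>   git clone https://git-wip-us.apache.org/repos/asf/spark.git
>>   cd spark
>>   git checkout 7531b50e406ee2e3301b009ceea7c684272b2e27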
>>
>> The list of fixes present in this release can be found at:
>> http://bit.ly/1DCNddt
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-1.2.2-rc1/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
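>>
>> A quick sketch for verifying a downloaded artifact against that key
>> (file names here are illustrative; substitute the actual artifact
>> and signature names from the release directory above):
>>
>>   curl -O https://people.apache.org/keys/committer/pwendell.asc
>>   gpg --import pwendell.asc
>>   gpg --verify spark-1.2.2.tgz.asc spark-1.2.2.tgz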
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1082/
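>>
>> A quick smoke test that the staged artifacts resolve, using the
>> stock maven-dependency-plugin (coordinates assumed for the Scala
>> 2.10 build):
>>
>>   mvn dependency:get \
>>     -DremoteRepositories=https://repository.apache.org/content/repositories/orgapachespark-1082/ \
>>     -Dartifact=org.apache.spark:spark-core_2.10:1.2.2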
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-1.2.2-rc1-docs/
>>
>> Please vote on releasing this package as Apache Spark 1.2.2!
>>
>> The vote is open until Thursday, April 08, at 00:30 UTC and passes
>> if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 1.2.2
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
