Hi everyone,

There’s an open bug report related to Spark standalone which could be a 
potential release-blocker (pending investigation / a bug fix): 
https://issues.apache.org/jira/browse/SPARK-4498.  This issue seems 
non-deterministic and only affects long-running Spark standalone deployments, so 
it may be hard to reproduce.  I’m going to work on a patch that adds additional 
logging to help with debugging.

I just wanted to give an early heads-up about this issue and to get more eyes 
on it in case anyone else has run into it or wants to help with debugging.

- Josh

On November 28, 2014 at 9:18:09 PM, Patrick Wendell (pwend...@gmail.com) wrote:

Please vote on releasing the following candidate as Apache Spark version 1.2.0! 
 

The tag to be voted on is v1.2.0-rc1 (commit 1056e9ec1):  
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=1056e9ec13203d0c51564265e94d77a054498fdb
  

The release files, including signatures, digests, etc. can be found at:  
http://people.apache.org/~pwendell/spark-1.2.0-rc1/  

Release artifacts are signed with the following key:  
https://people.apache.org/keys/committer/pwendell.asc  

The staging repository for this release can be found at:  
https://repository.apache.org/content/repositories/orgapachespark-1048/  
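
For anyone who wants to exercise the RC from a Maven build, something along 
these lines should resolve the staged artifacts (a sketch -- the artifact ID 
and Scala suffix assume the usual spark-core_2.10 naming for this release 
line; adjust for your project):

```xml
<!-- pom.xml fragment: resolve the 1.2.0 RC artifacts from the ASF staging repo -->
<repositories>
  <repository>
    <id>spark-1.2.0-rc1-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1048/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>
</dependencies>
```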

The documentation corresponding to this release can be found at:  
http://people.apache.org/~pwendell/spark-1.2.0-rc1-docs/  

Please vote on releasing this package as Apache Spark 1.2.0!  

The vote is open until Tuesday, December 02, at 05:15 UTC and passes  
if a majority of at least 3 +1 PMC votes are cast.  

[ ] +1 Release this package as Apache Spark 1.2.0  
[ ] -1 Do not release this package because ...  

To learn more about Apache Spark, please see  
http://spark.apache.org/  

== What justifies a -1 vote for this release? ==  
This vote is happening very late in the QA period compared with  
previous votes, so -1 votes should only occur for significant  
regressions from 1.1.X. Bugs already present in 1.1.X, minor  
regressions, or bugs related to new features will not block this  
release.  

== What default changes should I be aware of? ==  
1. The default value of "spark.shuffle.blockTransferService" has been  
changed to "netty".  
--> Old behavior can be restored by setting "spark.shuffle.blockTransferService" to "nio".  

2. The default value of "spark.shuffle.manager" has been changed to "sort".  
--> Old behavior can be restored by setting "spark.shuffle.manager" to "hash".  
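
For reference, both defaults can be reverted at once in 
conf/spark-defaults.conf (or with --conf flags on spark-submit); a minimal 
sketch using the property names and values above:

```properties
# conf/spark-defaults.conf -- restore the pre-1.2.0 shuffle defaults
spark.shuffle.blockTransferService  nio
spark.shuffle.manager               hash
```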

== Other notes ==  
Because this vote is occurring over a weekend, I will likely extend  
the vote if this RC survives until the end of the vote period.  

- Patrick  

---------------------------------------------------------------------  
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org  
For additional commands, e-mail: dev-h...@spark.apache.org  
