Here is the fix: https://github.com/apache/spark/pull/13868
From: Reynold Xin [mailto:r...@databricks.com]
Sent: Wednesday, June 22, 2016 6:43 PM
To: Ulanov, Alexander <alexander.ula...@hpe.com>
Cc: Mark Hamstra <m...@clearstorydata.com>; Marcelo Vanzin 
<van...@cloudera.com>; dev@spark.apache.org
Subject: Re: [VOTE] Release Apache Spark 2.0.0 (RC1)

Alex - if you have access to a Windows box, can you fix the issue? I'm not sure 
how many Spark contributors have Windows boxes.


On Wed, Jun 22, 2016 at 5:56 PM, Ulanov, Alexander 
<alexander.ula...@hpe.com> wrote:
Spark unit tests fail on Windows in Spark 2.0. This can be considered a blocker, 
since there are people who develop for Spark on Windows. The referenced issue 
is indeed Minor and has nothing to do with unit tests.

From: Mark Hamstra [mailto:m...@clearstorydata.com]
Sent: Wednesday, June 22, 2016 4:09 PM
To: Marcelo Vanzin <van...@cloudera.com>
Cc: Ulanov, Alexander <alexander.ula...@hpe.com>; Reynold Xin 
<r...@databricks.com>; dev@spark.apache.org
Subject: Re: [VOTE] Release Apache Spark 2.0.0 (RC1)

It's also marked as Minor, not Blocker.

On Wed, Jun 22, 2016 at 4:07 PM, Marcelo Vanzin 
<van...@cloudera.com> wrote:
On Wed, Jun 22, 2016 at 4:04 PM, Ulanov, Alexander
<alexander.ula...@hpe.com> wrote:
> -1
>
> Spark Unit tests fail on Windows. Still not resolved, though marked as
> resolved.

To be pedantic, it's marked as a duplicate
(https://issues.apache.org/jira/browse/SPARK-15899), which doesn't
necessarily mean that it's fixed.



> https://issues.apache.org/jira/browse/SPARK-15893
>
> From: Reynold Xin [mailto:r...@databricks.com]
> Sent: Tuesday, June 21, 2016 6:27 PM
> To: dev@spark.apache.org
> Subject: [VOTE] Release Apache Spark 2.0.0 (RC1)
>
>
>
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.0. The vote is open until Friday, June 24, 2016 at 19:00 PDT and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
>
>
> [ ] +1 Release this package as Apache Spark 2.0.0
>
> [ ] -1 Do not release this package because ...
>
>
>
>
>
> The tag to be voted on is v2.0.0-rc1
> (0c66ca41afade6db73c9aeddd5aed6e5dcea90df).
>
>
>
> This release candidate resolves ~2400 issues:
> https://s.apache.org/spark-2.0.0-rc1-jira
>
>
>
> The release files, including signatures, digests, etc. can be found at:
>
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-bin/
>
>
>
> Release artifacts are signed with the following key:
>
> https://people.apache.org/keys/committer/pwendell.asc
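For anyone checking the artifacts before voting, a sketch of the usual signature verification (the binary artifact filename below is illustrative; substitute whichever file you download from the release directory above):

```shell
# Import the release manager's key, then verify a downloaded artifact
# against its detached .asc signature (filename is an illustrative example).
wget https://people.apache.org/keys/committer/pwendell.asc
gpg --import pwendell.asc
wget http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-bin/spark-2.0.0-bin-hadoop2.7.tgz
wget http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-bin/spark-2.0.0-bin-hadoop2.7.tgz.asc
gpg --verify spark-2.0.0-bin-hadoop2.7.tgz.asc spark-2.0.0-bin-hadoop2.7.tgz
```

These are manual one-off commands against the release servers, so run them interactively rather than scripting them.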
>
>
>
> The staging repository for this release can be found at:
>
> https://repository.apache.org/content/repositories/orgapachespark-1187/
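For those who want to test the RC from an existing build, a minimal build.sbt fragment pointing sbt at the staging repository (the resolver name and the spark-core dependency line are illustrative; add whichever Spark modules your build uses):

```scala
// build.sbt fragment (illustrative): resolve the 2.0.0 RC artifacts
// from the staging repository listed above.
resolvers += "Apache Spark 2.0.0 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1187/"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
```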
>
>
>
> The documentation corresponding to this release can be found at:
>
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-docs/
>
>
>
>
>
> =======================================
>
> == How can I help test this release? ==
>
> =======================================
>
> If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 1.x.
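One way to run such a smoke test, sketched under the assumption that you have a standalone job jar (the job class and jar path are hypothetical placeholders; the tarball name matches the binary release directory above):

```shell
# Unpack the RC binary distribution and run an existing job against it
# (com.example.MyJob and the jar path are hypothetical placeholders).
tar xzf spark-2.0.0-bin-hadoop2.7.tgz
./spark-2.0.0-bin-hadoop2.7/bin/spark-submit \
  --master "local[4]" \
  --class com.example.MyJob \
  /path/to/my-job.jar
```

Comparing the job's output and runtime against the same job on your current 1.x deployment is what surfaces regressions worth reporting.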
>
>
>
> ================================================
>
> == What justifies a -1 vote for this release? ==
>
> ================================================
>
> Critical bugs impacting major functionality.
>
>
>
> Bugs already present in 1.x, missing features, or bugs related to new
> features will not necessarily block this release. Note that, historically,
> Spark documentation has been published on the website separately from the
> main release, so we do not need to block the release due to documentation
> errors either.
>
>
>
>

--
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org

