+1

Tested the SparkR package manually on multiple platforms and checked the 
different Hadoop release jars.

I also previously tested the last RC on different R releases (see the last RC 
vote thread).

I found some differences in the bin release jars created by the different 
options when running the make-release script; I created this JIRA to track it:
https://issues.apache.org/jira/browse/SPARK-22202

I've confirmed these exist in the 2.1.1 release as well, so this isn't a 
regression, and hence my +1.

btw, I think we need to update this file with the new keys used in signing 
this release: https://www.apache.org/dist/spark/KEYS


_____________________________
From: Liwei Lin <lwl...@gmail.com>
Sent: Wednesday, October 4, 2017 6:51 PM
Subject: Re: [VOTE] Spark 2.1.2 (RC4)
To: Spark dev list <dev@spark.apache.org>


+1 (non-binding)


Cheers,
Liwei

On Wed, Oct 4, 2017 at 4:03 PM, Nick Pentreath <nick.pentre...@gmail.com> wrote:
Ah right! I was using a new cloud instance and didn't realize I was logged in 
as root! Thanks

On Tue, 3 Oct 2017 at 21:13 Marcelo Vanzin <van...@cloudera.com> wrote:
Maybe you're running as root (or the admin account on your OS)?

On Tue, Oct 3, 2017 at 12:12 PM, Nick Pentreath <nick.pentre...@gmail.com> wrote:
> Hmm I'm consistently getting this error in core tests:
>
> - SPARK-3697: ignore directories that cannot be read. *** FAILED ***
>   2 was not equal to 1 (FsHistoryProviderSuite.scala:146)
>
>
> Anyone else? Any insight? Perhaps it's my setup.
>
>>>
>>>
>>> On Tue, Oct 3, 2017 at 7:24 AM Holden Karau <hol...@pigscanfly.ca> wrote:
>>>>
>>>> Please vote on releasing the following candidate as Apache Spark version
>>>> 2.1.2. The vote is open until Saturday October 7th at 9:00 PST and passes 
>>>> if
>>>> a majority of at least 3 +1 PMC votes are cast.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>>
>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>
>>>> The tag to be voted on is v2.1.2-rc4
>>>> (2abaea9e40fce81cd4626498e0f5c28a70917499)
>>>>
>>>> List of JIRA tickets resolved in this release can be found with this
>>>> filter.
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> https://home.apache.org/~holden/spark-2.1.2-rc4-bin/
>>>>
>>>> Release artifacts are signed with a key from:
>>>> https://people.apache.org/~holden/holdens_keys.asc
>>>>
>>>> The staging repository for this release can be found at:
>>>> https://repository.apache.org/content/repositories/orgapachespark-1252
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> https://people.apache.org/~holden/spark-2.1.2-rc4-docs/
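One way to sanity-check the signatures and digests mentioned above is to recompute an artifact's SHA-512 and compare it to the published value. A minimal sketch; the file name and digest here are made-up placeholders, not the real values for any spark-2.1.2-rc4 artifact:

```python
# Minimal sketch of checking a release artifact's SHA-512 digest.
# The artifact name and expected digest below are illustrative placeholders.
import hashlib
from pathlib import Path

def sha512_matches(path: str, expected_hex: str) -> bool:
    """Hash the file in chunks and compare against the published digest."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex.lower()

# Example usage with a throwaway file standing in for the downloaded tarball:
Path("artifact.tgz").write_bytes(b"example bytes")
expected = hashlib.sha512(b"example bytes").hexdigest()
print(sha512_matches("artifact.tgz", expected))  # True
```

For the GPG signatures (.asc files), you would additionally import the KEYS file and run gpg --verify against each artifact.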
>>>>
>>>>
>>>> FAQ
>>>>
>>>> How can I help test this release?
>>>>
>>>> If you are a Spark user, you can help us test this release by taking an
>>>> existing Spark workload and running on this release candidate, then
>>>> reporting any regressions.
>>>>
>>>> If you're working in PySpark you can set up a virtual env and install
>>>> the current RC and see if anything important breaks; in Java/Scala you
>>>> can add the staging repository to your project's resolvers and test with
>>>> the RC (make sure to clean up the artifact cache before/after so you don't
>>>> end up building with an out-of-date RC going forward).
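The virtual-env check above can be sketched roughly like this. The pyspark tarball URL in the comment is an assumption about the RC bin directory layout, and the install step needs network access, so it is shown but not run:

```python
# Rough sketch of the "set up a virtual env and install the current RC" step.
import venv
from pathlib import Path

def make_test_env(env_dir: str) -> Path:
    """Create an isolated virtualenv and return its python interpreter path."""
    venv.EnvBuilder(with_pip=False, clear=True).create(env_dir)
    return Path(env_dir) / "bin" / "python"  # "Scripts" on Windows

py = make_test_env("spark-rc-env")
print(py.exists())  # True on Linux/macOS
# With pip available in the env, one would then install the RC, e.g.:
#   pip install https://home.apache.org/~holden/spark-2.1.2-rc4-bin/pyspark-2.1.2.tar.gz
# and run an existing PySpark workload against it, watching for regressions.
```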
>>>>
>>>> What should happen to JIRA tickets still targeting 2.1.2?
>>>>
>>>> Committers should look at those and triage. Extremely important bug
>>>> fixes, documentation, and API tweaks that impact compatibility should be
>>>> worked on immediately. Everything else please retarget to 2.1.3.
>>>>
>>>> But my bug isn't fixed!??!
>>>>
>>>> In order to make timely releases, we will typically not hold the release
>>>> unless the bug in question is a regression from 2.1.1. That being said, if
>>>> there is something which is a regression from 2.1.1 that has not been
>>>> correctly targeted, please ping a committer to help target the issue (you
>>>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2).
>>>>
>>>> What are the unresolved issues targeted for 2.1.2?
>>>>
>>>> At this time there are no open unresolved issues.
>>>>
>>>> Is there anything different about this release?
>>>>
>>>> This is the first release in a while not built on the AMPLab Jenkins.
>>>> This is good because it means future releases can more easily be built and
>>>> signed securely (and I've been updating the documentation in
>>>> https://github.com/apache/spark-website/pull/66 as I progress); however,
>>>> the chances of a mistake are higher with any change like this. If there's
>>>> something you normally take for granted as correct when checking a
>>>> release, please double check this time :)
>>>>
>>>> Should I be committing code to branch-2.1?
>>>>
>>>> Thanks for asking! Please treat this stage in the RC process as "code
>>>> freeze", so bug fixes only. If you're uncertain whether something should
>>>> be backported, please reach out. If you do commit to branch-2.1, please
>>>> tag your JIRA issue's fix version as 2.1.3, and if we cut another RC I'll
>>>> move the 2.1.3 fixes into 2.1.2 as appropriate.
>>>>
>>>> What happened to RC3?
>>>>
>>>> Some R+zinc interactions kept it from getting out the door.
>>>> --
>>>> Twitter: https://twitter.com/holdenkarau
>>
>>
>



--
Marcelo


