Hi all,

As this RC has multiple minor issues, I've decided to mark this vote as failed
and will start building RC6 now.

On Tue, Sep 28, 2021 at 2:20 PM Chao Sun <sunc...@apache.org> wrote:

> Looks like it's related to https://github.com/apache/spark/pull/34085. I
> filed https://issues.apache.org/jira/browse/SPARK-36873 to fix it.
>
> On Mon, Sep 27, 2021 at 6:00 PM Chao Sun <sunc...@apache.org> wrote:
>
>> Thanks. Trying it on my local machine now, but it will probably take a
>> while. I think https://github.com/apache/spark/pull/34085 is more likely
>> to be relevant, but I don't yet have a clue how it could cause the issue.
>> Spark CI also passed for these changes.
>>
>> On Mon, Sep 27, 2021 at 5:29 PM Sean Owen <sro...@gmail.com> wrote:
>>
>>> I'm building and testing with
>>>
>>> mvn -Phadoop-3.2 -Phive -Phive-2.3 -Phive-thriftserver -Pkinesis-asl
>>> -Pkubernetes -Pmesos -Pnetlib-lgpl -Pscala-2.12 -Pspark-ganglia-lgpl
>>> -Psparkr -Pyarn ...
>>>
>>> I did a '-DskipTests clean install' and then 'test'; the problem arises
>>> only in 'test'.
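>>>
>>> For reference, a minimal sketch of that full sequence (Maven and Java
>>> assumed to be set up; run from the extracted RC5 source directory):
>>>
>>> cd spark-3.2.0
>>> PROFILES="-Phadoop-3.2 -Phive -Phive-2.3 -Phive-thriftserver -Pkinesis-asl \
>>>   -Pkubernetes -Pmesos -Pnetlib-lgpl -Pscala-2.12 -Pspark-ganglia-lgpl \
>>>   -Psparkr -Pyarn"
>>> # compiles cleanly with tests skipped
>>> mvn $PROFILES -DskipTests clean install
>>> # the compile errors reported further down the thread appear only here
>>> mvn $PROFILES test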
>>>
>>> On Mon, Sep 27, 2021 at 6:58 PM Chao Sun <sunc...@apache.org> wrote:
>>>
>>>> Hmm, it may be related to the commit. Sean, how do I reproduce this?
>>>>
>>>> On Mon, Sep 27, 2021 at 4:56 PM Sean Owen <sro...@gmail.com> wrote:
>>>>
>>>>> Another "is anyone else seeing this?" in compiling common/network-yarn:
>>>>>
>>>>> [ERROR] [Error]
>>>>> /mnt/data/testing/spark-3.2.0/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:32:
>>>>> package com.google.common.annotations does not exist
>>>>> [ERROR] [Error]
>>>>> /mnt/data/testing/spark-3.2.0/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:33:
>>>>> package com.google.common.base does not exist
>>>>> [ERROR] [Error]
>>>>> /mnt/data/testing/spark-3.2.0/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:34:
>>>>> package com.google.common.collect does not exist
>>>>> ...
>>>>>
>>>>> I didn't see this in RC4, so I wonder if a recent change affected
>>>>> something, but there are barely any changes since RC4. Maybe something
>>>>> touching YARN or Guava, like:
>>>>>
>>>>> https://github.com/apache/spark/commit/540e45c3cc7c64e37aa5c1673c03a0f2d7462878
>>>>> ?
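>>>>>
>>>>> One quick way to check (a sketch, assuming that commit is the culprit and
>>>>> starting from a git checkout of the v3.2.0-rc5 tag rather than the source
>>>>> tarball): revert it locally and rebuild just the failing module.
>>>>>
>>>>> # revert the suspect commit without committing, then rebuild and retest
>>>>> # only the common/network-yarn module and what it depends on
>>>>> git revert --no-commit 540e45c3cc7c64e37aa5c1673c03a0f2d7462878
>>>>> mvn -Phadoop-3.2 -Pyarn -DskipTests install -pl common/network-yarn -am
>>>>> mvn -Phadoop-3.2 -Pyarn test -pl common/network-yarn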
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Sep 27, 2021 at 7:56 AM Gengliang Wang <ltn...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Please vote on releasing the following candidate as
>>>>>> Apache Spark version 3.2.0.
>>>>>>
>>>>>> The vote is open until 11:59 pm Pacific time on September 29 and passes
>>>>>> if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>>>
>>>>>> [ ] +1 Release this package as Apache Spark 3.2.0
>>>>>> [ ] -1 Do not release this package because ...
>>>>>>
>>>>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>>>>
>>>>>> The tag to be voted on is v3.2.0-rc5 (commit
>>>>>> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
>>>>>> https://github.com/apache/spark/tree/v3.2.0-rc5
>>>>>>
>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>> at:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>>>>>>
>>>>>> Signatures used for Spark RCs can be found in this file:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
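>>>>>>
>>>>>> For example (a sketch; the binary artifact name below is illustrative,
>>>>>> substitute whichever file you downloaded from the RC bin directory):
>>>>>>
>>>>>> # import the release KEYS, then verify the detached .asc signature
>>>>>> curl -s https://dist.apache.org/repos/dist/dev/spark/KEYS | gpg --import
>>>>>> gpg --verify spark-3.2.0-bin-hadoop3.2.tgz.asc spark-3.2.0-bin-hadoop3.2.tgz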
>>>>>>
>>>>>> The staging repository for this release can be found at:
>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1392
>>>>>>
>>>>>> The documentation corresponding to this release can be found at:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>>>>>>
>>>>>> The list of bug fixes going into 3.2.0 can be found at the following
>>>>>> URL:
>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>>>>>>
>>>>>> This release was built using the release script from the v3.2.0-rc5 tag.
>>>>>>
>>>>>>
>>>>>> FAQ
>>>>>>
>>>>>> =========================
>>>>>> How can I help test this release?
>>>>>> =========================
>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>> reporting any regressions.
>>>>>>
>>>>>> If you're working in PySpark, you can set up a virtual env, install the
>>>>>> current RC, and see if anything important breaks. In Java/Scala, you can
>>>>>> add the staging repository to your project's resolvers and test with the
>>>>>> RC (make sure to clean up the artifact cache before/after so you don't
>>>>>> end up building with an out-of-date RC going forward).
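>>>>>>
>>>>>> As a sketch of the PySpark check (the pyspark tarball name under the RC
>>>>>> bin directory is an assumption; adjust it to the file actually published
>>>>>> there):
>>>>>>
>>>>>> # install the RC into a throwaway virtual env and run a quick smoke test
>>>>>> python3 -m venv spark-rc5-env && source spark-rc5-env/bin/activate
>>>>>> pip install https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/pyspark-3.2.0.tar.gz
>>>>>> python -c "from pyspark.sql import SparkSession; print(SparkSession.builder.getOrCreate().version)"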
>>>>>>
>>>>>> ===========================================
>>>>>> What should happen to JIRA tickets still targeting 3.2.0?
>>>>>> ===========================================
>>>>>> The current list of open tickets targeted at 3.2.0 can be found at
>>>>>> https://issues.apache.org/jira/projects/SPARK by searching for "Target
>>>>>> Version/s" = 3.2.0.
>>>>>>
>>>>>> Committers should look at those and triage. Extremely important bug
>>>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>>>> be worked on immediately. Please retarget everything else to an
>>>>>> appropriate release.
>>>>>>
>>>>>> ==================
>>>>>> But my bug isn't fixed?
>>>>>> ==================
>>>>>> In order to make timely releases, we will typically not hold the
>>>>>> release unless the bug in question is a regression from the previous
>>>>>> release. That being said, if there is a regression that has not been
>>>>>> correctly targeted, please ping me or a committer to help target the
>>>>>> issue.
>>>>>>
>>>>>
