Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-28 Thread Stephen Coy
For posterity:

**the** way to prepare the Java environment in a terminal session on macOS is 
as follows:

export JAVA_HOME=$(/usr/libexec/java_home -v1.8)
or
export JAVA_HOME=$(/usr/libexec/java_home -v11)
etc

There is no need to mess with $PATH or anything else. It has been like this for 
at least 20 years ;-).
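A quick, hypothetical sanity check that the selection took effect (this assumes at least one matching JDK is installed; `/usr/libexec/java_home` exists on macOS only):

```shell
# macOS only: /usr/libexec/java_home resolves the home directory of
# an installed JDK matching the requested version.
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)   # or -v 11, etc.
echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version   # confirm which JDK was selected
```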

Cheers,

Steve C

On 28 Sep 2021, at 1:37 am, Sean Owen  wrote:

Hm... it does just affect Mac OS (?) and only if you don't have JAVA_HOME set 
(which people often do set) and only affects build/mvn, vs built-in maven 
(which people often have installed). Only affects those building. I'm on the 
fence about whether it blocks 3.2.0, as it doesn't affect downstream users and 
is easily resolvable.

On Mon, Sep 27, 2021 at 10:26 AM sarutak  wrote:
Hi All,

SPARK-35887 seems to have introduced another issue: building with
build/mvn on macOS gets stuck. SPARK-36856 will resolve this issue.
Should we include the fix in 3.2.0?

- Kousuke


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-28 Thread Gengliang Wang
Hi all,

As this RC has multiple minor issues, I have decided to mark this vote as failed
and start building RC6 now.


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-28 Thread Chao Sun
Looks like it's related to https://github.com/apache/spark/pull/34085. I
filed https://issues.apache.org/jira/browse/SPARK-36873 to fix it.


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Chao Sun
Thanks. Trying it on my local machine now but it will probably take a
while. I think https://github.com/apache/spark/pull/34085 is more likely to
be relevant but don't yet have a clue how it could cause the issue. Spark
CI also passed for these.



Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Sean Owen
I'm building and testing with

mvn -Phadoop-3.2 -Phive -Phive-2.3 -Phive-thriftserver -Pkinesis-asl
-Pkubernetes -Pmesos -Pnetlib-lgpl -Pscala-2.12 -Pspark-ganglia-lgpl
-Psparkr -Pyarn ...

I did a '-DskipTests clean install' and then 'test'; the problem arises
only in 'test'.
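The two-step sequence described above, spelled out as a hypothetical shell session (profile list trimmed for brevity; use the full set from the mvn command above, and run from a Spark source checkout):

```shell
# Step 1: compile and install all modules without running tests.
./build/mvn -Pyarn -Pkubernetes -Phive -Phive-thriftserver -DskipTests clean install
# Step 2: run the test phase separately; the errors reported in this
# thread surfaced only in this step.
./build/mvn -Pyarn -Pkubernetes -Phive -Phive-thriftserver test
```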



Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Chao Sun
Hmm it may be related to the commit. Sean: how do I reproduce this?



Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Sean Owen
Another "is anyone else seeing this?" while compiling common/network-yarn:

[ERROR] [Error]
/mnt/data/testing/spark-3.2.0/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:32:
package com.google.common.annotations does not exist
[ERROR] [Error]
/mnt/data/testing/spark-3.2.0/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:33:
package com.google.common.base does not exist
[ERROR] [Error]
/mnt/data/testing/spark-3.2.0/common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:34:
package com.google.common.collect does not exist
...

I didn't see this in RC4, so, I wonder if a recent change affected
something, but there are barely any changes since RC4. Anything touching
YARN or Guava maybe, like:
https://github.com/apache/spark/commit/540e45c3cc7c64e37aa5c1673c03a0f2d7462878
?



On Mon, Sep 27, 2021 at 7:56 AM Gengliang Wang  wrote:

> Please vote on releasing the following candidate as
> Apache Spark version 3.2.0.
>
> The vote is open until 11:59pm Pacific time September 29 and passes if a
> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.2.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.2.0-rc5 (commit
> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
> https://github.com/apache/spark/tree/v3.2.0-rc5
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1392
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>
> The list of bug fixes going into 3.2.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>
> This release is using the release script of the tag v3.2.0-rc5.
>
>
> FAQ
>
> =
> How can I help test this release?
> =
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks. In Java/Scala, you
> can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.2.0?
> ===
> The current list of open tickets targeted at 3.2.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.2.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
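For completeness, a hypothetical check of the signatures and digests mentioned in the quoted announcement. The artifact name below is an assumption based on the usual Spark binary layout, so substitute whatever you actually downloaded:

```shell
# Import the release KEYS, then verify one artifact's signature and
# SHA-512 digest (artifact name is assumed; adjust as needed).
base=https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin
curl -sO https://dist.apache.org/repos/dist/dev/spark/KEYS
gpg --import KEYS
curl -sO "$base/spark-3.2.0-bin-hadoop3.2.tgz"
curl -sO "$base/spark-3.2.0-bin-hadoop3.2.tgz.asc"
curl -sO "$base/spark-3.2.0-bin-hadoop3.2.tgz.sha512"
gpg --verify spark-3.2.0-bin-hadoop3.2.tgz.asc spark-3.2.0-bin-hadoop3.2.tgz
sha512sum -c spark-3.2.0-bin-hadoop3.2.tgz.sha512   # shasum -a 512 -c on macOS
```

Note that the format of Apache `.sha512` files varies; if `sha512sum -c` rejects the file, compare the printed digest by eye instead.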


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Holden Karau
I think even if we do cancel this RC we should leave it open for a bit to
see if we can catch any other errors.

On Mon, Sep 27, 2021 at 12:29 PM Dongjoon Hyun 
wrote:

> Unfortunately, it has been the same for me recently. Not only that, I also
> hit a MetaspaceSize OOM.
> I ended up with MAVEN_OPTS like the following.
>
> -Xms12g -Xmx12g -Xss128M -XX:MaxMetaspaceSize=4g ...
>
> Dongjoon.
>
>
> On Mon, Sep 27, 2021 at 12:18 PM Sean Owen  wrote:
>
>> Has anyone seen a StackOverflowError when running tests? It happens in
>> compilation. I heard from another user who hit this earlier, and I had not,
>> until just today testing this:
>>
>> [ERROR] ## Exception when compiling 495 sources to
>> /mnt/data/testing/spark-3.2.0/sql/catalyst/target/scala-2.12/classes
>> java.lang.StackOverflowError
>>
>> scala.tools.nsc.transform.TypingTransformers$TypingTransformer.atOwner(TypingTransformers.scala:38)
>> scala.reflect.internal.Trees.itransform(Trees.scala:1420)
>> scala.reflect.internal.Trees.itransform$(Trees.scala:1400)
>> scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
>> ...
>>
>> Upping the JVM thread stack size to, say, 16m from 4m in the pom.xml file
>> made it work. I presume this could be somehow env-specific, as clearly the
>> CI/CD tests and release process built successfully. Just checking if it's
>> "just me".

-- 
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9  
YouTube Live Streams: https://www.youtube.com/user/holdenkarau


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Dongjoon Hyun
Unfortunately, it's been the same for me recently. Not only that, but I also
hit a Metaspace OOM.
I ended up with MAVEN_OPTS like the following.

-Xms12g -Xmx12g -Xss128M -XX:MaxMetaspaceSize=4g ...

Dongjoon.
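For reference, a minimal sketch of applying those options before invoking build/mvn; the sizes are what worked on one machine and may need tuning for yours:

```shell
# JVM sizing that worked around the StackOverflowError and Metaspace OOM
# reported in this thread; adjust the values to your available memory.
# Both mvn and the bundled build/mvn wrapper pick up MAVEN_OPTS.
export MAVEN_OPTS="-Xms12g -Xmx12g -Xss128M -XX:MaxMetaspaceSize=4g"
echo "MAVEN_OPTS=$MAVEN_OPTS"
```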


On Mon, Sep 27, 2021 at 12:18 PM Sean Owen  wrote:

> Has anyone seen a StackOverflowError when running tests? It happens in
> compilation. I heard from another user who hit this earlier, and I had not,
> until just today testing this:
>
> [ERROR] ## Exception when compiling 495 sources to
> /mnt/data/testing/spark-3.2.0/sql/catalyst/target/scala-2.12/classes
> java.lang.StackOverflowError
>
> scala.tools.nsc.transform.TypingTransformers$TypingTransformer.atOwner(TypingTransformers.scala:38)
> scala.reflect.internal.Trees.itransform(Trees.scala:1420)
> scala.reflect.internal.Trees.itransform$(Trees.scala:1400)
> scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
> ...
>
> Upping the JVM thread stack size to, say, 16m from 4m in the pom.xml file
> made it work. I presume this could be somehow env-specific, as clearly the
> CI/CD tests and release process built successfully. Just checking if it's
> "just me".
>
>
> On Mon, Sep 27, 2021 at 7:56 AM Gengliang Wang  wrote:
>
>> Please vote on releasing the following candidate as
>> Apache Spark version 3.2.0.
>>
>> The vote is open until 11:59pm Pacific time September 29 and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.0-rc5 (commit
>> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
>> https://github.com/apache/spark/tree/v3.2.0-rc5
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1392
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>>
>> The list of bug fixes going into 3.2.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>>
>> This release is using the release script of the tag v3.2.0-rc5.
>>
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
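As a rough sketch of the PySpark route (the tarball name and URL are assumptions; check the v3.2.0-rc5-bin directory listing for the exact file):

```shell
# Create a throwaway virtualenv so the RC never touches your main environment.
python3 -m venv /tmp/spark-320-rc5-venv
. /tmp/spark-320-rc5-venv/bin/activate
# Then install the pyspark tarball from the RC bin directory and run a workload,
# e.g. (hypothetical file name -- verify against the directory listing):
#   pip install https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/pyspark-3.2.0.tar.gz
#   python -c "import pyspark; print(pyspark.__version__)"
```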
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.2.0?
>> ===
>> The current list of open tickets targeted at 3.2.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Sean Owen
Has anyone seen a StackOverflowError when running tests? It happens in
compilation. I heard from another user who hit this earlier, and I had not,
until just today testing this:

[ERROR] ## Exception when compiling 495 sources to
/mnt/data/testing/spark-3.2.0/sql/catalyst/target/scala-2.12/classes
java.lang.StackOverflowError
scala.tools.nsc.transform.TypingTransformers$TypingTransformer.atOwner(TypingTransformers.scala:38)
scala.reflect.internal.Trees.itransform(Trees.scala:1420)
scala.reflect.internal.Trees.itransform$(Trees.scala:1400)
scala.reflect.internal.SymbolTable.itransform(SymbolTable.scala:28)
...

Upping the JVM thread stack size to, say, 16m from 4m in the pom.xml file
made it work. I presume this could be somehow env-specific, as clearly the
CI/CD tests and release process built successfully. Just checking if it's
"just me".
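A hedged alternative sketch: the same stack-size bump can be tried via MAVEN_OPTS instead of editing pom.xml. Whether it takes effect depends on scalac running inside the Maven JVM (an assumption), so the pom.xml change remains the reliable fix:

```shell
# Raise the thread-stack size for the Maven JVM itself. This only helps
# when the Scala compiler runs in-process rather than in a forked JVM.
export MAVEN_OPTS="${MAVEN_OPTS:-} -Xss16m"
echo "MAVEN_OPTS=$MAVEN_OPTS"
```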


On Mon, Sep 27, 2021 at 7:56 AM Gengliang Wang  wrote:

> Please vote on releasing the following candidate as
> Apache Spark version 3.2.0.
>
> The vote is open until 11:59pm Pacific time September 29 and passes if a
> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.2.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.2.0-rc5 (commit
> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
> https://github.com/apache/spark/tree/v3.2.0-rc5
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1392
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>
> The list of bug fixes going into 3.2.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>
> This release is using the release script of the tag v3.2.0-rc5.
>
>
> FAQ
>
> =
> How can I help test this release?
> =
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks; in Java/Scala,
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 3.2.0?
> ===
> The current list of open tickets targeted at 3.2.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.2.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Gengliang Wang
Hi Kousuke,

I tend to agree with Sean. It only affects macOS developers building Spark
from the released Spark 3.2 source tarball without JAVA_HOME set.
I can mention it as a known issue in the release notes if this vote
passes.

Thanks,
Gengliang

On Mon, Sep 27, 2021 at 11:47 PM sarutak  wrote:

> I think it affects devs, but there are some workarounds.
> So, if you all don't think it's necessary to include it in 3.2.0, I'm OK
> not to do it.
>
> - Kousuke
>
> > Hm... it does just affect Mac OS (?) and only if you don't have
> > JAVA_HOME set (which people often do set) and only affects build/mvn,
> > vs built-in maven (which people often have installed). Only affects
> > those building. I'm on the fence about whether it blocks 3.2.0, as it
> > doesn't affect downstream users and is easily resolvable.
> >
> > On Mon, Sep 27, 2021 at 10:26 AM sarutak 
> > wrote:
> >
> >> Hi All,
> >>
> >> SPARK-35887 seems to have introduced another issue: building with
> >> build/mvn on macOS hangs. SPARK-36856 will resolve this issue.
> >> Should we include the fix in 3.2.0?
> >>
> >> - Kousuke
> >>
> >>> Please vote on releasing the following candidate as Apache Spark
> >>> version 3.2.0.
> >>>
> >>> The vote is open until 11:59pm Pacific time September 29 and passes if
> >>> a majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>>
> >>> [ ] +1 Release this package as Apache Spark 3.2.0
> >>>
> >>> [ ] -1 Do not release this package because ...
> >>>
> >>> To learn more about Apache Spark, please see http://spark.apache.org/
> >>>
> >>> The tag to be voted on is v3.2.0-rc5 (commit
> >>> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
> >>> https://github.com/apache/spark/tree/v3.2.0-rc5
> >>>
> >>> The release files, including signatures, digests, etc. can be found at:
> >>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
> >>>
> >>> Signatures used for Spark RCs can be found in this file:
> >>> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>>
> >>> The staging repository for this release can be found at:
> >>> https://repository.apache.org/content/repositories/orgapachespark-1392
> >>>
> >>> The documentation corresponding to this release can be found at:
> >>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
> >>>
> >>> The list of bug fixes going into 3.2.0 can be found at the following URL:
> >>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
> >>>
> >>> This release is using the release script of the tag v3.2.0-rc5.
> >>>
> >>> FAQ
> >>>
> >>> =
> >>> How can I help test this release?
> >>> =
> >>> If you are a Spark user, you can help us test this release by taking
> >>> an existing Spark workload and running on this release candidate, then
> >>> reporting any regressions.
> >>>
> >>> If you're working in PySpark you can set up a virtual env and install
> >>> the current RC and see if anything important breaks; in Java/Scala,
> >>> you can add the staging repository to your project's resolvers and test
> >>> with the RC (make sure to clean up the artifact cache before/after so
> >>> you don't end up building with an out-of-date RC going forward).
> >>>
> >>> ===
> >>> What should happen to JIRA tickets still targeting 3.2.0?
> >>> ===
> >>> The current list of open tickets targeted at 3.2.0 can be found at:
> >>> https://issues.apache.org/jira/projects/SPARK and search for "Target
> >>> Version/s" = 3.2.0
> >>>
> >>> Committers should look at those and triage. Extremely important bug
> >>> fixes, documentation, and API tweaks that impact compatibility should
> >>> be worked on immediately. Everything else please retarget to an
> >>> appropriate release.
> >>>
> >>> ==
> >>> But my bug isn't fixed?
> >>> ==
> >>> In order to make timely releases, we will typically not hold the
> >>> release unless the bug in question is a regression from the previous
> >>> release. That being said, if there is something which is a regression
> >>> that has not been correctly targeted please ping me or a committer to
> >>> help target the issue.
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread sarutak

I think it affects devs, but there are some workarounds.
So, if you all don't think it's necessary to include it in 3.2.0, I'm OK
not to do it.


- Kousuke


> Hm... it does just affect Mac OS (?) and only if you don't have
> JAVA_HOME set (which people often do set) and only affects build/mvn,
> vs built-in maven (which people often have installed). Only affects
> those building. I'm on the fence about whether it blocks 3.2.0, as it
> doesn't affect downstream users and is easily resolvable.
>
> On Mon, Sep 27, 2021 at 10:26 AM sarutak 
> wrote:
>
>> Hi All,
>>
>> SPARK-35887 seems to have introduced another issue: building with
>> build/mvn on macOS hangs. SPARK-36856 will resolve this issue.
>> Should we include the fix in 3.2.0?
>>
>> - Kousuke
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 3.2.0.
>>>
>>> The vote is open until 11:59pm Pacific time September 29 and passes if
>>> a majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.2.0
>>>
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.2.0-rc5 (commit
>>> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
>>> https://github.com/apache/spark/tree/v3.2.0-rc5
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1392
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>>>
>>> The list of bug fixes going into 3.2.0 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>>>
>>> This release is using the release script of the tag v3.2.0-rc5.
>>>
>>> FAQ
>>>
>>> =
>>> How can I help test this release?
>>> =
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running on this release candidate, then
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env and install
>>> the current RC and see if anything important breaks; in Java/Scala,
>>> you can add the staging repository to your project's resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with an out-of-date RC going forward).
>>>
>>> ===
>>> What should happen to JIRA tickets still targeting 3.2.0?
>>> ===
>>> The current list of open tickets targeted at 3.2.0 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>> Version/s" = 3.2.0
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else please retarget to an
>>> appropriate release.
>>>
>>> ==
>>> But my bug isn't fixed?
>>> ==
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not been correctly targeted please ping me or a committer to
>>> help target the issue.



Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Sean Owen
Hm... it does just affect Mac OS (?) and only if you don't have JAVA_HOME
set (which people often do set) and only affects build/mvn, vs built-in
maven (which people often have installed). Only affects those building. I'm
on the fence about whether it blocks 3.2.0, as it doesn't affect downstream
users and is easily resolvable.
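For anyone who hits it, a sketch of the easy resolution on macOS: set JAVA_HOME explicitly before running build/mvn (the `-v` value selects whichever JDK major version is installed on your machine):

```shell
# macOS-only: /usr/libexec/java_home prints the path of a matching JDK.
# Guarded so the snippet is a no-op on other platforms.
if [ -x /usr/libexec/java_home ]; then
  export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"   # or -v 11, etc.
fi
```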

On Mon, Sep 27, 2021 at 10:26 AM sarutak  wrote:

> Hi All,
>
> SPARK-35887 seems to have introduced another issue: building with
> build/mvn on macOS hangs. SPARK-36856 will resolve this issue.
> Should we include the fix in 3.2.0?
>
> - Kousuke
>
> > Please vote on releasing the following candidate as Apache Spark
> > version 3.2.0.
> >
> > The vote is open until 11:59pm Pacific time September 29 and passes if
> > a majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >
> > [ ] +1 Release this package as Apache Spark 3.2.0
> >
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see http://spark.apache.org/
> >
> > The tag to be voted on is v3.2.0-rc5 (commit
> > 49aea14c5afd93ae1b9d19b661cc273a557853f5):
> >
> > https://github.com/apache/spark/tree/v3.2.0-rc5
> >
> > The release files, including signatures, digests, etc. can be found
> > at:
> >
> > https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
> >
> > Signatures used for Spark RCs can be found in this file:
> >
> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >
> > The staging repository for this release can be found at:
> >
> > https://repository.apache.org/content/repositories/orgapachespark-1392
> >
> > The documentation corresponding to this release can be found at:
> >
> > https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
> >
> > The list of bug fixes going into 3.2.0 can be found at the following
> > URL:
> >
> > https://issues.apache.org/jira/projects/SPARK/versions/12349407
> >
> > This release is using the release script of the tag v3.2.0-rc5.
> >
> > FAQ
> >
> > =
> > How can I help test this release?
> > =
> > If you are a Spark user, you can help us test this release by taking
> > an existing Spark workload and running on this release candidate, then
> > reporting any regressions.
> >
> > If you're working in PySpark you can set up a virtual env and install
> > the current RC and see if anything important breaks; in Java/Scala,
> > you can add the staging repository to your project's resolvers and test
> > with the RC (make sure to clean up the artifact cache before/after so
> > you don't end up building with an out-of-date RC going forward).
> >
> > ===
> > What should happen to JIRA tickets still targeting 3.2.0?
> > ===
> > The current list of open tickets targeted at 3.2.0 can be found at:
> > https://issues.apache.org/jira/projects/SPARK and search for "Target
> > Version/s" = 3.2.0
> >
> > Committers should look at those and triage. Extremely important bug
> > fixes, documentation, and API tweaks that impact compatibility should
> > be worked on immediately. Everything else please retarget to an
> > appropriate release.
> >
> > ==
> > But my bug isn't fixed?
> > ==
> > In order to make timely releases, we will typically not hold the
> > release unless the bug in question is a regression from the previous
> > release. That being said, if there is something which is a regression
> > that has not been correctly targeted please ping me or a committer to
> > help target the issue.
>
>
>


Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread sarutak

Hi All,

SPARK-35887 seems to have introduced another issue: building with
build/mvn on macOS hangs. SPARK-36856 will resolve this issue.

Should we include the fix in 3.2.0?

- Kousuke


Please vote on releasing the following candidate as Apache Spark
version 3.2.0.

The vote is open until 11:59pm Pacific time September 29 and passes if
a majority +1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.2.0

[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.2.0-rc5 (commit
49aea14c5afd93ae1b9d19b661cc273a557853f5):

https://github.com/apache/spark/tree/v3.2.0-rc5

The release files, including signatures, digests, etc. can be found
at:

https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/

Signatures used for Spark RCs can be found in this file:

https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:

https://repository.apache.org/content/repositories/orgapachespark-1392

The documentation corresponding to this release can be found at:

https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/

The list of bug fixes going into 3.2.0 can be found at the following
URL:

https://issues.apache.org/jira/projects/SPARK/versions/12349407

This release is using the release script of the tag v3.2.0-rc5.

FAQ

=
How can I help test this release?
=
If you are a Spark user, you can help us test this release by taking
an existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark you can set up a virtual env and install
the current RC and see if anything important breaks; in Java/Scala,
you can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).

===
What should happen to JIRA tickets still targeting 3.2.0?
===
The current list of open tickets targeted at 3.2.0 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for "Target
Version/s" = 3.2.0

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else please retarget to an
appropriate release.

==
But my bug isn't fixed?
==
In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is something which is a regression
that has not been correctly targeted please ping me or a committer to
help target the issue.





Re: [VOTE] Release Spark 3.2.0 (RC5)

2021-09-27 Thread Gengliang Wang
Starting with my +1 (non-binding).

Thanks,
Gengliang

On Mon, Sep 27, 2021 at 8:55 PM Gengliang Wang  wrote:

> Please vote on releasing the following candidate as
> Apache Spark version 3.2.0.
>
> The vote is open until 11:59pm Pacific time September 29 and passes if a
> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.2.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.2.0-rc5 (commit
> 49aea14c5afd93ae1b9d19b661cc273a557853f5):
> https://github.com/apache/spark/tree/v3.2.0-rc5
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1392
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.0-rc5-docs/
>
> The list of bug fixes going into 3.2.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12349407
>
> This release is using the release script of the tag v3.2.0-rc5.
>
>
> FAQ
>
> =
> How can I help test this release?
> =
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks; in Java/Scala,
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
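One hedged sketch of exercising the staging repository from the Java/Scala side: resolve a staged artifact directly with Maven's dependency plugin (the coordinates below are an assumption for illustration; use whichever Spark module your project depends on):

```shell
STAGING_REPO=https://repository.apache.org/content/repositories/orgapachespark-1392
# dependency:get fetches a single artifact from the given repository.
# Guarded so the snippet is a no-op where Maven is unavailable; the
# fetch itself needs network access to the staging repository.
if command -v mvn >/dev/null 2>&1; then
  mvn -q dependency:get \
    -DremoteRepositories="$STAGING_REPO" \
    -Dartifact=org.apache.spark:spark-core_2.12:3.2.0 || echo "resolution failed"
fi
```

Afterwards, clear the staged artifacts out of `~/.m2/repository` so later builds don't pick up the out-of-date RC.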
>
> ===
> What should happen to JIRA tickets still targeting 3.2.0?
> ===
> The current list of open tickets targeted at 3.2.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.2.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>