Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-18 Thread Chao Sun
+1 (non-binding) myself. Thanks everyone for voting!

On Wed, Nov 16, 2022 at 9:22 PM 416161...@qq.com 
wrote:

> +1
>
> --
> Ruifeng Zheng
> ruife...@foxmail.com
>
>
>
>
> -- Original --
> *From:* "Wenchen Fan" ;
> *Date:* Thu, Nov 17, 2022 10:26 AM
> *To:* "Yang,Jie(INF)";
> *Cc:* "Chris Nauroth";"Yuming 
> Wang";"Dongjoon
> Hyun";"huaxin gao";"L.
> C. Hsieh";"Chao Sun";"dev"<
> dev@spark.apache.org>;
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
> +1
>
> On Thu, Nov 17, 2022 at 10:20 AM Yang,Jie(INF) 
> wrote:
>
>> +1, non-binding
>>
>>
>>
>> The test combinations of Java 11 + Scala 2.12 and Java 11 + Scala 2.13 have
>> passed.
>>
>>
>>
>> Yang Jie
>>
>>
>>
>> *From:* Chris Nauroth 
>> *Date:* Thursday, November 17, 2022, 04:27
>> *To:* Yuming Wang 
>> *Cc:* "Yang,Jie(INF)" , Dongjoon Hyun <
>> dongjoon.h...@gmail.com>, huaxin gao , "L. C.
>> Hsieh" , Chao Sun , dev <
>> dev@spark.apache.org>
>> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> +1 (non-binding)
>>
>> * Verified all checksums.
>> * Verified all signatures.
>> * Built from source, with multiple profiles, to full success, for Java 11
>> and Scala 2.12:
>> * build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3
>> -Phive-thriftserver -Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests
>> clean package
>> * Tests passed.
>> * Ran several examples successfully:
>> * bin/spark-submit --class org.apache.spark.examples.SparkPi
>> examples/jars/spark-examples_2.12-3.2.3.jar
>> * bin/spark-submit --class
>> org.apache.spark.examples.sql.hive.SparkHiveExample
>> examples/jars/spark-examples_2.12-3.2.3.jar
>> * bin/spark-submit
>> examples/src/main/python/streaming/network_wordcount.py localhost 
>>
>>
>>
>> Chao, thank you for preparing the release.
>>
>>
>>
>> Chris Nauroth
>>
>>
>>
>>
>>
>> On Wed, Nov 16, 2022 at 5:22 AM Yuming Wang  wrote:
>>
>> +1
>>
>>
>>
>> On Wed, Nov 16, 2022 at 2:28 PM Yang,Jie(INF) 
>> wrote:
>>
>> I switched from Scala 2.13 to Scala 2.12 today. The test is still in
>> progress and it has not hung.
>>
>>
>>
>> Yang Jie
>>
>>
>>
>> *From:* Dongjoon Hyun 
>> *Date:* Wednesday, November 16, 2022, 01:17
>> *To:* "Yang,Jie(INF)" 
>> *Cc:* huaxin gao , "L. C. Hsieh" <
>> vii...@gmail.com>, Chao Sun , dev <
>> dev@spark.apache.org>
>> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> Did you hit that in Scala 2.12, too?
>>
>>
>>
>> Dongjoon.
>>
>>
>>
>> On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) 
>> wrote:
>>
>> Hi, all
>>
>>
>>
>> I tested v3.2.3 with the following command:
>>
>>
>>
>> ```
>>
>> dev/change-scala-version.sh 2.13
>>
>> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn
>> -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
>> -Pscala-2.13 -fn
>>
>> ```
>>
>>
>>
>> The testing environment is:
>>
>>
>>
>> OS: CentOS 6u3 Final
>>
>> Java: zulu 11.0.17
>>
>> Python: 3.9.7
>>
>> Scala: 2.13
>>
>>
>>
>> The above test command has been executed twice, and both runs hung at the
>> following stack:
>>
>>
>>
>> ```
>>
>> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
>> elapsed=1552.65s tid=0x7f2ddc02d000 nid=0x7132 waiting on condition
>> [0x7f2de3929000]
>>
>>java.lang.Thread.State: WAITING (parking)
>>
>>at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>>
>>- parking to wait for  <0x000790d00050> (a
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>
>>at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
>> /LockSupport.java:194)
>>
>>at

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-16 Thread 416161...@qq.com
+1




Ruifeng Zheng
ruife...@foxmail.com








-- Original --
From: "Wenchen Fan"

  To learn more about Apache Spark, please see http://spark.apache.org/

  The tag to be voted on is v3.2.3-rc1 (commit
  b53c341e0fefbb33d115ab630369a18765b7763d):
   https://github.com/apache/spark/tree/v3.2.3-rc1
 
  The release files, including signatures, digests, etc. can be found 
at:
   https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
 
  Signatures used for Spark RCs can be found in this file:
   https://dist.apache.org/repos/dist/dev/spark/KEYS
 
  The staging repository for this release can be found at:
   
https://repository.apache.org/content/repositories/orgapachespark-1431/
 
  The documentation corresponding to this release can be found at:
   https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
 
  The list of bug fixes going into 3.2.3 can be found at the following 
URL:
   https://issues.apache.org/jira/projects/SPARK/versions/12352105
 
  This release uses the release script from the v3.2.3-rc1 tag.
 
 
  FAQ
 
  =
  How can I help test this release?
  =
  If you are a Spark user, you can help us test this release by taking
  an existing Spark workload and running on this release candidate, then
  reporting any regressions.
 
  If you're working in PySpark you can set up a virtual env and install
  the current RC and see if anything important breaks; in Java/Scala,
  you can add the staging repository to your project's resolvers and test
  with the RC (make sure to clean up the artifact cache before/after so
  you don't end up building with an out-of-date RC going forward).
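For the Java/Scala path, adding the staging repository to a Maven build can look like the following sketch; the repository `id` is an arbitrary placeholder, and the URL is the staging repository listed earlier in this email:

```xml
<!-- pom.xml fragment: resolve the Spark 3.2.3 RC1 artifacts from the staging repository -->
<repositories>
  <repository>
    <id>spark-3.2.3-rc1-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1431/</url>
  </repository>
</repositories>
```

Remember to remove this entry (and clear the local artifact cache) once testing is done, so later builds do not pick up RC artifacts.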
 
  ===
  What should happen to JIRA tickets still targeting 3.2.3?
  ===
  The current list of open tickets targeted at 3.2.3 can be found at:
   https://issues.apache.org/jira/projects/SPARK and search for "Target
  Version/s" = 3.2.3
 
  Committers should look at those and triage. Extremely important bug
  fixes, documentation, and API tweaks that impact compatibility should
  be worked on immediately. Everything else please retarget to an
  appropriate release.
 
  ==
  But my bug isn't fixed?
  ==
  In order to make timely releases, we will typically not hold the
  release unless the bug in question is a regression from the previous
  release. That being said, if there is something which is a regression
  that has not been correctly targeted please ping me or a committer to
  help target the issue.
 
  -
  To unsubscribe e-mail:  dev-unsubscr...@spark.apache.org
 
 
 -
 To unsubscribe e-mail:  dev-unsubscr...@spark.apache.org

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-16 Thread Wenchen Fan
+1


Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-16 Thread Yang,Jie(INF)
+1, non-binding

The test combinations of Java 11 + Scala 2.12 and Java 11 + Scala 2.13 have 
passed.

Yang Jie



Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-16 Thread Chris Nauroth
+1 (non-binding)

* Verified all checksums.
* Verified all signatures.
* Built from source, with multiple profiles, to full success, for Java 11
and Scala 2.12:
* build/mvn -Phadoop-3.2 -Phadoop-cloud -Phive-2.3 -Phive-thriftserver
-Pkubernetes -Pscala-2.12 -Psparkr -Pyarn -DskipTests clean package
* Tests passed.
* Ran several examples successfully:
* bin/spark-submit --class org.apache.spark.examples.SparkPi
examples/jars/spark-examples_2.12-3.2.3.jar
* bin/spark-submit --class
org.apache.spark.examples.sql.hive.SparkHiveExample
examples/jars/spark-examples_2.12-3.2.3.jar
* bin/spark-submit
examples/src/main/python/streaming/network_wordcount.py localhost 

Chao, thank you for preparing the release.

Chris Nauroth



Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-16 Thread Yuming Wang
+1


Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-15 Thread Yang,Jie(INF)
I switched from Scala 2.13 to Scala 2.12 today. The test is still in progress 
and it has not hung.

Yang Jie

From: Dongjoon Hyun 
Date: Wednesday, November 16, 2022, 01:17
To: "Yang,Jie(INF)" 
Cc: huaxin gao , "L. C. Hsieh" , Chao 
Sun , dev 
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

Did you hit that in Scala 2.12, too?

Dongjoon.

On Tue, Nov 15, 2022 at 4:36 AM Yang,Jie(INF) 
mailto:yangji...@baidu.com>> wrote:
Hi, all

I tested v3.2.3 with the following command:

```

dev/change-scala-version.sh 2.13
build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl 
-Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive  -Pscala-2.13 -fn
```

The testing environment is:

OS: CentOS 6u3 Final
Java: zulu 11.0.17
Python: 3.9.7
Scala: 2.13

The above test command has been executed twice, and both runs hung at the 
following stack:

```
"ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms 
elapsed=1552.65s tid=0x7f2ddc02d000 nid=0x7132 waiting on condition  
[0x7f2de3929000]
   java.lang.Thread.State: WAITING (parking)
   at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
   - parking to wait for  <0x000790d00050> (a 
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
   at 
java.util.concurrent.locks.LockSupport.park(java.base@11.0.17/LockSupport.java:194)
   at 
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17/AbstractQueuedSynchronizer.java:2081)
   at 
java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17/LinkedBlockingQueue.java:433)
   at 
org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
   at 
org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x000802269840.apply(Unknown
 Source)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
   at 
org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
   - locked <0x000790d00208> (a java.lang.Object)
   at 
org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
   at 
org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
   at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
   at 
org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x000801f99c40.apply(Unknown
 Source)
   at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
   at 
org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x000801f9a040.apply(Unknown
 Source)
   at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
   at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
   at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
   - locked <0x000790d00218> (a 
org.apache.spark.sql.execution.QueryExecution)
   at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
   at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
   - locked <0x000790d002d8> (a org.apache.spark.sql.Dataset)
   at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
   at 
org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
   at 
org.apache.spark.sql.QueryTest$$$Lambda$8564/0x000801f94440.apply$mcJ$sp(Unknown
 Source)
   at 
scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
   at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
   at 
org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
   at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
   at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
   at org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
   at 
org.apache.spark.sql.JoinSuite$$Lambda$2827/0x0008013d5840.apply$mcV$sp(Unknown
 Source)
   at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
   at org.scalatest.Transformer.apply(Transformer.scala:22)
   at org.scalatest.Transformer.apply(Transformer.scala:20)
   at 
org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
   at org.ap
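The thread in the stack trace above is parked inside `LinkedBlockingQueue.take()`, which blocks indefinitely until an item arrives; if the producer side of `AdaptiveSparkPlanExec`'s event loop never delivers, the test simply hangs. The pattern can be illustrated with a small stand-alone sketch (a hypothetical producer/consumer, not Spark code), where a timed poll surfaces the stall as an error instead of waiting forever:

```python
import queue
import threading
import time

# Stand-in for the event queue the hung thread is blocked on.
events = queue.Queue()

def producer_that_stalls():
    # Simulates a producer that never delivers within our window,
    # like a stage-completion event that is never posted.
    time.sleep(10)

threading.Thread(target=producer_that_stalls, daemon=True).start()

try:
    # A timed get() (the analogue of poll(timeout) rather than take())
    # turns an indefinite hang into a detectable timeout.
    event = events.get(timeout=0.2)
    outcome = f"got {event}"
except queue.Empty:
    outcome = "timed out"

print(outcome)
# -> timed out
```

A bare `events.get()` here would block exactly like `take()` in the Java stack: the thread stays in WAITING (parking) until something is enqueued.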

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-15 Thread Mridul Muralidharan
alatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>>
>>at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x000801f07840.apply(Unknown
>> Source)
>>
>>at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>>
>>at
>> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>>
>>at
>> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>>
>>at org.apache.spark.SparkFunSuite.org
>> $scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>>
>>at
>> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>>
>>at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>>
>>at
>> org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>>
>>at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>>
>>at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>>
>>at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>>
>>at
>> org.scalatest.Suite$$Lambda$7247/0x00080193d040.apply(Unknown Source)
>>
>>at
>> scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>>
>>at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>>
>>at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>>
>>at
>> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>>
>>at org.scalatest.Suite.run(Suite.scala:1109)
>>
>>at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>>
>>at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>>
>>at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>>
>>at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>>
>>at
>> org.scalatest.tools.Runner$$$Lambda$7245/0x00080193e840.apply(Unknown
>> Source)
>>
>>at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>at
>> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>>
>>at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>>
>>at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>>
>>at
>> org.scalatest.tools.Runner$$$Lambda$60/0x000800148040.apply(Unknown
>> Source)
>>
>>at
>> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>>
>>at
>> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>>
>>at org.scalatest.tools.Runner$.main(Runner.scala:775)
>>
>>at org.scalatest.tools.Runner.main(Runner.scala)
>>
>> ```
>>
>> I think the test case being executed is `SPARK-28323: PythonUDF should be
>> able to use in join condition`. Does anyone else see the same problem?
>>
>>
>>
>> Yang Jie
>>
>>
>>
>>
>>
>> *From:* huaxin gao 
>> *Date:* Tuesday, November 15, 2022, 13:59
>> *To:* "L. C. Hsieh" 
>> *Cc:* Dongjoon Hyun , Chao Sun <
>> sunc...@apache.org>, dev 
>> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> +1
>>
>>
>>
>> Thanks Chao!
>>
>>
>>
>> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh  wrote:
>>
>> +1
>>
>> Thanks Chao.
>>
>> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun 
>> wrote:
>> >
>> > +1
>> >
>> > Thank you, Chao.
>> >
>> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun  wrote:
>> >>
>> >> Please vote on releasing the following candidate as Apache Spark
>> version 3.2.3.
>> >>
>> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> >>
>> >> [ ] +1 Release this package as Apache Spark 3.2.3
>> >> [ ] -1 Do not release this package because ...
>> >>
>> >> To learn more about Apache Spark, please see http://spark.apache.org/

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-15 Thread kazuyuki tanimura
scala:1175)
> 
>at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
> 
>at org.scalatest.Suite$$Lambda$7247/0x00080193d040.apply(Unknown 
> Source)
> 
>at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
> 
>at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
> 
>at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
> 
>at 
> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
> 
>at org.scalatest.Suite.run(Suite.scala:1109)
> 
>at org.scalatest.Suite.run$(Suite.scala:1094)
> 
>at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
> 
>at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
> 
>at 
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
> 
>at 
> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
> 
>at 
> org.scalatest.tools.Runner$$$Lambda$7245/0x00080193e840.apply(Unknown 
> Source)
> 
>at scala.collection.immutable.List.foreach(List.scala:333)
> 
>at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
> 
>at 
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
> 
>at 
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
> 
>at 
> org.scalatest.tools.Runner$$$Lambda$60/0x000800148040.apply(Unknown 
> Source)
> 
>at 
> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
> 
>at 
> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
> 
>at org.scalatest.tools.Runner$.main(Runner.scala:775)
> 
>at org.scalatest.tools.Runner.main(Runner.scala)
> 
> ```
> 
> I think the test case being executed is `SPARK-28323: PythonUDF should be
> able to use in join condition`. Does anyone have the same problem?
> 
>  
> 
> Yang Jie
> 
>  
> 
>  
> 
> From: huaxin gao <huaxin.ga...@gmail.com>
> Date: Tuesday, November 15, 2022, 13:59
> To: "L. C. Hsieh" <vii...@gmail.com>
> Cc: Dongjoon Hyun <dongjoon.h...@gmail.com>, Chao Sun <sunc...@apache.org>, dev <dev@spark.apache.org>
> Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)
> 
>  
> 
> +1 
> 
>  
> 
> Thanks Chao!
> 
>  
> 
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vii...@gmail.com> wrote:
> 
> +1
> 
> Thanks Chao.
> 
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <sunc...@apache.org> wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark version 
> >> 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/ 
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1 
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/ 
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS 
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/ 

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-15 Thread Sean Owen
.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>
>at
> org.scalatest.tools.Runner$$$Lambda$7245/0x00080193e840.apply(Unknown
> Source)
>
>at scala.collection.immutable.List.foreach(List.scala:333)
>
>at
> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>
>at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>
>at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>
>at
> org.scalatest.tools.Runner$$$Lambda$60/0x000800148040.apply(Unknown
> Source)
>
>at
> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>
>at
> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>
>at org.scalatest.tools.Runner$.main(Runner.scala:775)
>
>at org.scalatest.tools.Runner.main(Runner.scala)
>
> ```
>
> I think the test case being executed is `SPARK-28323: PythonUDF should be
> able to use in join condition`. Does anyone have the same problem?
>
>
>
> Yang Jie
>
>
>
>
>
> *From:* huaxin gao 
> *Date:* Tuesday, November 15, 2022, 13:59
> *To:* "L. C. Hsieh" 
> *Cc:* Dongjoon Hyun , Chao Sun <
> sunc...@apache.org>, dev 
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> +1
>
>
>
> Thanks Chao!
>
>
>
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh  wrote:
>
> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun 
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun  wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =
> >> How can I help test this release?
> >> =
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running it on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark you can set up a virtual env and install
> >> the current RC and see if anyt

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-15 Thread Dongjoon Hyun
00080193e840.apply(Unknown
> Source)
>
>at scala.collection.immutable.List.foreach(List.scala:333)
>
>at
> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>
>at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>
>at
> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>
>at
> org.scalatest.tools.Runner$$$Lambda$60/0x000800148040.apply(Unknown
> Source)
>
>at
> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>
>at
> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>
>at org.scalatest.tools.Runner$.main(Runner.scala:775)
>
>at org.scalatest.tools.Runner.main(Runner.scala)
>
> ```
>
> I think the test case being executed is `SPARK-28323: PythonUDF should be
> able to use in join condition`. Does anyone have the same problem?
>
>
>
> Yang Jie
>
>
>
>
>
> *From:* huaxin gao 
> *Date:* Tuesday, November 15, 2022, 13:59
> *To:* "L. C. Hsieh" 
> *Cc:* Dongjoon Hyun , Chao Sun <
> sunc...@apache.org>, dev 
> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>
>
>
> +1
>
>
>
> Thanks Chao!
>
>
>
> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh  wrote:
>
> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun 
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun  wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =
> >> How can I help test this release?
> >> =
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running it on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark you can set up a virtual env and install
> >> the current RC and see if anything important breaks; in Java/Scala
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
> >>
> >> ===
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK
> >> and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==
> >> But my bug isn't fixed?
> >> ==
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-15 Thread Yang,Jie(INF)
ngine.runTestImpl(Engine.scala:306)
   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
   at 
org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
   at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
   at 
org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
   at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
   at 
org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x000801f0e840.apply(Unknown
 Source)
   at 
org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
   at 
org.scalatest.SuperEngine$$Lambda$8383/0x000801f0d840.apply(Unknown Source)
   at scala.collection.immutable.List.foreach(List.scala:333)
   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
   at 
org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
   at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
   at org.scalatest.Suite.run(Suite.scala:1112)
   at org.scalatest.Suite.run$(Suite.scala:1094)
   at 
org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
   at 
org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
   at 
org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x000801f07840.apply(Unknown
 Source)
   at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
   at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
   at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
   at 
org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
   at 
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
   at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
   at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
   at org.scalatest.Suite$$Lambda$7247/0x00080193d040.apply(Unknown 
Source)
   at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
   at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
   at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
   at 
org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
   at org.scalatest.Suite.run(Suite.scala:1109)
   at org.scalatest.Suite.run$(Suite.scala:1094)
   at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
   at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
   at 
org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
   at 
org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
   at 
org.scalatest.tools.Runner$$$Lambda$7245/0x00080193e840.apply(Unknown 
Source)
   at scala.collection.immutable.List.foreach(List.scala:333)
   at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
   at 
org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
   at 
org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
   at 
org.scalatest.tools.Runner$$$Lambda$60/0x000800148040.apply(Unknown Source)
   at 
org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
   at 
org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
   at org.scalatest.tools.Runner$.main(Runner.scala:775)
   at org.scalatest.tools.Runner.main(Runner.scala)
```
I think the test case being executed is `SPARK-28323: PythonUDF should be able
to use in join condition`. Does anyone have the same problem?

Yang Jie


From: huaxin gao 
Date: Tuesday, November 15, 2022, 13:59
To: "L. C. Hsieh" 
Cc: Dongjoon Hyun , Chao Sun , dev 
Subject: Re: [VOTE] Release Spark 3.2.3 (RC1)

+1

Thanks Chao!

On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vii...@gmail.com> wrote:
+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <sunc...@apache.org> wrote:
>>
>> Please vote on releasing the following candidat

Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-14 Thread huaxin gao
+1

Thanks Chao!

On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh  wrote:

> +1
>
> Thanks Chao.
>
> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun 
> wrote:
> >
> > +1
> >
> > Thank you, Chao.
> >
> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun  wrote:
> >>
> >> Please vote on releasing the following candidate as Apache Spark
> version 3.2.3.
> >>
> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> >> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>
> >> [ ] +1 Release this package as Apache Spark 3.2.3
> >> [ ] -1 Do not release this package because ...
> >>
> >> To learn more about Apache Spark, please see http://spark.apache.org/
> >>
> >> The tag to be voted on is v3.2.3-rc1 (commit
> >> b53c341e0fefbb33d115ab630369a18765b7763d):
> >> https://github.com/apache/spark/tree/v3.2.3-rc1
> >>
> >> The release files, including signatures, digests, etc. can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
> >>
> >> Signatures used for Spark RCs can be found in this file:
> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>
> >> The staging repository for this release can be found at:
> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
> >>
> >> The documentation corresponding to this release can be found at:
> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
> >>
> >> The list of bug fixes going into 3.2.3 can be found at the following
> URL:
> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
> >>
> >> This release is using the release script of the tag v3.2.3-rc1.
> >>
> >>
> >> FAQ
> >>
> >> =
> >> How can I help test this release?
> >> =
> >> If you are a Spark user, you can help us test this release by taking
> >> an existing Spark workload and running it on this release candidate, then
> >> reporting any regressions.
> >>
> >> If you're working in PySpark you can set up a virtual env and install
> >> the current RC and see if anything important breaks; in Java/Scala
> >> you can add the staging repository to your project's resolvers and test
> >> with the RC (make sure to clean up the artifact cache before/after so
> >> you don't end up building with an out-of-date RC going forward).
> >>
> >> ===
> >> What should happen to JIRA tickets still targeting 3.2.3?
> >> ===
> >> The current list of open tickets targeted at 3.2.3 can be found at:
> >> https://issues.apache.org/jira/projects/SPARK and search for "Target
> >> Version/s" = 3.2.3
> >>
> >> Committers should look at those and triage. Extremely important bug
> >> fixes, documentation, and API tweaks that impact compatibility should
> >> be worked on immediately. Everything else please retarget to an
> >> appropriate release.
> >>
> >> ==
> >> But my bug isn't fixed?
> >> ==
> >> In order to make timely releases, we will typically not hold the
> >> release unless the bug in question is a regression from the previous
> >> release. That being said, if there is something which is a regression
> >> that has not been correctly targeted please ping me or a committer to
> >> help target the issue.
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-14 Thread L. C. Hsieh
+1

Thanks Chao.

On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun  wrote:
>
> +1
>
> Thank you, Chao.
>
> On Mon, Nov 14, 2022 at 4:12 PM Chao Sun  wrote:
>>
>> Please vote on releasing the following candidate as Apache Spark version 
>> 3.2.3.
>>
>> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.2.3
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.2.3-rc1 (commit
>> b53c341e0fefbb33d115ab630369a18765b7763d):
>> https://github.com/apache/spark/tree/v3.2.3-rc1
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1431/
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>>
>> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>>
>> This release is using the release script of the tag v3.2.3-rc1.
>>
>>
>> FAQ
>>
>> =
>> How can I help test this release?
>> =
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env and install
>> the current RC and see if anything important breaks; in Java/Scala
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===
>> What should happen to JIRA tickets still targeting 3.2.3?
>> ===
>> The current list of open tickets targeted at 3.2.3 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.2.3
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==
>> But my bug isn't fixed?
>> ==
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Spark 3.2.3 (RC1)

2022-11-14 Thread Dongjoon Hyun
+1

Thank you, Chao.

On Mon, Nov 14, 2022 at 4:12 PM Chao Sun  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 3.2.3.
>
> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
> majority +1 PMC votes are cast, with a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 3.2.3
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v3.2.3-rc1 (commit
> b53c341e0fefbb33d115ab630369a18765b7763d):
> https://github.com/apache/spark/tree/v3.2.3-rc1
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
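[Editor's note: verifying the digests mentioned above boils down to a `sha512sum -c` run against the downloaded files. A minimal self-contained sketch — the filename below is a placeholder, not a real artifact name; the real tarballs and their `.sha512` files live under the `v3.2.3-rc1-bin/` URL:]

```shell
# Stand-in for a downloaded release tarball (placeholder, not a real artifact).
echo "stand-in for a downloaded release tarball" > spark-3.2.3-bin-example.tgz

# Compute the digest file; for a real RC this file is downloaded, not generated.
sha512sum spark-3.2.3-bin-example.tgz > spark-3.2.3-bin-example.tgz.sha512

# Check the artifact against its digest; reports OK when they match.
sha512sum -c spark-3.2.3-bin-example.tgz.sha512
```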
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1431/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>
> The list of bug fixes going into 3.2.3 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>
> This release is using the release script of the tag v3.2.3-rc1.
>
>
> FAQ
>
> =
> How can I help test this release?
> =
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running it on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks; in Java/Scala
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
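[Editor's note: the Java/Scala path described above amounts to adding the staging repository as a resolver. A hedged sbt sketch — the repository URL is the one from this thread, while the resolver name and the `spark-sql` dependency are purely illustrative:]

```shell
# Append a staging resolver and an example RC dependency to an sbt build file.
# The module choice (spark-sql) is an example; any Spark module would do.
cat >> build.sbt <<'EOF'
resolvers += "Spark 3.2.3 RC1 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1431/"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.3"
EOF
cat build.sbt
```

Remember to clear the local artifact cache (e.g. `~/.ivy2` and `~/.m2` entries for `org.apache.spark`) before and after testing, as the vote email advises.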
>
> ===
> What should happen to JIRA tickets still targeting 3.2.3?
> ===
> The current list of open tickets targeted at 3.2.3 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 3.2.3
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>