+1

Signatures, digests, etc. check out fine.
Checked out the tag and built/tested with -Pyarn -Pmesos -Pkubernetes.
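
For anyone following along, a minimal sketch of that verification (the
artifact name below is an assumption based on the usual 3.2.x naming in
the RC dist area):

```
# Fetch one artifact plus its signature and digest from the RC dist area
wget https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/spark-3.2.3-bin-hadoop3.2.tgz
wget https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/spark-3.2.3-bin-hadoop3.2.tgz.asc
wget https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/spark-3.2.3-bin-hadoop3.2.tgz.sha512

# Import the release signing keys and check the signature
curl -s https://dist.apache.org/repos/dist/dev/spark/KEYS | gpg --import
gpg --verify spark-3.2.3-bin-hadoop3.2.tgz.asc spark-3.2.3-bin-hadoop3.2.tgz

# Recompute the SHA-512 digest and compare it against the published one
sha512sum spark-3.2.3-bin-hadoop3.2.tgz
cat spark-3.2.3-bin-hadoop3.2.tgz.sha512
```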

Regards,
Mridul


On Tue, Nov 15, 2022 at 1:00 PM kazuyuki tanimura
<ktanim...@apple.com.invalid> wrote:

> +1 (non-binding)
>
> Thank you Chao
>
> Kazu
>
>
>
> On Nov 15, 2022, at 10:04 AM, Sean Owen <sro...@gmail.com> wrote:
>
> +1 from me, at least from my testing. Java 8 + Scala 2.12 and Java 8 +
> Scala 2.13 both worked for me, and I didn't see a test hang. I'm testing
> with Python 3.10, FWIW.
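>
> (For reference, a rough sketch of those two combinations, assuming a
> checkout of the v3.2.3-rc1 tag; not necessarily the exact commands used:)
>
> ```
> # Java 8 + Scala 2.12 (the default); package and run tests
> git checkout v3.2.3-rc1
> build/mvn clean package
>
> # Java 8 + Scala 2.13
> dev/change-scala-version.sh 2.13
> build/mvn -Pscala-2.13 clean package
> ```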
>
> On Tue, Nov 15, 2022 at 6:37 AM Yang,Jie(INF) <yangji...@baidu.com> wrote:
>
>> Hi, all
>>
>> I tested v3.2.3 with the following commands:
>>
>> ```
>> dev/change-scala-version.sh 2.13
>> build/mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pscala-2.13 -fn
>> ```
>>
>> The testing environment is:
>>
>> OS: CentOS 6u3 Final
>> Java: Zulu 11.0.17
>> Python: 3.9.7
>> Scala: 2.13
>>
>> The above test command was executed twice, and both runs hung with the
>> following stack:
>>
>> ```
>>
>> "ScalaTest-main-running-JoinSuite" #1 prio=5 os_prio=0 cpu=312870.06ms
>> elapsed=1552.65s tid=0x00007f2ddc02d000 nid=0x7132 waiting on condition
>> [0x00007f2de3929000]
>>
>>    java.lang.Thread.State: WAITING (parking)
>>
>>        at jdk.internal.misc.Unsafe.park(java.base@11.0.17/Native Method)
>>
>>        - parking to wait for  <0x0000000790d00050> (a
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
>>
>>        at java.util.concurrent.locks.LockSupport.park(java.base@11.0.17
>> /LockSupport.java:194)
>>
>>        at
>> java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@11.0.17
>> /AbstractQueuedSynchronizer.java:2081)
>>
>>        at java.util.concurrent.LinkedBlockingQueue.take(java.base@11.0.17
>> /LinkedBlockingQueue.java:433)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:275)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$Lambda$9429/0x0000000802269840.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:228)
>>
>>        - locked <0x0000000790d00208> (a java.lang.Object)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:370)
>>
>>        at
>> org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:355)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan$$Lambda$8573/0x0000000801f99c40.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan$$Lambda$8574/0x0000000801f9a040.apply(Unknown
>> Source)
>>
>>        at
>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
>>
>>        at
>> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
>>
>>        at
>> org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:172)
>>
>>        - locked <0x0000000790d00218> (a
>> org.apache.spark.sql.execution.QueryExecution)
>>
>>        at
>> org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:171)
>>
>>        at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:3247)
>>
>>        - locked <0x0000000790d002d8> (a org.apache.spark.sql.Dataset)
>>
>>        at org.apache.spark.sql.Dataset.rdd(Dataset.scala:3245)
>>
>>        at
>> org.apache.spark.sql.QueryTest$.$anonfun$getErrorMessageInCheckAnswer$1(QueryTest.scala:265)
>>
>>        at
>> org.apache.spark.sql.QueryTest$$$Lambda$8564/0x0000000801f94440.apply$mcJ$sp(Unknown
>> Source)
>>
>>        at
>> scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.scala:17)
>>
>>        at
>> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>>
>>        at
>> org.apache.spark.sql.QueryTest$.getErrorMessageInCheckAnswer(QueryTest.scala:265)
>>
>>        at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:242)
>>
>>        at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
>>
>>        at org.apache.spark.sql.JoinSuite.checkAnswer(JoinSuite.scala:58)
>>
>>        at
>> org.apache.spark.sql.JoinSuite.$anonfun$new$138(JoinSuite.scala:1062)
>>
>>        at
>> org.apache.spark.sql.JoinSuite$$Lambda$2827/0x00000008013d5840.apply$mcV$sp(Unknown
>> Source)
>>
>>        at
>> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
>>
>>        at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
>>
>>        at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
>>
>>        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>>
>>        at org.scalatest.Transformer.apply(Transformer.scala:22)
>>
>>        at org.scalatest.Transformer.apply(Transformer.scala:20)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
>>
>>        at
>> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:190)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8386/0x0000000801f0a840.apply(Unknown
>> Source)
>>
>>        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
>>
>>        at org.apache.spark.SparkFunSuite.org
>> <http://org.apache.spark.sparkfunsuite.org/>
>> $scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
>>
>>        at
>> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
>>
>>        at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8382/0x0000000801f0e840.apply(Unknown
>> Source)
>>
>>        at
>> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
>>
>>        at
>> org.scalatest.SuperEngine$$Lambda$8383/0x0000000801f0d840.apply(Unknown
>> Source)
>>
>>        at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>>
>>        at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
>>
>>        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
>>
>>        at org.scalatest.Suite.run(Suite.scala:1112)
>>
>>        at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>        at org.scalatest.funsuite.AnyFunSuite.org
>> <http://org.scalatest.funsuite.anyfunsuite.org/>
>> $scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike$$Lambda$8376/0x0000000801f07840.apply(Unknown
>> Source)
>>
>>        at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
>>
>>        at
>> org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
>>
>>        at org.apache.spark.SparkFunSuite.org
>> <http://org.apache.spark.sparkfunsuite.org/>
>> $scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:62)
>>
>>        at
>> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
>>
>>        at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
>>
>>        at
>> org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
>>
>>        at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
>>
>>        at org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
>>
>>        at org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
>>
>>        at
>> org.scalatest.Suite$$Lambda$7247/0x000000080193d040.apply(Unknown Source)
>>
>>        at
>> scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
>>
>>        at org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
>>
>>        at org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
>>
>>        at
>> org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
>>
>>        at org.scalatest.Suite.run(Suite.scala:1109)
>>
>>        at org.scalatest.Suite.run$(Suite.scala:1094)
>>
>>        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
>>
>>        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
>>
>>        at
>> org.scalatest.tools.Runner$$$Lambda$7245/0x000000080193e840.apply(Unknown
>> Source)
>>
>>        at scala.collection.immutable.List.foreach(List.scala:333)
>>
>>        at
>> org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1316)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
>>
>>        at
>> org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
>>
>>        at
>> org.scalatest.tools.Runner$$$Lambda$60/0x0000000800148040.apply(Unknown
>> Source)
>>
>>        at
>> org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
>>
>>        at
>> org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
>>
>>        at org.scalatest.tools.Runner$.main(Runner.scala:775)
>>
>>        at org.scalatest.tools.Runner.main(Runner.scala)
>>
>> ```
>>
>> I think the test case being executed is `SPARK-28323: PythonUDF should be
>> able to use in join condition`. Does anyone else have the same problem?
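>>
>> In case it helps to reproduce, a sketch for re-running just that suite
>> after the full `install` above (the exact flags are an assumption; Spark's
>> developer docs describe running single suites this way):
>>
>> ```
>> build/mvn -Pscala-2.13 -Phive -Phive-thriftserver test \
>>   -pl sql/core -Dtest=none \
>>   -DwildcardSuites=org.apache.spark.sql.JoinSuite
>> ```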
>>
>>
>>
>> Yang Jie
>>
>>
>>
>>
>>
>> *From:* huaxin gao <huaxin.ga...@gmail.com>
>> *Date:* Tuesday, November 15, 2022, 13:59
>> *To:* "L. C. Hsieh" <vii...@gmail.com>
>> *Cc:* Dongjoon Hyun <dongjoon.h...@gmail.com>, Chao Sun <sunc...@apache.org>, dev <dev@spark.apache.org>
>> *Subject:* Re: [VOTE] Release Spark 3.2.3 (RC1)
>>
>>
>>
>> +1
>>
>>
>>
>> Thanks Chao!
>>
>>
>>
>> On Mon, Nov 14, 2022 at 9:37 PM L. C. Hsieh <vii...@gmail.com> wrote:
>>
>> +1
>>
>> Thanks Chao.
>>
>> On Mon, Nov 14, 2022 at 6:55 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
>> wrote:
>> >
>> > +1
>> >
>> > Thank you, Chao.
>> >
>> > On Mon, Nov 14, 2022 at 4:12 PM Chao Sun <sunc...@apache.org> wrote:
>> >>
>> >> Please vote on releasing the following candidate as Apache Spark
>> version 3.2.3.
>> >>
>> >> The vote is open until 11:59pm Pacific time Nov 17th and passes if a
>> >> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> >>
>> >> [ ] +1 Release this package as Apache Spark 3.2.3
>> >> [ ] -1 Do not release this package because ...
>> >>
>> >> To learn more about Apache Spark, please see http://spark.apache.org/
>> >>
>> >> The tag to be voted on is v3.2.3-rc1 (commit
>> >> b53c341e0fefbb33d115ab630369a18765b7763d):
>> >> https://github.com/apache/spark/tree/v3.2.3-rc1
>> >>
>> >> The release files, including signatures, digests, etc. can be found at:
>> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/
>> >>
>> >> Signatures used for Spark RCs can be found in this file:
>> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >>
>> >> The staging repository for this release can be found at:
>> >>
>> >> https://repository.apache.org/content/repositories/orgapachespark-1431/
>> >>
>> >> The documentation corresponding to this release can be found at:
>> >> https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-docs/
>> >>
>> >> The list of bug fixes going into 3.2.3 can be found at the following URL:
>> >> https://issues.apache.org/jira/projects/SPARK/versions/12352105
>> >>
>> >> This release uses the release script from the v3.2.3-rc1 tag.
>> >>
>> >>
>> >> FAQ
>> >>
>> >> =========================
>> >> How can I help test this release?
>> >> =========================
>> >> If you are a Spark user, you can help us test this release by taking
>> >> an existing Spark workload, running it on this release candidate, and
>> >> reporting any regressions.
>> >>
>> >> If you're working in PySpark, you can set up a virtual env, install
>> >> the current RC, and see if anything important breaks. In Java/Scala,
>> >> you can add the staging repository to your project's resolvers and test
>> >> with the RC (make sure to clean up the artifact cache before/after so
>> >> you don't end up building with an out-of-date RC going forward).
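>> >>
>> >> For instance, a minimal PySpark smoke test might look like the sketch
>> >> below (the pyspark tarball name is assumed from the RC bin directory
>> >> layout):
>> >>
>> >> ```
>> >> # Create a throwaway virtual env and install the RC's pyspark tarball
>> >> python -m venv rc-test && source rc-test/bin/activate
>> >> pip install https://dist.apache.org/repos/dist/dev/spark/v3.2.3-rc1-bin/pyspark-3.2.3.tar.gz
>> >> python -c "import pyspark; print(pyspark.__version__)"  # expect 3.2.3
>> >> ```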
>> >>
>> >> ===========================================
>> >> What should happen to JIRA tickets still targeting 3.2.3?
>> >> ===========================================
>> >> The current list of open tickets targeted at 3.2.3 can be found at:
>> >> https://issues.apache.org/jira/projects/SPARK and search for
>> >> "Target Version/s" = 3.2.3
>> >>
>> >> Committers should look at those and triage. Extremely important bug
>> >> fixes, documentation, and API tweaks that impact compatibility should
>> >> be worked on immediately. Everything else please retarget to an
>> >> appropriate release.
>> >>
>> >> ==================
>> >> But my bug isn't fixed?
>> >> ==================
>> >> In order to make timely releases, we will typically not hold the
>> >> release unless the bug in question is a regression from the previous
>> >> release. That being said, if there is something which is a regression
>> >> that has not been correctly targeted please ping me or a committer to
>> >> help target the issue.
>> >>
>> >> ---------------------------------------------------------------------
>> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
>
