Thanks all!

Let me check all exceptions and submit a PR. I'll do it now if nobody has
created one yet.

- Anton

On 2023/04/05 19:09:13 Xiao Li wrote:
> Hi, Anton,
> 
> Could you please provide a complete list of exceptions that are being used
> in the public connector API?
> 
> Thanks,
> 
> Xiao
> 
> Xinrong Meng <xinrong.apa...@gmail.com> wrote on Wednesday, April 5, 2023 at 12:06:
> 
> > Thank you!
> >
> > I created a blocker Jira for that for easier tracking:
> > https://issues.apache.org/jira/browse/SPARK-43041.
> >
> >
> > On Wed, Apr 5, 2023 at 11:20 AM Gengliang Wang <ltn...@gmail.com> wrote:
> >
> >> Hi Anton,
> >>
> >> +1 for adding the old constructors back!
> >> Could you raise a PR for this? I will review it ASAP.
> >>
> >> Thanks
> >> Gengliang
> >>
> >> On Wed, Apr 5, 2023 at 9:37 AM Anton Okolnychyi <aokolnyc...@apache.org>
> >> wrote:
> >>
> >>> Sorry, I think my last message did not land on the list.
> >>>
> >>> I have a question about changes to exceptions used in the public
> >>> connector API, such as NoSuchTableException and
> >>> TableAlreadyExistsException.
> >>>
> >>> I consider these part of the public Catalog API (TableCatalog uses
> >>> them in method definitions). However, it looks like PR #37887 [1][2]
> >>> changed them in an incompatible way: the old constructors accepting
> >>> Identifier objects were removed, and the only way to construct such
> >>> exceptions now is to pass database and table name strings or a Scala
> >>> Seq [3]. Shall we add the old constructors back to avoid breaking
> >>> connectors?
> >>>
> >>> [1] - https://github.com/apache/spark/pull/37887/
> >>> [2] - https://issues.apache.org/jira/browse/SPARK-40360
> >>> [3] -
> >>> https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/NoSuchItemException.scala
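To make the compatibility concern concrete, here is a minimal sketch of the kind of fix being proposed: re-adding an Identifier-accepting constructor overload next to the string-based one. The Identifier class below is a simplified stand-in for illustration, not Spark's actual org.apache.spark.sql.connector.catalog.Identifier, and the class bodies are hypothetical, not the real Spark source.

```java
// Simplified stand-in for the connector Identifier (namespace + name).
class Identifier {
    final String[] namespace;
    final String name;

    Identifier(String[] namespace, String name) {
        this.namespace = namespace;
        this.name = name;
    }

    // Dotted form, e.g. "ns1.ns2.table"
    String quoted() {
        return String.join(".", namespace) + "." + name;
    }
}

class NoSuchTableException extends Exception {
    // Newer constructor taking database and table name strings
    NoSuchTableException(String db, String table) {
        super("Table or view '" + db + "." + table + "' not found");
    }

    // Re-added overload accepting an Identifier, so connectors compiled
    // against the old API keep working without source changes
    NoSuchTableException(Identifier ident) {
        super("Table or view '" + ident.quoted() + "' not found");
    }
}
```

With both overloads present, existing connector code that constructs the exception from an Identifier continues to compile alongside the new string-based callers.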
> >>>
> >>> - Anton
> >>>
> >>> On 2023/04/05 16:23:52 Xinrong Meng wrote:
> >>> > Considering the above blockers have been resolved, I am about to
> >>> > cut v3.4.0-rc6 if there are no objections.
> >>> >
> >>> > On Tue, Apr 4, 2023 at 8:20 AM Xinrong Meng <xinrong.apa...@gmail.com>
> >>> > wrote:
> >>> >
> >>> > > Thank you Wenchen for the report. I marked them as blockers just now.
> >>> > >
> >>> > > On Tue, Apr 4, 2023 at 10:52 AM Wenchen Fan <cloud0...@gmail.com>
> >>> wrote:
> >>> > >
> >>> > >> Sorry for the last-minute change, but we found two incorrect
> >>> > >> behaviors and want to fix them before the release:
> >>> > >>
> >>> > >> https://github.com/apache/spark/pull/40641
> >>> > >> We missed a corner case when the input index for `array_insert`
> >>> > >> is 0. It should fail, as 0 is an invalid index.
> >>> > >>
> >>> > >> https://github.com/apache/spark/pull/40623
> >>> > >> We found some usability issues with a new API and need to change
> >>> > >> the API to fix them. If people have concerns, we can also remove
> >>> > >> the new API entirely.
> >>> > >>
> >>> > >> Thus I'm -1 to this RC. I'll merge these 2 PRs today if no
> >>> objections.
> >>> > >>
> >>> > >> Thanks,
> >>> > >> Wenchen
> >>> > >>
> >>> > >> On Tue, Apr 4, 2023 at 3:47 AM L. C. Hsieh <vii...@gmail.com>
> >>> wrote:
> >>> > >>
> >>> > >>> +1
> >>> > >>>
> >>> > >>> Thanks Xinrong.
> >>> > >>>
> >>> > >>> On Mon, Apr 3, 2023 at 12:35 PM Dongjoon Hyun <
> >>> dongjoon.h...@gmail.com>
> >>> > >>> wrote:
> >>> > >>> >
> >>> > >>> > +1
> >>> > >>> >
> >>> > >>> > I also verified that RC5 has SBOM artifacts.
> >>> > >>> >
> >>> > >>> >
> >>> > >>>
> >>> https://repository.apache.org/content/repositories/orgapachespark-1439/org/apache/spark/spark-core_2.12/3.4.0/spark-core_2.12-3.4.0-cyclonedx.json
> >>> > >>> >
> >>> > >>>
> >>> https://repository.apache.org/content/repositories/orgapachespark-1439/org/apache/spark/spark-core_2.13/3.4.0/spark-core_2.13-3.4.0-cyclonedx.json
> >>> > >>> >
> >>> > >>> > Thanks,
> >>> > >>> > Dongjoon.
> >>> > >>> >
> >>> > >>> >
> >>> > >>> >
> >>> > >>> > On Mon, Apr 3, 2023 at 1:57 AM yangjie01 <yangji...@baidu.com>
> >>> wrote:
> >>> > >>> >>
> >>> > >>> >> +1, checked Java 17 + Scala 2.13 + Python 3.10.10.
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >> From: Herman van Hovell <her...@databricks.com.INVALID>
> >>> > >>> >> Date: Friday, March 31, 2023, 12:12
> >>> > >>> >> To: Sean Owen <sro...@apache.org>
> >>> > >>> >> Cc: Xinrong Meng <xinrong.apa...@gmail.com>, dev <dev@spark.apache.org>
> >>> > >>> >> Subject: Re: [VOTE] Release Apache Spark 3.4.0 (RC5)
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >> +1
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >> On Thu, Mar 30, 2023 at 11:05 PM Sean Owen <sro...@apache.org>
> >>> wrote:
> >>> > >>> >>
> >>> > >>> >> +1 same result from me as last time.
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >> On Thu, Mar 30, 2023 at 3:21 AM Xinrong Meng <
> >>> > >>> xinrong.apa...@gmail.com> wrote:
> >>> > >>> >>
> >>> > >>> >> Please vote on releasing the following candidate (RC5) as
> >>> > >>> >> Apache Spark version 3.4.0.
> >>> > >>> >>
> >>> > >>> >> The vote is open until 11:59 pm Pacific time on April 4th and
> >>> > >>> >> passes if a majority of +1 PMC votes are cast, with a minimum
> >>> > >>> >> of 3 +1 votes.
> >>> > >>> >>
> >>> > >>> >> [ ] +1 Release this package as Apache Spark 3.4.0
> >>> > >>> >> [ ] -1 Do not release this package because ...
> >>> > >>> >>
> >>> > >>> >> To learn more about Apache Spark, please see
> >>> http://spark.apache.org/
> >>> > >>> >>
> >>> > >>> >> The tag to be voted on is v3.4.0-rc5 (commit
> >>> > >>> f39ad617d32a671e120464e4a75986241d72c487):
> >>> > >>> >> https://github.com/apache/spark/tree/v3.4.0-rc5
> >>> > >>> >>
> >>> > >>> >> The release files, including signatures, digests, etc. can be
> >>> found
> >>> > >>> at:
> >>> > >>> >> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc5-bin/
> >>> > >>> >>
> >>> > >>> >> Signatures used for Spark RCs can be found in this file:
> >>> > >>> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>> > >>> >>
> >>> > >>> >> The staging repository for this release can be found at:
> >>> > >>> >>
> >>> > >>>
> >>> https://repository.apache.org/content/repositories/orgapachespark-1439
> >>> > >>> >>
> >>> > >>> >> The documentation corresponding to this release can be found at:
> >>> > >>> >> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc5-docs/
> >>> > >>> >>
> >>> > >>> >> The list of bug fixes going into 3.4.0 can be found at the
> >>> following
> >>> > >>> URL:
> >>> > >>> >> https://issues.apache.org/jira/projects/SPARK/versions/12351465
> >>> > >>> >>
> >>> > >>> >> This release is using the release script of the tag v3.4.0-rc5.
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >> FAQ
> >>> > >>> >>
> >>> > >>> >> =========================
> >>> > >>> >> How can I help test this release?
> >>> > >>> >> =========================
> >>> > >>> >> If you are a Spark user, you can help us test this release by
> >>> > >>> >> taking an existing Spark workload and running it on this
> >>> > >>> >> release candidate, then reporting any regressions.
> >>> > >>> >>
> >>> > >>> >> If you're working in PySpark, you can set up a virtual env and
> >>> > >>> >> install the current RC to see if anything important breaks. In
> >>> > >>> >> Java/Scala, you can add the staging repository to your
> >>> > >>> >> project's resolvers and test with the RC (make sure to clean
> >>> > >>> >> up the artifact cache before/after so you don't end up
> >>> > >>> >> building with an out-of-date RC going forward).
> >>> > >>> >>
> >>> > >>> >> ===========================================
> >>> > >>> >> What should happen to JIRA tickets still targeting 3.4.0?
> >>> > >>> >> ===========================================
> >>> > >>> >> The current list of open tickets targeted at 3.4.0 can be
> >>> > >>> >> found at: https://issues.apache.org/jira/projects/SPARK
> >>> > >>> >> (search for "Target Version/s" = 3.4.0).
> >>> > >>> >>
> >>> > >>> >> Committers should look at those and triage. Extremely important
> >>> bug
> >>> > >>> >> fixes, documentation, and API tweaks that impact compatibility
> >>> should
> >>> > >>> >> be worked on immediately. Everything else please retarget to an
> >>> > >>> >> appropriate release.
> >>> > >>> >>
> >>> > >>> >> ==================
> >>> > >>> >> But my bug isn't fixed?
> >>> > >>> >> ==================
> >>> > >>> >> In order to make timely releases, we will typically not hold the
> >>> > >>> >> release unless the bug in question is a regression from the
> >>> previous
> >>> > >>> >> release. That being said, if there is something which is a
> >>> regression
> >>> > >>> >> that has not been correctly targeted please ping me or a
> >>> committer to
> >>> > >>> >> help target the issue.
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>> >> Thanks,
> >>> > >>> >>
> >>> > >>> >> Xinrong Meng
> >>> > >>> >>
> >>> > >>> >>
> >>> > >>>
> >>> > >>>
> >>> ---------------------------------------------------------------------
> >>> > >>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>> > >>>
> >>> > >>>
> >>> >
> >>>
> >>>
> >>>
> 

