It appears all 3 issues slated for Spark 2.4.7 have been merged. Should we
be looking at getting RC2 ready?


                                                                                
   
Regards,

NICHOLAS T. MARION
IBM Open Data Analytics for z/OS - CPO and Service Team Lead
Phone: 1-845-433-5010 | Tie-Line: 293-5010
E-mail: nmar...@us.ibm.com
2455 South Rd, Poughkeepsie, New York 12601-5400, United States

From:   Xiao Li <lix...@databricks.com>
To:     Prashant Sharma <scrapco...@gmail.com>
Cc:     Takeshi Yamamuro <linguin....@gmail.com>, dev
            <dev@spark.apache.org>
Date:   08/17/2020 11:33 AM
Subject:        [EXTERNAL] Re: [VOTE] Release Spark 2.4.7 (RC1)



https://issues.apache.org/jira/browse/SPARK-32609 has been merged. It
fixes a correctness bug in DSv2 in Spark 2.4. Please include it in the
upcoming Spark 2.4.7 release.

Thanks,

Xiao

On Sun, Aug 9, 2020 at 10:26 PM Prashant Sharma <scrapco...@gmail.com>
wrote:
  Thanks for letting us know. So this vote is cancelled in favor of RC2.



  On Sun, Aug 9, 2020 at 8:31 AM Takeshi Yamamuro <linguin....@gmail.com>
  wrote:
   Thanks for letting us know about the two issues above, Dongjoon.

   ----
   I've checked the release materials (signatures, tag, ...) and they look
   fine, too.
   Also, I ran the tests on my local Mac (Java 1.8.0) with the options
   `-Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver -Pmesos -Pkubernetes
   -Psparkr`, and they passed.

   Bests,
   Takeshi



   On Sun, Aug 9, 2020 at 11:06 AM Dongjoon Hyun <dongjoon.h...@gmail.com>
   wrote:
      Another instance is SPARK-31703, which was filed on May 13th; its PR
      arrived two days ago.

         [SPARK-31703][SQL] Parquet RLE float/double are read incorrectly
     on big endian platforms
         https://github.com/apache/spark/pull/29383

      It seems that the patch is already available in this case.
     I raised the priority of SPARK-31703 to `Blocker` for both Apache
     Spark 2.4.7 and 3.0.1.
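
      For anyone checking the RC on a big-endian host, a rough sketch (not
      the actual Parquet reader code; just the class of byte-order mistake
      involved) of why little-endian-encoded Parquet floats must not be
      decoded with the platform's native order:

        import java.nio.{ByteBuffer, ByteOrder}

        // Little-endian encoding of the IEEE-754 float 10.0f (0x41200000),
        // which is how Parquet stores FLOAT values on disk.
        val bytes = Array[Byte](0x00, 0x00, 0x20, 0x41)

        // Wrong on big-endian hosts: interprets the bytes in native order.
        val broken  = ByteBuffer.wrap(bytes).order(ByteOrder.nativeOrder()).getFloat
        // Correct everywhere: always decode Parquet values as little-endian.
        val correct = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN).getFloat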

     Bests,
     Dongjoon.


     On Sat, Aug 8, 2020 at 6:10 AM Holden Karau <hol...@pigscanfly.ca>
     wrote:
       I'm going to go ahead and vote -0 based on that, then.

      On Fri, Aug 7, 2020 at 11:36 PM Dongjoon Hyun <
      dongjoon.h...@gmail.com> wrote:
        Hi, All.

         Unfortunately, there is an ongoing discussion about the new
         decimal correctness issue.

         Although we fixed one correctness issue on master and partially
         backported it to 3.0/2.4, it turns out that more patches are
         needed to make it complete.

         Please see https://github.com/apache/spark/pull/29125 for the
         ongoing discussion covering both 3.0 and 2.4.

            [SPARK-32018][SQL][3.0] UnsafeRow.setDecimal should set null
        with overflowed value

        I also confirmed that 2.4.7 RC1 is affected.
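
         For context, a minimal spark-shell sketch of the kind of query
         where the overflow handling matters (illustrative only, assuming
         the default non-ANSI decimal behavior; not the reproduction from
         the PR):

           // An aggregate whose decimal result overflows DECIMAL(38,0).
           // With correct overflow handling the sum comes back as NULL;
           // the bug can instead surface a wrong, non-null value.
           val df = spark.sql(
             "SELECT CAST(repeat('9', 38) AS DECIMAL(38,0)) AS d FROM range(2)")
           df.selectExpr("sum(d)").show(false)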

        Bests,
        Dongjoon.


        On Thu, Aug 6, 2020 at 2:48 PM Sean Owen <sro...@apache.org> wrote:
          +1 from me. The same as usual. Licenses and sigs look OK, builds
          and
          passes tests on a standard selection of profiles.

          On Thu, Aug 6, 2020 at 7:07 AM Prashant Sharma <
          scrapco...@gmail.com> wrote:
          >
          > Please vote on releasing the following candidate as Apache
          Spark version 2.4.7.
          >
          > The vote is open until Aug 9th at 9AM PST and passes if a
          majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
          >
          > [ ] +1 Release this package as Apache Spark 2.4.7
          > [ ] -1 Do not release this package because ...
          >
          > To learn more about Apache Spark, please see
          http://spark.apache.org/
          >
          > There are currently no issues targeting 2.4.7 (try project =
          SPARK AND "Target Version/s" = "2.4.7" AND status in (Open,
          Reopened, "In Progress"))
          >
          > The tag to be voted on is v2.4.7-rc1 (commit
          dc04bf53fe821b7a07f817966c6c173f3b3788c6):
          > https://github.com/apache/spark/tree/v2.4.7-rc1
          >
          > The release files, including signatures, digests, etc. can be
          found at:
          > https://dist.apache.org/repos/dist/dev/spark/v2.4.7-rc1-bin/
          >
          > Signatures used for Spark RCs can be found in this file:
          > https://dist.apache.org/repos/dist/dev/spark/KEYS
          >
          > The staging repository for this release can be found at:
          >
          > https://repository.apache.org/content/repositories/orgapachespark-1352/
          >
          > The documentation corresponding to this release can be found
          at:
          > https://dist.apache.org/repos/dist/dev/spark/v2.4.7-rc1-docs/
          >
          > The list of bug fixes going into 2.4.7 can be found at the
          following URL:
          > https://s.apache.org/spark-v2.4.7-rc1
          >
          > This release is using the release script of the tag v2.4.7-rc1.
          >
          > FAQ
          >
          >
          > =========================
          > How can I help test this release?
          > =========================
          >
          > If you are a Spark user, you can help us test this release by
          taking
          > an existing Spark workload and running on this release
          candidate, then
          > reporting any regressions.
          >
          > If you're working in PySpark, you can set up a virtual env,
          > install the current RC, and see if anything important breaks. In
          > Java/Scala, you can add the staging repository to your project's
          > resolvers and test with the RC (make sure to clean up the
          > artifact cache before/after so you don't end up building with an
          > out-of-date RC going forward); see the build.sbt sketch after
          > this quoted mail.
          >
          > ===========================================
          > What should happen to JIRA tickets still targeting 2.4.7?
          > ===========================================
          >
          > The current list of open tickets targeted at 2.4.7 can be found
          at:
          > https://issues.apache.org/jira/projects/SPARK and search for
          "Target Version/s" = 2.4.7
          >
          > Committers should look at those and triage. Extremely important
          bug
          > fixes, documentation, and API tweaks that impact compatibility
          should
          > be worked on immediately. Everything else please retarget to an
          > appropriate release.
          >
          > ==================
          > But my bug isn't fixed?
          > ==================
          >
          > In order to make timely releases, we will typically not hold
          the
          > release unless the bug in question is a regression from the
          previous
          > release. That being said, if there is something which is a
          regression
          > that has not been correctly targeted please ping me or a
          committer to
          > help target the issue.

          ---------------------------------------------------------------------

          To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
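
          Following up on the Java/Scala testing path described in the
          quoted mail: a minimal build.sbt sketch for pointing a test
          project at the staging repository above (coordinates shown for
          Scala 2.11; treat this as an illustration and adapt it to your
          own build):

            // build.sbt -- compile and test against the 2.4.7 RC1 staging
            // artifacts; the RC is staged under the final version number.
            scalaVersion := "2.11.12"

            resolvers += "Apache Spark 2.4.7 RC1 staging" at
              "https://repository.apache.org/content/repositories/orgapachespark-1352/"

            // Standard Spark SQL coordinates for a Spark application build.
            libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.7" % "provided"

          After testing, clear the cached 2.4.7 artifacts (for sbt, typically
          under ~/.ivy2/cache/org.apache.spark) so a later build does not
          pick up the stale RC, as the quoted mail suggests.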



      --
      Twitter: https://twitter.com/holdenkarau
      Books (Learning Spark, High Performance Spark, etc.):
      https://amzn.to/2MaRAG9
      YouTube Live Streams: https://www.youtube.com/user/holdenkarau


   --
   ---
   Takeshi Yamamuro

