That seems like an important concern. I'm going to go ahead and vote -1 on this RC and I'll roll a new RC once the IndyLambda support is backported into the 2.4 branch.
On Mon, May 18, 2020 at 2:58 PM DB Tsai <dbt...@dbtsai.com> wrote:
> I am changing my vote from +1 to +0.
>
> Since Spark 3.0 is Scala 2.12 only, having a transitional 2.4.x
> release with great support of Scala 2.12 is very important. I would
> like to have [SPARK-31399][CORE] Support indylambda Scala closure in
> ClosureCleaner backported. Without it, it might break users' code when
> upgrading from Scala 2.11 to Scala 2.12.
>
> Thanks,
>
> Sincerely,
>
> DB Tsai
> ----------------------------------------------------------
> Web: https://www.dbtsai.com
> PGP Key ID: 42E5B25A8F7A82C1
>
> On Mon, May 18, 2020 at 2:47 PM Holden Karau <hol...@pigscanfly.ca> wrote:
> >
> > Another two candidates for backporting that have come up since this RC
> > are SPARK-31692 & SPARK-31399. What are folks' thoughts, should we roll an
> > RC4?
> >
> > On Mon, May 18, 2020 at 2:13 PM Sean Owen <sro...@apache.org> wrote:
> >>
> >> Ah OK, I assumed from the timing that this was cut to include that
> >> commit. I should have looked.
> >> Yes, it is not strictly a regression, so it does not have to block the
> >> release and this can pass. We can release 2.4.7 in a few months, too.
> >> How important is the fix? If it's pretty important, it may still be
> >> useful to run one more RC, if it's not too much trouble.
> >>
> >> On Mon, May 18, 2020 at 11:25 AM Holden Karau <hol...@pigscanfly.ca> wrote:
> >>>
> >>> That is correct. I asked on the PR if that was ok with folks before I
> >>> moved forward with the RC and was told that it was ok. I believe that
> >>> particular bug is not a regression and is a long-standing issue, so we
> >>> wouldn't normally block the release on it.
> >>>
> >>> On Mon, May 18, 2020 at 7:40 AM Xiao Li <lix...@databricks.com> wrote:
> >>>>
> >>>> This RC does not include the correctness bug fix
> >>>> https://github.com/apache/spark/commit/a4885f3654899bcb852183af70cc0a82e7dd81d0,
> >>>> which landed just after the RC3 cut.
> >>>>
> >>>> On Mon, May 18, 2020 at 7:21 AM Tom Graves <tgraves...@yahoo.com.invalid> wrote:
> >>>>>
> >>>>> +1.
> >>>>>
> >>>>> Tom
> >>>>>
> >>>>> On Monday, May 18, 2020, 08:05:24 AM CDT, Wenchen Fan <cloud0...@gmail.com> wrote:
> >>>>>
> >>>>> +1, no known blockers.
> >>>>>
> >>>>> On Mon, May 18, 2020 at 12:49 AM DB Tsai <dbt...@dbtsai.com> wrote:
> >>>>>
> >>>>> +1 as well. Thanks.
> >>>>>
> >>>>> On Sun, May 17, 2020 at 7:39 AM Sean Owen <sro...@apache.org> wrote:
> >>>>>
> >>>>> +1, same response as to the last RC.
> >>>>> This looks like it includes the fix discussed last time, as well as a
> >>>>> few more small good fixes.
> >>>>>
> >>>>> On Sat, May 16, 2020 at 12:08 AM Holden Karau <hol...@pigscanfly.ca> wrote:
> >>>>> >
> >>>>> > Please vote on releasing the following candidate as Apache Spark version 2.4.6.
> >>>>> >
> >>>>> > The vote is open until May 22nd at 9 AM PST and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>>>> >
> >>>>> > [ ] +1 Release this package as Apache Spark 2.4.6
> >>>>> > [ ] -1 Do not release this package because ...
> >>>>> >
> >>>>> > To learn more about Apache Spark, please see http://spark.apache.org/
> >>>>> >
> >>>>> > There are currently no issues targeting 2.4.6 (try project = SPARK AND "Target Version/s" = "2.4.6" AND status in (Open, Reopened, "In Progress"))
> >>>>> >
> >>>>> > The tag to be voted on is v2.4.6-rc3 (commit 570848da7c48ba0cb827ada997e51677ff672a39):
> >>>>> > https://github.com/apache/spark/tree/v2.4.6-rc3
> >>>>> >
> >>>>> > The release files, including signatures, digests, etc.,
> >>>>> > can be found at:
> >>>>> > https://dist.apache.org/repos/dist/dev/spark/v2.4.6-rc3-bin/
> >>>>> >
> >>>>> > Signatures used for Spark RCs can be found in this file:
> >>>>> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>>>> >
> >>>>> > The staging repository for this release can be found at:
> >>>>> > https://repository.apache.org/content/repositories/orgapachespark-1344/
> >>>>> >
> >>>>> > The documentation corresponding to this release can be found at:
> >>>>> > https://dist.apache.org/repos/dist/dev/spark/v2.4.6-rc3-docs/
> >>>>> >
> >>>>> > The list of bug fixes going into 2.4.6 can be found at the following URL:
> >>>>> > https://issues.apache.org/jira/projects/SPARK/versions/12346781
> >>>>> >
> >>>>> > This release is using the release script of the tag v2.4.6-rc3.
> >>>>> >
> >>>>> > FAQ
> >>>>> >
> >>>>> > =========================
> >>>>> > What happened to RC2?
> >>>>> > =========================
> >>>>> >
> >>>>> > My computer crashed part of the way through RC2, so I rolled RC3.
> >>>>> >
> >>>>> > =========================
> >>>>> > How can I help test this release?
> >>>>> > =========================
> >>>>> >
> >>>>> > If you are a Spark user, you can help us test this release by taking
> >>>>> > an existing Spark workload and running it on this release candidate, then
> >>>>> > reporting any regressions.
> >>>>> >
> >>>>> > If you're working in PySpark, you can set up a virtual env and install
> >>>>> > the current RC and see if anything important breaks; in Java/Scala,
> >>>>> > you can add the staging repository to your project's resolvers and test
> >>>>> > with the RC (make sure to clean up the artifact cache before/after so
> >>>>> > you don't end up building with an out-of-date RC going forward).
> >>>>> >
> >>>>> > ===========================================
> >>>>> > What should happen to JIRA tickets still targeting 2.4.6?
> >>>>> > ===========================================
> >>>>> >
> >>>>> > The current list of open tickets targeted at 2.4.6 can be found at:
> >>>>> > https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 2.4.6
> >>>>> >
> >>>>> > Committers should look at those and triage. Extremely important bug
> >>>>> > fixes, documentation, and API tweaks that impact compatibility should
> >>>>> > be worked on immediately. Everything else, please retarget to an
> >>>>> > appropriate release.
> >>>>> >
> >>>>> > ==================
> >>>>> > But my bug isn't fixed?
> >>>>> > ==================
> >>>>> >
> >>>>> > In order to make timely releases, we will typically not hold the
> >>>>> > release unless the bug in question is a regression from the previous
> >>>>> > release. That being said, if there is something which is a regression
> >>>>> > that has not been correctly targeted, please ping me or a committer to
> >>>>> > help target the issue.
> >>>>> >
> >>>>> > --
> >>>>> > Twitter: https://twitter.com/holdenkarau
> >>>>> > Books (Learning Spark, High Performance Spark, etc.): https://amzn.to/2MaRAG9
> >>>>> > YouTube Live Streams: https://www.youtube.com/user/holdenkarau
> >>>>>
> >>>>> ---------------------------------------------------------------------
> >>>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>>>>
> >>>>> --
> >>>>> - DB
> >>>>> Sent from my iPhone

--
Twitter: https://twitter.com/holdenkarau
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9
YouTube Live Streams: https://www.youtube.com/user/holdenkarau
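
[Editor's note] For readers following the "How can I help test this release?" section of the vote email above, the PySpark route might be sketched as shell commands like the following. The dist.apache.org URLs are quoted from the thread; the artifact file name (`pyspark-2.4.6.tar.gz`) and the exact `gpg`/`pip` invocations are assumptions based on Spark's usual release layout, not spelled out in the thread, and RC staging directories are removed once a release is finalized:

```shell
# Sketch only: verify an RC artifact's signature, then smoke-test it in a
# throwaway virtual env. Assumes curl, gpg, and python3 are installed.

# 1. Import the release signing keys and verify a downloaded artifact
#    (the file name under v2.4.6-rc3-bin/ is an assumption).
curl -O https://dist.apache.org/repos/dist/dev/spark/KEYS
gpg --import KEYS
curl -O https://dist.apache.org/repos/dist/dev/spark/v2.4.6-rc3-bin/pyspark-2.4.6.tar.gz
curl -O https://dist.apache.org/repos/dist/dev/spark/v2.4.6-rc3-bin/pyspark-2.4.6.tar.gz.asc
gpg --verify pyspark-2.4.6.tar.gz.asc pyspark-2.4.6.tar.gz

# 2. Install the RC into a clean virtual env, so it cannot pick up a
#    previously installed Spark, and run your existing workload against it.
python3 -m venv rc-test
. rc-test/bin/activate
pip install pyspark-2.4.6.tar.gz
python -c "import pyspark; print(pyspark.__version__)"
deactivate
```

For the Java/Scala route the thread mentions, the analogous step would be adding the quoted staging repository (https://repository.apache.org/content/repositories/orgapachespark-1344/) to your build's resolvers, then clearing the local artifact cache afterwards as the email advises.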