Great job, everyone!! Do we have any tentative GA dates yet?
Thanks and Regards,
Ajay.


On Tue, Dec 24, 2019 at 5:11 PM Star <s...@fallenstar.xyz> wrote:

> Awesome work. Thanks and happy holidays~!
>
>
> On 2019-12-25 04:52, Yuming Wang wrote:
> > Hi all,
> >
> > To enable wide-scale community testing of the upcoming Spark 3.0
> > release, the Apache Spark community has posted a new preview release
> > of Spark 3.0. This preview is not a stable release in terms of either
> > API or functionality, but it is meant to give the community early
> > access to try the code that will become Spark 3.0. If you would like
> > to test the release, please download it, and send feedback using
> > either the mailing lists [1] or JIRA [2].
> >
> > There are a lot of exciting new features added to Spark 3.0, including
> > Dynamic Partition Pruning, Adaptive Query Execution, Accelerator-aware
> > Scheduling, Data Source API with Catalog Supports, Vectorization in
> > SparkR, support of Hadoop 3/JDK 11/Scala 2.12, and many more. For a
> > full list of major features and changes in Spark 3.0.0-preview2,
> > please check the threads
> > (http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-3-0-preview-release-feature-list-and-major-changes-td28050.html
> > and
> > http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-3-0-preview-release-2-td28491.html).
> >
> > We'd like to thank our contributors and users for their contributions
> > and early feedback to this release. This release would not have been
> > possible without you.
> >
> > To download Spark 3.0.0-preview2, head over to the download page:
> > https://archive.apache.org/dist/spark/spark-3.0.0-preview2
> >
> > Happy Holidays.
> >
> > Yuming
> >
> > Links:
> > ------
> > [1] https://spark.apache.org/community.html
> > [2]
> > https://issues.apache.org/jira/projects/SPARK?selectedItem=com.atlassian.jira.jira-projects-plugin%3Asummary-page
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
