Hi, Karen.

Are you saying that Spark 3 has to keep all of the deprecated 2.x APIs?
Could you tell us what your criteria are for `unnecessarily` or
`necessarily`?

> the migration process from Spark 2 to Spark 3 unnecessarily painful.

Bests,
Dongjoon.


On Tue, Feb 18, 2020 at 4:55 PM Karen Feng <karen.f...@databricks.com>
wrote:

> Hi all,
>
> I am concerned that the API-breaking changes in SPARK-25908 (as well as
> SPARK-16775, and potentially others) will make the migration process from
> Spark 2 to Spark 3 unnecessarily painful. For example, the removal of
> SQLContext.getOrCreate will break a large number of libraries currently
> built on Spark 2.
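>
> A library that calls SQLContext.getOrCreate(sc) today could keep compiling
> against both lines by going through SparkSession instead, roughly like this
> (just a sketch; sqlContextFor is an illustrative helper name, not code from
> any particular library):
>
>   import org.apache.spark.SparkContext
>   import org.apache.spark.sql.{SQLContext, SparkSession}
>
>   // SparkSession exists in both Spark 2.x and 3.x, and getOrCreate reuses
>   // the active session/context if one is already running.
>   def sqlContextFor(sc: SparkContext): SQLContext =
>     SparkSession.builder().config(sc.getConf).getOrCreate().sqlContext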
>
> Even if library developers do not use deprecated APIs, API changes between
> 2.x and 3.x will result in inconsistencies that require hacking around. For
> a fairly small and new (2.4.3+) genomics library, I had to create a number
> of shims (https://github.com/projectglow/glow/pull/155) for the source and
> test code due to API changes in SPARK-25393, SPARK-27328, and SPARK-28744.
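>
> One common way to structure such shims is to hide each call that changed
> behind a small trait and compile one implementation per Spark version from
> a version-specific source directory. Roughly (illustrative names only, not
> the actual code in the PR above):
>
>   import org.apache.spark.sql.DataFrame
>
>   // Library and test code only call the trait; Spark2Shims and Spark3Shims
>   // objects would live in build-time-selected source folders, each compiled
>   // against exactly one Spark line.
>   trait SparkShims {
>     // placeholder for whichever API moved between 2.x and 3.x
>     def explainString(df: DataFrame): String
>   }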
>
> It would be best practice to avoid breaking existing APIs in order to ease
> library development. To avoid similar deprecated-API issues down the
> road, we should also exercise more prudence when considering new API proposals.
>
> I'd love to see more discussion on this.
>
