The old APIs are still supported (SQLContext and HiveContext are kept for backward compatibility), but you are advised to migrate to the new SparkSession entry point.
I migrated a few apps from 1.6 to 2.0 with minimal changes; see the sketch below.
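For a rough idea, here is a minimal sketch of what the entry-point change can look like in Scala (the app name, input path, and accumulator name are invented for illustration):

import org.apache.spark.sql.{DataFrame, Dataset, Row, SparkSession}

object EntryPointMigration {
  def main(args: Array[String]): Unit = {
    // Spark 2.0: SparkSession replaces SQLContext/HiveContext as the entry point.
    val spark = SparkSession.builder()
      .appName("EntryPointMigration")  // hypothetical app name
      .enableHiveSupport()             // only if you previously used HiveContext
      .getOrCreate()

    // 1.x: sqlContext.read.json(...)  ->  2.x: the same call, off the session.
    val df: DataFrame = spark.read.json("people.json")  // hypothetical path

    // DataFrame is just a type alias for Dataset[Row] in 2.x, so this compiles as-is.
    val ds: Dataset[Row] = df

    // The simplified 2.0 accumulator API hangs off the SparkContext.
    val rowCount = spark.sparkContext.longAccumulator("rowCount")
    ds.foreach(_ => rowCount.add(1))
    println(s"rows seen: ${rowCount.value}")

    spark.stop()
  }
}

Since SQLContext and HiveContext are kept for backward compatibility, you can also make this switch incrementally rather than all at once.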
Hope this helps.

On 10 Jan 2017 4:14 pm, "pradeepbill" <pradeep.b...@gmail.com> wrote:

> hi there, I am using Spark 1.4 code and we now plan to move to Spark 2.0.
> When I check the documentation below, only a few features are listed as
> backward compatible. Does that mean I have to change most of my code?
> Please advise.
>
> One of the largest changes in Spark 2.0 is the new updated APIs:
>
> - Unifying DataFrame and Dataset: in Scala and Java, DataFrame and Dataset
>   have been unified, i.e. DataFrame is just a type alias for Dataset of Row.
>   In Python and R, given the lack of type safety, DataFrame is the main
>   programming interface.
> - *SparkSession: a new entry point that replaces the old SQLContext and
>   HiveContext for the DataFrame and Dataset APIs. SQLContext and HiveContext
>   are kept for backward compatibility.*
> - A new, streamlined configuration API for SparkSession
> - Simpler, more performant accumulator API
> - A new, improved Aggregator API for typed aggregation in Datasets
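
A quick note on that last item: a typed Aggregator looks roughly like the sketch below (the Purchase case class and its fields are made up for illustration).

import org.apache.spark.sql.expressions.Aggregator
import org.apache.spark.sql.{Encoder, Encoders}

// Hypothetical input record for the example.
case class Purchase(user: String, amount: Double)

// Typed aggregation: input Purchase, buffer Double, output Double.
object SumAmount extends Aggregator[Purchase, Double, Double] {
  def zero: Double = 0.0
  def reduce(buf: Double, p: Purchase): Double = buf + p.amount
  def merge(b1: Double, b2: Double): Double = b1 + b2
  def finish(buf: Double): Double = buf
  def bufferEncoder: Encoder[Double] = Encoders.scalaDouble
  def outputEncoder: Encoder[Double] = Encoders.scalaDouble
}

// Usage, given a SparkSession `spark` and `import spark.implicits._`:
//   val ds = spark.read.json("purchases.json").as[Purchase]
//   ds.select(SumAmount.toColumn).show()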
>
>
> thanks
> Pradeep
>
>
>
