If you want to keep using the RDD API, then you still need to create a
SparkContext first.
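
For example, a minimal sketch in Scala (the app name and local[*] master are
placeholder assumptions):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("rdd-example").setMaster("local[*]")
val sc = new SparkContext(conf)

// Classic RDD work goes through the SparkContext.
val rdd = sc.parallelize(1 to 10)
println(rdd.sum())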

If you want to use just the Dataset/DataFrame/SQL API, then you can directly
create a SparkSession. The SparkContext is generally hidden: it is created
internally and held within the SparkSession, and anytime you need it you can
get it from SparkSession.sparkContext. While a SparkConf is accepted when
creating a SparkSession, the formal way to set/get configurations for a
SparkSession is through SparkSession.conf.set()/get().
> On Jul 27, 2016, at 21:02, Jestin Ma <jestinwith.a...@gmail.com> wrote:
> 
> I know that SparkSession is replacing the SQL and Hive contexts, but what 
> about SparkConf and SparkContext? Are those still relevant in our programs?
> 
> Thank you!
> Jestin
