That is probably because Livy supports Spark 1.6, and `SparkSession`
does not exist in that version, so the code wouldn't compile
otherwise.

If you assign it to an incompatible type, though, you'll get a
ClassCastException at run time.
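To make the erasure behavior concrete, here is a minimal standalone sketch (no Livy or Spark on the classpath; the `JobContext`/`session()` names here just mirror Livy's `<E> E sparkSession()` shape): the unchecked generic return compiles for any assignment, and the failure only shows up as a ClassCastException when the compiler-inserted cast runs.

```java
// Standalone sketch of why a generic return like <E> E sparkSession()
// compiles for any target type but can fail at run time.
public class ErasureDemo {
    interface JobContext {
        // Mirrors Livy's signature: the caller picks E, but after
        // erasure no check happens inside the method itself.
        <E> E session();
    }

    public static void main(String[] args) {
        JobContext ctx = new JobContext() {
            @SuppressWarnings("unchecked")
            public <E> E session() {
                // Stand-in for the real session object; a String here.
                return (E) "spark-session";
            }
        };

        String ok = ctx.session(); // fine: the object really is a String
        System.out.println("got: " + ok);

        try {
            Integer bad = ctx.session(); // compiles, but...
            System.out.println(bad);
        } catch (ClassCastException e) {
            // ...the implicit checkcast inserted at the assignment fails.
            System.out.println("ClassCastException at run time");
        }
    }
}
```

So `Integer k = jobContext.sparkSession()` is accepted by javac, and the error surfaces only when the JVM executes the hidden cast.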

On Sat, Dec 2, 2017 at 10:11 AM, kant kodali <[email protected]> wrote:
> Hi All,
>
> Why doesn't jobContext.sparkSession() return a SparkSession object
> instead of a parameterized type?
>
> jobContext.sc(); // returns JavaSparkContext, so this is good
> jobContext.sqlContext(); // returns SQLContext, so this is good
> jobContext.streamingContext(); // returns StreamingContext, so this is good
> jobContext.sparkSession(); // returns an arbitrary parameterized type. Why?
>
> Since it returns a parameterized type, I can assign it to anything I
> like, which doesn't make any sense:
>
> Integer k = jobContext.sparkSession();
>
> or
>
> Long l = jobContext.sparkSession();
>
>
>
> Below is the Livy interface:
>
>
> package org.apache.livy;
>
> import java.io.File;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.SQLContext;
> import org.apache.spark.sql.hive.HiveContext;
> import org.apache.spark.streaming.api.java.JavaStreamingContext;
>
> public interface JobContext {
>     JavaSparkContext sc();
>
>     SQLContext sqlctx();
>
>     HiveContext hivectx();
>
>     JavaStreamingContext streamingctx();
>
>     void createStreamingContext(long var1);
>
>     void stopStreamingCtx();
>
>     File getLocalTmpDir();
>
>     <E> E sparkSession() throws Exception;
> }
>
>



-- 
Marcelo