Hi All,
Why doesn't jobContext.sparkSession() return a SparkSession object?
Instead it returns an unbounded parameterized type.
jobContext.sc();            // returns JavaSparkContext, so this is good
jobContext.sqlctx();        // returns SQLContext, so this is good
jobContext.streamingctx();  // returns JavaStreamingContext, so this is good
jobContext.sparkSession();  // returns any parameterized type. Why?
Since it returns an unbounded type parameter, I can assign the result to
anything I like, which doesn't make any sense:
Integer k = jobContext.sparkSession();
or
Long l = jobContext.sparkSession();
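To illustrate why this is hazardous, here is a minimal, self-contained sketch (no Livy or Spark dependency; `GenericReturnDemo.sparkSession()` is a hypothetical stand-in for the Livy method). Because the type parameter is unbounded, the compiler infers whatever type the caller asks for and inserts a hidden cast at the call site, so a mismatched assignment compiles cleanly and only fails at runtime:

```java
public class GenericReturnDemo {

    // Hypothetical stand-in for JobContext.sparkSession(): an unbounded
    // type parameter lets the caller choose any return type. The cast
    // below is unchecked and erased, so nothing is verified here.
    @SuppressWarnings("unchecked")
    public static <E> E sparkSession() {
        return (E) "pretend this is a SparkSession";
    }

    public static void main(String[] args) {
        // Inference picks E = String; the hidden checkcast succeeds.
        String ok = sparkSession();
        System.out.println("String assignment works: " + ok);

        try {
            // Inference picks E = Integer; this also COMPILES, but the
            // checkcast the compiler inserted at this call site fails.
            Integer bad = sparkSession();
            System.out.println("unreachable: " + bad);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException at runtime");
        }
    }
}
```

In other words, the signature `<E> E sparkSession()` pushes the type error from compile time to runtime, which is exactly the behavior the assignments above demonstrate.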
Below is the Livy interface:
package org.apache.livy;

import java.io.File;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.hive.HiveContext;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public interface JobContext {
    JavaSparkContext sc();
    SQLContext sqlctx();
    HiveContext hivectx();
    JavaStreamingContext streamingctx();
    void createStreamingContext(long var1);
    void stopStreamingCtx();
    File getLocalTmpDir();
    <E> E sparkSession() throws Exception;
}