[ https://issues.apache.org/jira/browse/SPARK-22560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ran Mingxuan reopened SPARK-22560:
----------------------------------

My method is still not working. I need support.
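For reference, a minimal sketch of how I check whether the session actually picked up Hive support (this is my own diagnostic, assuming Spark 2.x; reading {{spark.sql.catalogImplementation}} through spark.conf() is an assumption, not something confirmed in this ticket):

{code:java}
// Diagnostic sketch (assumption: Spark 2.x, and that the
// spark.sql.catalogImplementation setting reflects the catalog actually in use).
SparkSession spark = SparkSession.builder()
        .enableHiveSupport()
        .getOrCreate();

// Expected to print "hive" when Hive support took effect, "in-memory" otherwise.
System.out.println(spark.conf().get("spark.sql.catalogImplementation", "in-memory"));
{code}

If this prints in-memory even though enableHiveSupport() was called, the builder has most likely reused a SparkContext that was created earlier, which matches the behaviour described in the quoted report below.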

> Must create spark session directly to connect to hive
> -----------------------------------------------------
>
>                 Key: SPARK-22560
>                 URL: https://issues.apache.org/jira/browse/SPARK-22560
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API, SQL
>    Affects Versions: 2.1.0, 2.2.0
>            Reporter: Ran Mingxuan
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> In a Java project I have to use both JavaSparkContext and SparkSession, and
> I find that the order in which they are created affects the Hive connection.
> I built a Spark job like the one below:
> {code:java}
> // wrong order: the JavaSparkContext is created before the SparkSession,
> // so enableHiveSupport() does not take effect
> public static void main(String[] args)
> {
>     SparkConf sparkConf = new SparkConf().setAppName("testApp");
>     JavaSparkContext sc = new JavaSparkContext(sparkConf);
>     SparkSession spark = SparkSession.builder()
>             .sparkContext(sc.sc())
>             .enableHiveSupport()
>             .getOrCreate();
>     spark.sql("show databases").show(); // cannot find the Hive metastore
> }
> {code}
> With this code the Spark job is unable to find the Hive metastore, even
> though it discovers the correct warehouse directory.
> I have to use code like the following to make things work:
> {code:java}
> // correct order: build the Hive-enabled SparkSession first,
> // then derive the JavaSparkContext from its SparkContext
> public static void main(String[] args)
> {
>     SparkConf sparkConf = new SparkConf().setAppName("testApp");
>     SparkSession spark = SparkSession.builder()
>             .config(sparkConf)
>             .enableHiveSupport()
>             .getOrCreate();
>     SparkContext sparkContext = spark.sparkContext();
>     JavaSparkContext sc = JavaSparkContext.fromSparkContext(sparkContext);
>     spark.sql("show databases").show(); // finds the Hive metastore
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
