Abhijit Bhole created TOREE-353:
-----------------------------------

             Summary: Spark defaulting to internal metastore instead of hive
                 Key: TOREE-353
                 URL: https://issues.apache.org/jira/browse/TOREE-353
             Project: TOREE
          Issue Type: Bug
    Affects Versions: 0.1.0
         Environment: Vanilla Hadoop 2.7.3, Hive 2.1.0, Spark 2.0.2 with Hadoop 
2.7 binary distribution, Toree built from the git repo and then installed using 
pip from the toree-pip folder
            Reporter: Abhijit Bhole


I cannot see Hive tables inside the Toree kernel. 

For testing I created a table in Hive and then ran the query "show tables 
from default" in Spark. Some more details -

- I have added hive-site.xml to the SPARK_HOME/conf directory. 

- The pyspark and spark-submit scripts work fine in both yarn and local mode 
without any additional arguments. The pyspark shell also works fine and returns 
the table name. 

- On the driver UI page (on port 4040) I can see 
spark.sql.catalogImplementation set to hive when running the pyspark shell, but 
the variable is missing when running the Toree kernel.

- I have also tried passing --driver-class-path hive-site.xml --files 
/opt/hive-2.1.0/conf/hive-site.xml, without success.
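
As a possible workaround (a sketch, not a confirmed fix - the exact flags and
paths below are assumptions based on my setup), one might try forcing the
catalog implementation and shipping hive-site.xml explicitly when installing
the kernel:

```shell
# Reinstall the Toree kernel, passing Spark options through to spark-submit.
# --spark_opts is forwarded to the underlying spark-submit invocation;
# spark.sql.catalogImplementation=hive is the setting that is missing in the
# driver UI when the kernel runs. Paths are from my environment.
jupyter toree install \
  --spark_home=$SPARK_HOME \
  --spark_opts='--conf spark.sql.catalogImplementation=hive --files /opt/hive-2.1.0/conf/hive-site.xml'
```

I have not verified that this restores the Hive metastore, since the setting
appears to be dropped somewhere before reaching the driver.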

I have tried everything I could think of, hence this bug report. If I have 
missed something, please help.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
