Hi,

I tried setting the metastore and metastore_db locations in conf/hive-site.xml
to the directories created in the Spark bin folder (they were created when I
ran the Spark shell and used LocalHiveContext), but it still doesn't work.
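
For reference, the relevant part of my hive-site.xml looks roughly like this
(the paths are only placeholders for the actual directories created under the
Spark bin folder):

    <configuration>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <!-- placeholder: path to the Derby metastore database directory -->
        <value>jdbc:derby:;databaseName=/path/to/metastore_db;create=true</value>
      </property>
      <property>
        <name>hive.metastore.warehouse.dir</name>
        <!-- placeholder: path to the warehouse directory -->
        <value>/path/to/warehouse</value>
      </property>
    </configuration>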

Do I need to save my RDD as a table through the hive context to make this work?
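
In other words, is something along these lines required before the table shows
up for the CLI? (A rough sketch only, against the Spark 1.1 SchemaRDD API; the
JSON records and table names are made up.)

    import org.apache.spark.sql.hive.LocalHiveContext

    val hiveContext = new LocalHiveContext(sc)

    // Made-up JSON records standing in for the data pulled from Cassandra
    val json = sc.parallelize(Seq("""{"id": 1, "tags": ["a", "b"]}"""))
    val schemaRdd = hiveContext.jsonRDD(json)

    // Visible only inside this shell session, not to the spark-sql CLI
    schemaRdd.registerTempTable("test_tbl_temp")

    // Writes into the Hive metastore/warehouse, so a spark-sql CLI pointed at
    // the same hive-site.xml should be able to query it
    schemaRdd.saveAsTable("test_tbl")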

Regards,
Gaurav

On Mon, Sep 22, 2014 at 6:30 PM, Yin Huai <huaiyin....@gmail.com> wrote:

> Hi Gaurav,
>
> Can you put hive-site.xml in conf/ and try again?
>
> Thanks,
>
> Yin
>
> On Mon, Sep 22, 2014 at 4:02 PM, gtinside <gtins...@gmail.com> wrote:
>
>> Hi,
>>
>> I have been using the Spark shell to execute all my SQL queries. I am connecting to
>> Cassandra, converting the data to JSON, and then running queries on it. I
>> am using HiveContext (and not SQLContext) because of the "explode"
>> functionality in it.
>>
>> I want to see how I can use the Spark SQL CLI to run queries directly on a
>> saved table. I see metastore and metastore_db getting created in the
>> Spark bin directory (my hive context is LocalHiveContext). I tried executing
>> queries in the spark-sql CLI after putting in a hive-site.xml with the
>> metastore and metastore_db directories set to the same ones in the Spark bin
>> directory, but it doesn't seem to be working. I am getting
>> "org.apache.hadoop.hive.ql.metadata.HiveException:
>> Unable to fetch table test_tbl".
>>
>> Is this possible?
>>
>> Regards,
>> Gaurav
>>
>>
>>
>
