Re: Hive_context

2016-05-24 Thread Ajay Chander
Hi Arun,

Thanks for your time. I was able to connect through the JDBC Java client, but I
am not able to connect from my Spark application. Do you think I missed a
configuration step within the code? Somehow my application is not picking
up hive-site.xml from my machine; I put it on the classpath
under ${SPARK_HOME}/conf/. It would be really helpful if anyone has any
sort of example in either Java or Scala. Thank you.
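[Editor's note: if hive-site.xml is not being picked up from the classpath, one workaround is to point the HiveContext at the metastore programmatically. This is a sketch only, not something confirmed in this thread; the thrift URI is a placeholder for the cluster's actual metastore host (9083 is the usual default metastore port):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class MetastoreConfigSketch {

    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SQL_Test").setMaster("local");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        HiveContext hiveContext = new HiveContext(jsc.sc());
        // Point directly at the remote metastore instead of relying on
        // hive-site.xml being found on the classpath. Replace the host
        // and port with your cluster's metastore service endpoint.
        hiveContext.setConf("hive.metastore.uris", "thrift://metastore-host:9083");

        hiveContext.sql("show tables").show();
    }
}

If this version lists the remote tables, the problem is classpath resolution of hive-site.xml rather than connectivity.]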



Re: Hive_context

2016-05-23 Thread Arun Natva
Can you try a Hive JDBC Java client from Eclipse and query a Hive table
successfully?

That way we can narrow down where the issue is.
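[Editor's note: a minimal HiveServer2 JDBC client along the lines suggested here might look like the sketch below. It assumes HiveServer2 is running and hive-jdbc is on the classpath; the host, port, database, and credentials are placeholders (10000 is the usual HiveServer2 port):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {

    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC driver class.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Replace host/port/database/user with your cluster's values.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver2-host:10000/default", "user", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("show tables")) {
            while (rs.next()) {
                // First column of "show tables" is the table name.
                System.out.println(rs.getString(1));
            }
        }
    }
}
]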


Sent from my iPhone



Re: Hive_context

2016-05-23 Thread Ajay Chander
I downloaded the Spark 1.5 utilities and exported SPARK_HOME pointing to
it. I copied all the cluster configuration files (hive-site.xml,
hdfs-site.xml, etc.) into ${SPARK_HOME}/conf/. My application
looks like the one below:


import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class SparkSqlTest {

    public static void main(String[] args) {
        SparkConf sc = new SparkConf().setAppName("SQL_Test").setMaster("local");
        JavaSparkContext jsc = new JavaSparkContext(sc);

        // HiveContext looks for hive-site.xml on the classpath; if it is
        // missing, Spark falls back to a local embedded metastore.
        HiveContext hiveContext = new HiveContext(jsc.sc());

        DataFrame sampleDataFrame = hiveContext.sql("show tables");
        sampleDataFrame.show();
    }
}


I am expecting my application to return all the tables from the default
database, but somehow it returns an empty list. I am just wondering if I need
to add anything to my code to point it to the Hive metastore. Thanks for your
time. Any pointers are appreciated.
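[Editor's note: one likely cause, offered as an assumption rather than a diagnosis confirmed in the thread: when hive-site.xml is not found, HiveContext silently creates a local embedded Derby metastore, which contains no tables, so "show tables" comes back empty rather than failing. The usual way to point Spark at a remote metastore is hive.metastore.uris in hive-site.xml; a minimal fragment, with a placeholder host:

<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>

This file must end up on the application's runtime classpath, e.g. under ${SPARK_HOME}/conf/ or the project's resources directory in Eclipse.]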


Regards,

Aj




Hive_context

2016-05-23 Thread Ajay Chander
Hi Everyone,

I am building a Java Spark application in the Eclipse IDE. From my application
I want to use HiveContext to read tables from the remote Hive (Hadoop)
cluster. On my machine I have exported HADOOP_CONF_DIR =
${HOME}/hadoop/conf/. This path has all the remote cluster configuration files,
like hive-site.xml and hdfs-site.xml. Somehow I am not able to communicate
with the remote cluster from my app. Is there any additional configuration work
that I am supposed to do to get it to work? I specified master as 'local' in
the code. Thank you.

Regards,
Aj