Unsubscribe

2023-05-01 Thread sandeep vura
-- Sandeep V

Re: adding jars - hive on spark cdh 5.4.3

2016-01-10 Thread sandeep vura
Upgrade to CDH 5.5 for Spark; it should work. On Sat, Jan 9, 2016 at 12:17 AM, Ophir Etzion wrote: > It didn't work, assuming I did the right thing. > In the properties you could see > > {"key":"hive.aux.jars.path","value":"file:///data/loko/foursquare.web-hiverc/current/hadoop-hive-serde.jar,fi…
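For reference, auxiliary jars for Hive are registered through the hive.aux.jars.path property shown in that snippet; a minimal hive-site.xml sketch (the jar path below is an illustrative placeholder, not taken from the thread):

    <property>
      <name>hive.aux.jars.path</name>
      <!-- comma-separated list of jar URIs; path is a placeholder -->
      <value>file:///opt/hive/aux/hadoop-hive-serde.jar</value>
    </property>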

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
Thanks a lot, Akhil. On Mon, Jul 6, 2015 at 12:57 PM, sandeep vura wrote: > It works!!! > > On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura wrote: > >> OK, let me try. >> >> On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das wrote: >>…

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
It works!!! On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura wrote: > OK, let me try. > > On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das wrote: > >> It's complaining about a missing JDBC driver. Add it to your driver classpath like: >> >> ./bin/spark-sql --driver-class-p…
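The command in that reply is cut off; a plausible full invocation, assuming a MySQL-backed Hive metastore and a placeholder jar path:

    ./bin/spark-sql --driver-class-path /path/to/mysql-connector-java-5.1.27.jar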

Re: Unable to start spark-sql

2015-07-06 Thread sandeep vura
…Regards > > On Mon, Jul 6, 2015 at 11:42 AM, sandeep vura wrote: > >> Hi Sparkers, >> >> I am unable to start the spark-sql service; please check the error >> mentioned below. >> >> Exception in thread "main"…

Unable to start spark-sql

2015-07-05 Thread sandeep vura
Hi Sparkers, I am unable to start the spark-sql service; please check the error mentioned below. Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient at org.apache.hadoop.hive.ql.session…

How to run spark programs in eclipse like mapreduce

2015-04-19 Thread sandeep vura
Hi Sparkers, I have written code in Python in Eclipse; now that code should execute on a Spark cluster, the way MapReduce jobs run on a Hadoop cluster. Can anyone please help me with instructions? Regards, Sandeep.v
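One common route, sketched here under the assumption of a standalone Spark 1.x cluster: export the Python script from Eclipse and submit it with spark-submit (the master URL and script name are placeholders):

    ./bin/spark-submit --master spark://master-host:7077 my_job.py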

Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
…Wed, Mar 25, 2015 at 5:34 PM, sandeep vura wrote: > >> The build failed with the following errors. >> >> I executed the following command: >> >> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package >> >> [INF…

Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
…-Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package > > Thanks > Best Regards > > On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura wrote: > >> Where do I export MAVEN_OPTS: in spark-env.sh or hadoop-env.sh? >> >> I am running the be…
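A sketch of the build the thread is converging on, using the Hadoop 2.2 profile documented in building-spark.html; the exact version string and memory settings here are assumptions, and MAVEN_OPTS is exported in the shell that runs Maven rather than in spark-env.sh or hadoop-env.sh:

    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M"
    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package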

Re: Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
…Saisai Shao wrote: > It looks like you have to build Spark against the matching Hadoop version, otherwise > you will hit the exception mentioned. You could follow this doc: > http://spark.apache.org/docs/latest/building-spark.html > > 2015-03-25 15:22 GMT+08:00 sandeep vura: > >>…

Server IPC version 9 cannot communicate with client version 4

2015-03-25 Thread sandeep vura
Hi Sparkers, I am trying to load data in Spark with the following command: sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src"). Getting the exception below: Server IPC version 9 cannot communicate with client version 4. Note: I am using Hadoop 2.2 ve…

Re: Errors in SPARK

2015-03-24 Thread sandeep vura
…NOT EXISTS src (key INT, value STRING)") java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient Cheers, Sandeep.v On Wed, Mar 25, 2015 at 11:10 AM, sandeep vura wrote: > No, I am just running ./spark-shell co…

Re: Errors in SPARK

2015-03-24 Thread sandeep vura
…run my spark-shell instance in standalone mode, I use: > ./spark-shell --master spark://servername:7077 --driver-class-path /lib/mysql-connector-java-5.1.27.jar > > On Fri, Mar 13, 2015 at 8:31 AM sandeep vura wrote: > >> Hi Sparkers, >> >> Can any…

Re: About the env of Spark1.2

2015-03-21 Thread sandeep vura
Make sure, if you are using 127.0.0.1, to check /etc/hosts and either comment out or create the 127.0.1.1 entry, mapping it to the hostname alongside localhost. On Sat, Mar 21, 2015 at 9:57 AM, Ted Yu wrote: > bq. Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or service not known > > Can you check your DNS? > > Che…
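A sketch of the /etc/hosts entries being suggested; the hostname is taken from the quoted UnknownHostException and will differ on other machines. The point is that the machine's own hostname must resolve to a reachable address:

    127.0.0.1   localhost
    127.0.1.1   dhcp-10-35-14-100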

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
Hi, please find attached my Spark configuration files. Regards, Sandeep.v On Mon, Mar 16, 2015 at 12:58 PM, sandeep vura wrote: > In which location exactly should I specify the classpath? > > Thanks, > > On Mon, Mar 16, 2015 at 12:52 PM, Cheng, Hao wrote…

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-16 Thread sandeep vura
In which location exactly should I specify the classpath? Thanks, On Mon, Mar 16, 2015 at 12:52 PM, Cheng, Hao wrote: > It doesn't take effect if you just put jar files under the lib-managed/jars folder; you need to put them on the classpath explicitly. > > *F…
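One way to put a jar on the classpath explicitly, sketched here as an assumption rather than the fix confirmed in this thread, is through conf/spark-defaults.conf (the jar path is a placeholder):

    spark.driver.extraClassPath    /path/to/mysql-connector-java-5.1.27.jar
    spark.executor.extraClassPath  /path/to/mysql-connector-java-5.1.27.jar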

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
…> From: fightf...@163.com [mailto:fightf...@163.com] > Sent: Monday, March 16, 2015 2:04 PM > To: sandeep vura; Ted Yu > Cc: user > Subject: Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient > > Hi, Sandeep >…

Re: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
…0? > > Thanks, > Sun. > > -- > fightf...@163.com > > From: sandeep vura > Date: 2015-03-16 14:13 > To: Ted Yu > CC: user@spark.apache.org > Subject: Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMeta…

Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
Hi Ted, did you find any solution? Thanks, Sandeep. On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura wrote: > Hi Ted, > > I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration files attached below. > > ---

Re: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
…Such as: > the version of Spark you're using > the command line > > Thanks > > On Mar 15, 2015, at 9:51 PM, sandeep vura wrote: > > Hi Sparkers, > > I couldn't run spark-sql on Spark. Please f…

Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

2015-03-15 Thread sandeep vura
Hi Sparkers, I couldn't run spark-sql on Spark. Please find the following error: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient. Regards, Sandeep.v
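This error usually means Spark SQL cannot reach the Hive metastore described in hive-site.xml, or cannot load the metastore's JDBC driver. A minimal spark/conf/hive-site.xml sketch for a MySQL-backed metastore; the host, user, and password values are placeholders, not values from this thread, and the MySQL connector jar must also be on the driver classpath:

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://metastore-host:3306/metastore?createDatabaseIfNotExist=true</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive-password</value>
    </property>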

Errors in SPARK

2015-03-13 Thread sandeep vura
Hi Sparkers, can anyone please check the error below and suggest a solution? I am using Hive 0.13 and Spark 1.2.1. Step 1: I installed Hive 0.13 with a local metastore (MySQL database). Step 2: Hive is running without any errors and is able to create tables and load data into Hive t…

Re: Spark-SQL and Hive - is Hive required?

2015-03-06 Thread sandeep vura
Hi, to create a Hive table do I need to add hive-site.xml to the spark/conf directory? On Fri, Mar 6, 2015 at 11:12 PM, Michael Armbrust wrote: > It's not required, but even if you don't have Hive installed you probably still want to use the HiveContext. From earlier in that doc: > > In addit…
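A minimal spark-shell sketch of what that suggestion looks like in the Spark 1.x API; without a hive-site.xml, HiveContext falls back to a local metastore, and the table definition here is purely illustrative:

    import org.apache.spark.sql.hive.HiveContext

    // sc is the SparkContext provided by spark-shell
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")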

Does anyone integrate HBASE on Spark

2015-03-04 Thread sandeep vura
Hi Sparkers, how do I integrate HBase with Spark? Replies appreciated! Regards, Sandeep.v

Re: Errors in spark

2015-02-27 Thread sandeep vura
…you're pointing to an external metastore; in my experience I've also had to copy core-site.xml into conf in order to specify this property: fs.defaultFS > > On Fri, Feb 27, 2015 at 10:39 AM, sandeep vura wrote: > >> Hi Sparkers, >> >> I am using hive versi…
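A sketch of the core-site.xml fragment being referred to, with a placeholder NameNode address:

    <property>
      <name>fs.defaultFS</name>
      <!-- placeholder; point this at your actual NameNode -->
      <value>hdfs://namenode-host:8020</value>
    </property>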

Errors in spark

2015-02-27 Thread sandeep vura
Hi Sparkers, I am using Hive 0.13, copied hive-site.xml into spark/conf, and am using the default Derby local metastore. While creating a table in the Spark shell I get the following error; can anyone please take a look and suggest a solution? sqlContext.sql("CREATE TABLE IF NOT EXISTS sandee…

Re: Unable to run hive queries inside spark

2015-02-27 Thread sandeep vura
Hi Kundan, sorry, I am also facing a similar issue today. How did you resolve it? Regards, Sandeep.v On Thu, Feb 26, 2015 at 2:25 AM, Michael Armbrust wrote: > It looks like that is getting interpreted as a local path. Are you missing a core-site.xml file to configure HDFS? >…

Re: Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
…in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver. > > Cheng > > On 2/26/15 8:03 PM, sandeep vura wrote: > Hi Sparkers, > > I am trying to create a Hive table in Spark SQL but couldn't create it. Belo…

Re: Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
…0 or Hive 0.13.1. > > On 2/27/15 12:12 AM, sandeep vura wrote: > > Hi Cheng, > > Thanks, the above issue has been resolved. I have configured a remote metastore rather than a local metastore in Hive. > > While creating a table in Spark SQL another error is showing up on…
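For context, pointing Spark at a remote Hive metastore is normally done with the hive.metastore.uris property in hive-site.xml; a sketch with a placeholder host:

    <property>
      <name>hive.metastore.uris</name>
      <!-- placeholder; address of the running Hive metastore service -->
      <value>thrift://metastore-host:9083</value>
    </property>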

Creating hive table on spark ((ERROR))

2015-02-26 Thread sandeep vura
Hi Sparkers, I am trying to create a Hive table in Spark SQL but couldn't create it. Below are the errors generated so far. java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient at org.apac…

Re: How to integrate HBASE on Spark

2015-02-23 Thread sandeep vura
…at 10:53 PM, Deepak Vohra wrote: > Or, use the SparkOnHBase lab. > http://blog.cloudera.com/blog/2014/12/new-in-cloudera-labs-sparkonhbase/ > > -- > From: Ted Yu > To: Akhil Das > Cc: sandeep vura; "user@spark.apache.org" <…

Re: How to integrate HBASE on Spark

2015-02-23 Thread sandeep vura
…cluster. If you install it on the Spark cluster itself, then HBase might take up a few CPU cycles and there's a chance for the job to lag. > > Thanks > Best Regards > > On Mon, Feb 23, 2015 at 12:48 PM, sandeep vura wrote: > >> Hi >> >> I had…

How to integrate HBASE on Spark

2015-02-22 Thread sandeep vura
Hi, I have installed Spark on a 3-node cluster and the Spark services are up and running, but I want to integrate HBase with Spark. Do I need to install HBase on the Hadoop cluster or on the Spark cluster? Please let me know as soon as possible. Regards, Sandeep.v
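Besides the SparkOnHBase lab linked above, a plain Spark 1.x approach is to read an HBase table through TableInputFormat; a minimal spark-shell sketch, assuming the HBase client jars and hbase-site.xml are on the classpath and using a hypothetical table name:

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.Result
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat

    val hbaseConf = HBaseConfiguration.create()              // picks up hbase-site.xml from the classpath
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "my_table")   // hypothetical table name

    // Each record is (row key, Result); sc is the spark-shell SparkContext
    val hbaseRDD = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat],
      classOf[ImmutableBytesWritable], classOf[Result])
    println("rows: " + hbaseRDD.count())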