--
Sandeep V
Upgrade to CDH 5.5 for Spark; it should work.
On Sat, Jan 9, 2016 at 12:17 AM, Ophir Etzion wrote:
> It didn't work, assuming I did the right thing.
> In the properties you can see
>
> {"key":"hive.aux.jars.path","value":"file:///data/loko/foursquare.web-hiverc/current/hadoop-hive-serde.jar,fi
Thanks a lot, Akhil
On Mon, Jul 6, 2015 at 12:57 PM, sandeep vura wrote:
> It Works !!!
>
> On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura
> wrote:
>
>> OK, let me try
>>
>>
>> On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das
>> wrote:
>>
It Works !!!
On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura wrote:
> oK Let me try
>
>
> On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das
> wrote:
>
>> It's complaining about a missing JDBC driver. Add it to your driver classpath, like:
>>
>> ./bin/spark-sql --driver-class-p
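For reference, a complete invocation of that shape might look like the following; the connector jar path is only an example, so point it at wherever your JDBC driver actually lives:

```shell
# Sketch: put the JDBC driver for the metastore database on the
# driver classpath when launching spark-sql (example jar path).
./bin/spark-sql --driver-class-path /usr/share/java/mysql-connector-java.jar
```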
Regards
>
> On Mon, Jul 6, 2015 at 11:42 AM, sandeep vura
> wrote:
>
>> Hi Sparkers,
>>
>> I am unable to start the spark-sql service; please check the error
>> mentioned below.
>>
>> Exception in thread "main"
Hi Sparkers,
I am unable to start the spark-sql service; please check the error mentioned
below.
Exception in thread "main" java.lang.RuntimeException:
java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at
org.apache.hadoop.hive.ql.session
Hi Sparkers,
I have written code in Python in Eclipse; now that code should execute on a
Spark cluster, like MapReduce jobs on a Hadoop cluster. Can anyone please help
me with instructions?
Regards,
Sandeep.v
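For what it's worth, the standard way to run a Python script on a Spark cluster is `spark-submit`; a minimal sketch, assuming a standalone master at `spark://master:7077` and a script named `my_job.py` (both hypothetical names):

```shell
# Submit a locally developed Python script to a Spark cluster.
# Replace the master URL and script name with your own.
./bin/spark-submit --master spark://master:7077 my_job.py
```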
On Wed, Mar 25, 2015 at 5:34 PM, sandeep vura
> wrote:
>
>> Build failed with following errors.
>>
>> I have executed the below following command.
>>
>> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean
>> package
>>
>>
>> [INF
-Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package
>
>
>
>
> Thanks
> Best Regards
>
> On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura
> wrote:
>
>> Where do I export MAVEN_OPTS: in spark-env.sh or hadoop-env.sh?
>>
>> I am running the be
, Saisai Shao
wrote:
> Looks like you have to build Spark against the matching Hadoop version; otherwise
> you will hit the exception mentioned. You could follow this doc:
> http://spark.apache.org/docs/latest/building-spark.html
>
> 2015-03-25 15:22 GMT+08:00 sandeep vura :
>
>>
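Following that doc, a build against a specific Hadoop version looks roughly like this; the version shown is only an example, and it must match what the cluster runs:

```shell
# Build Spark against the same Hadoop version the cluster runs,
# so client and server RPC versions agree.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```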
Hi Sparkers,
I am trying to load data in spark with the following command
sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src");
Getting the exception below:
Server IPC version 9 cannot communicate with client version 4
Note: I am using Hadoop 2.2 ve
NOT EXISTS src (key INT, value
STRING)")
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Cheers,
Sandeep.v
On Wed, Mar 25, 2015 at 11:10 AM, sandeep vura
wrote:
> No I am just running ./spark-shell co
run my spark-shell instance in standalone mode, I use:
> ./spark-shell --master spark://servername:7077 --driver-class-path
> /lib/mysql-connector-java-5.1.27.jar
>
>
>
> On Fri, Mar 13, 2015 at 8:31 AM sandeep vura
> wrote:
>
>> Hi Sparkers,
>>
>> Can any
If you are using 127.0.0.1, please check /etc/hosts and comment out the
127.0.1.1 entry, or create one mapped to localhost.
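A sketch of the /etc/hosts layout being suggested (hostname and address are hypothetical):

```
127.0.0.1    localhost
# 127.0.1.1  myhost    <- comment this entry out, or...
192.168.1.10 myhost    # ...map the hostname to a real address
```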
On Sat, Mar 21, 2015 at 9:57 AM, Ted Yu wrote:
> bq. Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or
> service not known
>
> Can you check your DNS ?
>
> Che
Hi,
Please find the attached is my spark configuration files.
Regards,
Sandeep.v
On Mon, Mar 16, 2015 at 12:58 PM, sandeep vura
wrote:
> In which location exactly do I need to specify the classpath?
>
> Thanks,
>
>
> On Mon, Mar 16, 2015 at 12:52 PM, Cheng, Hao wrote
In which location exactly do I need to specify the classpath?
Thanks,
On Mon, Mar 16, 2015 at 12:52 PM, Cheng, Hao wrote:
> It doesn’t take effect if you just put the jar files under the
> lib-managed/jars folder; you need to put them on the classpath explicitly.
>
>
>
> *F
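Concretely, two common ways to put a jar on the classpath explicitly (the jar paths are examples only):

```shell
# Option 1: pass the jar at launch time.
./bin/spark-sql --driver-class-path /path/to/mysql-connector-java-5.1.27.jar

# Option 2: record it once in conf/spark-defaults.conf.
echo "spark.driver.extraClassPath /path/to/mysql-connector-java-5.1.27.jar" \
  >> conf/spark-defaults.conf
```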
>
> *From:* fightf...@163.com [mailto:fightf...@163.com]
> *Sent:* Monday, March 16, 2015 2:04 PM
> *To:* sandeep vura; Ted Yu
> *Cc:* user
> *Subject:* Re: Re: Unable to instantiate
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>
>
>
> Hi, Sandeep
>
0 ?
>
> Thanks,
> Sun.
>
> --
> fightf...@163.com
>
>
> *From:* sandeep vura
> *Date:* 2015-03-16 14:13
> *To:* Ted Yu
> *CC:* user@spark.apache.org
> *Subject:* Re: Unable to instantiate
> org.apache.hadoop.hive.metastore.HiveMeta
Hi Ted,
Did you find any solution?
Thanks
Sandeep
On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura
wrote:
> Hi Ted,
>
> I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration
> files attached below.
>
> ---
Such as:
> Version of Spark you're using
> Command line
>
> Thanks
>
>
>
> > On Mar 15, 2015, at 9:51 PM, sandeep vura wrote:
> >
> > Hi Sparkers,
> >
> >
> >
> > I couldn't run spark-sql on Spark. Please f
Hi Sparkers,
I couldn't run spark-sql on Spark. Please find the following error:
Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Regards,
Sandeep.v
Hi Sparkers,
Can anyone please check the error below and give a solution for this? I am
using Hive version 0.13 and Spark 1.2.1.
Step 1 : I have installed hive 0.13 with local metastore (mySQL database)
Step 2: Hive is running without any errors and able to create tables and
loading data in hive t
Hi,
To create a Hive table, do I need to add hive-site.xml to the spark/conf
directory?
On Fri, Mar 6, 2015 at 11:12 PM, Michael Armbrust
wrote:
> It's not required, but even if you don't have Hive installed you probably
> still want to use the HiveContext. From earlier in that doc:
>
> In addit
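If you do have an existing Hive installation and want Spark to use its metastore, the usual step is simply copying Hive's config into Spark's conf directory; the paths here are assumptions:

```shell
# Let Spark's HiveContext pick up the existing Hive metastore settings.
cp /etc/hive/conf/hive-site.xml "$SPARK_HOME/conf/"
```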
Hi Sparkers,
How do I integrate HBase with Spark?
Replies appreciated!
Regards,
Sandeep.v
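One common starting point (not an official recipe): put HBase's client jars and configuration on the Spark classpath, then read tables through `TableInputFormat` with `newAPIHadoopRDD`. The launch step can be sketched as:

```shell
# Expose HBase's jars and configuration to spark-shell; `hbase classpath`
# prints the full client classpath on a machine with HBase installed.
./bin/spark-shell --driver-class-path "$(hbase classpath)"
```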
re pointing to an
> external metastore, in my experience I've also had to copy core-site.xml
> into conf in order to specify this property: fs.defaultFS
>
> On Fri, Feb 27, 2015 at 10:39 AM, sandeep vura
> wrote:
>
>> Hi Sparkers,
>>
>> I am using hive versi
Hi Sparkers,
I am using Hive version 0.13, copied hive-site.xml into spark/conf,
and am using the default Derby local metastore.
While creating a table in the Spark shell I get the following error. Can
anyone please take a look and give a solution ASAP?
sqlContext.sql("CREATE TABLE IF NOT EXISTS sandee
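As a point of comparison, a minimal hive-site.xml pointing at a MySQL metastore (rather than the default Derby one) might look like this; host, database name, and credentials are all placeholders:

```xml
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>
```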
Hi Kundan,
Sorry, I am also facing a similar issue today. How did you resolve
it?
Regards,
Sandeep.v
On Thu, Feb 26, 2015 at 2:25 AM, Michael Armbrust
wrote:
> It looks like that is getting interpreted as a local path. Are you
> missing a core-site.xml file to configure hdfs?
>
>
in
> the CLASSPATH. Please check your CLASSPATH specification, and the name of
> the driver.
>
> Cheng
>
> On 2/26/15 8:03 PM, sandeep vura wrote:
>
>Hi Sparkers,
>
> I am trying to create a Hive table in Spark SQL but couldn't
> create it. Belo
0 or Hive 0.13.1.
>
>
> On 2/27/15 12:12 AM, sandeep vura wrote:
>
> Hi Cheng,
>
> Thanks, the above issue has been resolved. I have configured a remote
> metastore, not a local metastore, in Hive.
>
> While creating a table in sparksql another error reflecting on
Hi Sparkers,
I am trying to create a Hive table in Spark SQL but couldn't create
it. Below are the errors generated so far.
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at
org.apac
at 10:53 PM, Deepak Vohra
wrote:
> Or, use the SparkOnHBase lab.
> http://blog.cloudera.com/blog/2014/12/new-in-cloudera-labs-sparkonhbase/
>
> --
> *From:* Ted Yu
> *To:* Akhil Das
> *Cc:* sandeep vura ; "user@spark.apache.org" <
cluster. If you install it on the Spark
> cluster itself, then HBase might take up a few CPU cycles and there's a
> chance for the job to lag.
>
> Thanks
> Best Regards
>
> On Mon, Feb 23, 2015 at 12:48 PM, sandeep vura
> wrote:
>
>> Hi
>>
>> I had
Hi
I have installed Spark on a 3-node cluster. Spark services are up and
running, but I want to integrate HBase with Spark.
Do I need to install HBase on the Hadoop cluster or the Spark cluster?
Please let me know ASAP.
Regards,
Sandeep.v