Hi Sparkers,
I am unable to run spark-sql on Spark. Please find the following error:
Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Regards,
Sandeep.v
Can you provide more information? Such as:
Version of Spark you're using
Command line
Thanks
On Mar 15, 2015, at 9:51 PM, sandeep vura sandeepv...@gmail.com wrote:
Hi Sparkers,
I am unable to run spark-sql on Spark. Please find the following error:
Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
If you are using 127.0.0.1, please check /etc/hosts and either uncomment or
create a 127.0.1.1 entry mapped to localhost.
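For illustration, the kind of /etc/hosts entries the advice above describes (the hostname below is a placeholder; use your machine's actual hostname):

```text
127.0.0.1   localhost
127.0.1.1   your-hostname    # should match the output of `hostname`
```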
On Sat, Mar 21, 2015 at 9:57 AM, Ted Yu yuzhih...@gmail.com wrote:
bq. Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or
service not known
Can you check your /etc/hosts?
*From:* sandeep vura [mailto:sandeepv...@gmail.com]
*Sent:* Monday, March 16, 2015 2:21 PM
*To:* Cheng, Hao
*Cc:* fightf...@163.com; Ted Yu; user
*Subject:* Re: Re: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
I have already added the mysql-connector-xx.jar file in Spark.
Hi Ted,
Did you find any solution?
Thanks
Sandeep
On Mon, Mar 16, 2015 at 10:44 AM, sandeep vura sandeepv...@gmail.com
wrote:
Hi Ted,
I am using Spark 1.2.1 and Hive 0.13.1; you can check my configuration
files attached below.
ERROR IN SPARK
--
fightf...@163.com
*From:* sandeep vura sandeepv...@gmail.com
*Date:* 2015-03-16 14:13
*To:* Ted Yu yuzhih...@gmail.com
*CC:* user@spark.apache.org
*Subject:* Re: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Hi Ted,
Did you find any solution?
*From:* fightf...@163.com [mailto:fightf...@163.com]
*Sent:* Monday, March 16, 2015 2:04 PM
*To:* sandeep vura; Ted Yu
*Cc:* user
*Subject:* Re: Re: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Hi, Sandeep
From your error log I can see that the jdbc driver is missing from the classpath.
Hi,
For creating a Hive table, do I need to add hive-site.xml in the spark/conf
directory?
On Fri, Mar 6, 2015 at 11:12 PM, Michael Armbrust mich...@databricks.com
wrote:
It's not required, but even if you don't have Hive installed you probably
still want to use the HiveContext. From earlier in the thread:
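A sketch of that suggestion for the Spark 1.2.x era discussed in this thread (the table name src is taken from later messages and is illustrative):

```scala
// In spark-shell, sc is already defined. HiveContext works even without a
// Hive installation; with no hive-site.xml it falls back to a local Derby
// metastore created in the working directory.
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)
sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
```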
Hi Sparkers,
Can anyone please check the below error and suggest a solution? I am
using Hive 0.13 and Spark 1.2.1.
Step 1 : I have installed hive 0.13 with local metastore (mySQL database)
Step 2: Hive is running without any errors and is able to create tables and
load data.
src (key INT, value STRING))
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Cheers,
Sandeep.v
On Wed, Mar 25, 2015 at 11:10 AM, sandeep vura sandeepv...@gmail.com
wrote:
No I am just running ./spark-shell
Hi Sparkers,
I am trying to load data in spark with the following command
sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt'
INTO TABLE src")
Getting the exception below:
Server IPC version 9 cannot communicate with client version 4
Note: I am using Hadoop 2.2
I run my spark-shell instance in standalone mode, I use:
./spark-shell --master spark://servername:7077 --driver-class-path
/lib/mysql-connector-java-5.1.27.jar
On Fri, Mar 13, 2015 at 8:31 AM sandeep vura sandeepv...@gmail.com
wrote:
Hi Sparkers,
Can anyone please check the below error
, Saisai Shao sai.sai.s...@gmail.com
wrote:
Looks like you have to build Spark against the matching Hadoop version,
otherwise you will hit the exception as mentioned. You could follow this doc:
http://spark.apache.org/docs/latest/building-spark.html
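Per that doc, the Maven profile and hadoop.version should agree; a sketch for a Hadoop 2.2 cluster (adjust to your own version):

```shell
# Spark 1.2.x build against Hadoop 2.2 with YARN support.
# Note: use the -Phadoop-2.2 profile (not -Phadoop-2.4) when the cluster
# runs Hadoop 2.2.x.
mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
```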
2015-03-25 15:22 GMT+08:00 sandeep vura sandeepv...@gmail.com:
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.2 -DskipTests clean package
Thanks
Best Regards
On Wed, Mar 25, 2015 at 3:08 PM, sandeep vura sandeepv...@gmail.com
wrote:
Where do I export MAVEN_OPTS: in spark-env.sh or hadoop-env.sh?
I am running the below command in spark/yarn
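For what it's worth, MAVEN_OPTS goes in the shell session that runs mvn, not in spark-env.sh or hadoop-env.sh; the values below are the ones suggested by the Spark 1.x build docs:

```shell
# Run this in the same shell, before invoking mvn. spark-env.sh and
# hadoop-env.sh are read by the runtime scripts, not by the Maven build.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
```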
, 2015 at 5:34 PM, sandeep vura sandeepv...@gmail.com
wrote:
The build failed with the following errors.
I executed the following command:
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package
[INFO
Hi Kundan,
Sorry, I am also facing a similar issue today. How did you resolve
it?
Regards,
Sandeep.v
On Thu, Feb 26, 2015 at 2:25 AM, Michael Armbrust mich...@databricks.com
wrote:
It looks like that is getting interpreted as a local path. Are you
missing a core-site.xml file?
metastore, in my experience I've also had to copy core-site.xml
into conf in order to specify this property: <name>fs.defaultFS</name>
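A minimal core-site.xml sketch for spark/conf along those lines (the NameNode host and port are assumptions; use your cluster's values):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- assumption: replace with your NameNode address -->
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```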
On Fri, Feb 27, 2015 at 10:39 AM, sandeep vura sandeepv...@gmail.com
wrote:
Hi Sparkers,
I am using hive version - hive 0.13 and copied hive-site.xml in spark/conf
and using default derby local metastore .
While creating a table in spark-shell I am getting the following error. Can
anyone please take a look and give a solution ASAP?
sqlContext.sql("CREATE TABLE IF NOT EXISTS
Hi Sparkers,
How do I integrate HBase with Spark?
I'd appreciate any replies!
Regards,
Sandeep.v
at 10:53 PM, Deepak Vohra dvohr...@yahoo.com.invalid
wrote:
Or, use the SparkOnHBase lab.
http://blog.cloudera.com/blog/2014/12/new-in-cloudera-labs-sparkonhbase/
--
*From:* Ted Yu yuzhih...@gmail.com
*To:* Akhil Das ak...@sigmoidanalytics.com
*Cc:* sandeep vura
Hi
I have installed Spark on a 3-node cluster. Spark services are up and
running, but I want to integrate HBase with Spark.
Do I need to install HBase on the Hadoop cluster or the Spark cluster?
Please let me know ASAP.
Regards,
Sandeep.v
it on the Hadoop cluster. If you install it on the Spark
cluster itself, then HBase might take up a few CPU cycles and there's a
chance for the job to lag.
Thanks
Best Regards
On Mon, Feb 23, 2015 at 12:48 PM, sandeep vura sandeepv...@gmail.com
wrote:
Hi
I had installed spark on 3 node cluster
Spark SQL supports Hive 0.12.0 or Hive 0.13.1.
On 2/27/15 12:12 AM, sandeep vura wrote:
Hi Cheng,
Thanks, the above issue has been resolved. I have configured a remote
metastore rather than a local metastore in Hive.
While creating a table in SparkSQL, another error appears on the terminal,
given below:
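For reference, a remote-metastore hive-site.xml sketch of the setup described above (the host is an assumption; 9083 is the conventional metastore Thrift port):

```xml
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <!-- assumption: replace with your metastore host -->
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```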
/15 8:03 PM, sandeep vura wrote:
Hi Sparkers,
I am trying to create a Hive table in SparkSQL but am unable to
do so. Below are the errors generated so far.
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate
Hi Sparkers,
I am trying to create a Hive table in SparkSQL but am unable to do
so. Below are the errors generated so far.
java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at
Hi Sparkers,
I have written code in Python in Eclipse; now that code should execute on a
Spark cluster, like MapReduce jobs on a Hadoop cluster. Can anyone please help
me with instructions?
Regards,
Sandeep.v
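A sketch of how a Python script is typically submitted to a standalone Spark cluster (the master URL and script path are assumptions):

```shell
# spark-submit ships the script to the cluster for execution, much like
# submitting a MapReduce job on Hadoop.
./bin/spark-submit \
  --master spark://master-host:7077 \
  /path/to/your_script.py
```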
Thanks a lot, Akhil
On Mon, Jul 6, 2015 at 12:57 PM, sandeep vura sandeepv...@gmail.com wrote:
It Works !!!
On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura sandeepv...@gmail.com
wrote:
oK Let me try
On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Its
It Works !!!
On Mon, Jul 6, 2015 at 12:40 PM, sandeep vura sandeepv...@gmail.com wrote:
oK Let me try
On Mon, Jul 6, 2015 at 12:38 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
It's complaining about a missing jdbc driver. Add it to your driver classpath like:
./bin/spark-sql --driver-class-path
Hi Sparkers,
I am unable to start the spark-sql service; please check the error mentioned
below.
Exception in thread "main" java.lang.RuntimeException:
java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at
On Mon, Jul 6, 2015 at 11:42 AM, sandeep vura sandeepv...@gmail.com
wrote:
Hi Sparkers,
I am unable to start the spark-sql service; please check the error
mentioned below.
Exception in thread "main" java.lang.RuntimeException:
java.lang.RuntimeException: Unable to instantiate
Upgrade to CDH 5.5 for Spark. It should work.
On Sat, Jan 9, 2016 at 12:17 AM, Ophir Etzion wrote:
> It didn't work, assuming I did the right thing.
> in the properties you could see
--
Sandeep V