"org.apache.hbase" % "hbase" % "0.98.9-hadoop2" % "provided",

There is no jar module in HBase 0.98.9 called "hbase". That said, this would
not be the root cause of the error.
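
For reference, a minimal sketch of the dependency list with that artifact
dropped (hbase-protocol added here on the assumption you want it at compile
time as well, since you already put it on the driver classpath; adjust the
versions to match your cluster):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.0",
  // hbase-client / -common / -server / -protocol are the jar modules in 0.98.x
  "org.apache.hbase" % "hbase-client"   % "0.98.9-hadoop2" % "provided",
  "org.apache.hbase" % "hbase-common"   % "0.98.9-hadoop2" % "provided",
  "org.apache.hbase" % "hbase-server"   % "0.98.9-hadoop2" % "provided",
  "org.apache.hbase" % "hbase-protocol" % "0.98.9-hadoop2" % "provided"
)

In 0.98.x, org.apache.hbase:hbase is the parent pom rather than a jar, which
is why sbt cannot resolve it as a library dependency.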

Most likely hbase-site.xml was not picked up, meaning this is a classpath
issue.
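
To check, here is a quick sketch you could drop into HBaseTest.scala right
after the configuration is created (these are standard HBase config keys):

import org.apache.hadoop.hbase.HBaseConfiguration

val conf = HBaseConfiguration.create()
// With hbase-site.xml on the classpath these print your cluster's settings;
// seeing the defaults ("/hbase", "localhost") means the file was not found.
println("zookeeper.znode.parent = " + conf.get("zookeeper.znode.parent"))
println("hbase.zookeeper.quorum = " + conf.get("hbase.zookeeper.quorum"))

Note that in yarn-cluster mode the driver runs inside the cluster, so
hbase-site.xml placed on the client machine may never be seen; shipping it
with spark-submit --files hbase-site.xml usually lands it on the container
classpath.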

On Sun, Mar 15, 2015 at 10:04 AM, HARIPRIYA AYYALASOMAYAJULA <
aharipriy...@gmail.com> wrote:

> Hello all,
>
> Thank you for your responses. I tried including the zookeeper.znode.parent
> property in hbase-site.xml, but it still gives the same error.
>
> I am using Spark 1.2.0 and HBase 0.98.9.
>
> Could you please suggest what else could be done?
>
>
> On Fri, Mar 13, 2015 at 10:25 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> In HBaseTest.scala:
>>     val conf = HBaseConfiguration.create()
>> You can add some log (for zookeeper.znode.parent, e.g.) to see if the
>> values from hbase-site.xml are picked up correctly.
>>
>> Please use pastebin next time you want to post errors.
>>
>> Which Spark release are you using?
>> I assume it contains SPARK-1297.
>>
>> Cheers
>>
>> On Fri, Mar 13, 2015 at 7:47 PM, HARIPRIYA AYYALASOMAYAJULA <
>> aharipriy...@gmail.com> wrote:
>>
>>>
>>> Hello,
>>>
>>> I am running an HBase test case, using the example from the following:
>>>
>>> https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/HBaseTest.scala
>>>
>>> I created a very small HBase table with 5 rows and 2 columns.
>>> I have attached a screenshot of the error log. I believe the problem is
>>> that the driver program is unable to establish a connection to HBase.
>>>
>>> The following is my simple.sbt:
>>>
>>> name := "Simple Project"
>>>
>>> version := "1.0"
>>>
>>> scalaVersion := "2.10.4"
>>>
>>> libraryDependencies ++= Seq(
>>>   "org.apache.spark" %% "spark-core" % "1.2.0",
>>>   "org.apache.hbase" % "hbase" % "0.98.9-hadoop2" % "provided",
>>>   "org.apache.hbase" % "hbase-client" % "0.98.9-hadoop2" % "provided",
>>>   "org.apache.hbase" % "hbase-server" % "0.98.9-hadoop2" % "provided",
>>>   "org.apache.hbase" % "hbase-common" % "0.98.9-hadoop2" % "provided"
>>> )
>>>
>>> I am using a 23-node cluster. I copied hbase-site.xml into the /spark/conf
>>> folder and set spark.executor.extraClassPath to point to the /hbase/ folder
>>> in spark-defaults.conf.
>>>
>>> Also, while submitting the Spark job I am including the required jars:
>>>
>>> spark-submit --class "HBaseTest" --master yarn-cluster \
>>>   --driver-class-path /opt/hbase/0.98.9/lib/hbase-server-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-protocol-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-client-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-common-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/htrace-core-2.04.jar \
>>>   /home/priya/usingHBase/Spark/target/scala-2.10/simple-project_2.10-1.0.jar \
>>>   /Priya/sparkhbase-test1
>>>
>>> It would be great if you could point out where I am going wrong and what
>>> could be done to correct it.
>>>
>>> Thank you for your time.
>>> --
>>> Regards,
>>> Haripriya Ayyalasomayajula
>>> Graduate Student
>>> Department of Computer Science
>>> University of Houston
>>> Contact : 650-796-7112
>>>
>>>
>>
>>
>
>
> --
> Regards,
> Haripriya Ayyalasomayajula
> Graduate Student
> Department of Computer Science
> University of Houston
> Contact : 650-796-7112
>
