Re: Problem connecting to HBase
Hello Ted,

Yes, I understand what you are suggesting, but I am unable to figure out where I am going wrong. Could you please point out which locations I should look at to find and correct the mistake? I greatly appreciate your help!

On Sun, Mar 15, 2015 at 1:10 PM, Ted Yu yuzhih...@gmail.com wrote:

  org.apache.hbase % hbase % 0.98.9-hadoop2 % provided,

There is no module in HBase 0.98.9 called hbase, but this would not be the root cause of the error. Most likely hbase-site.xml was not picked up, meaning this is a classpath issue.

On Sun, Mar 15, 2015 at 10:04 AM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote:

Hello all,

Thank you for your responses. I did try to include the zookeeper.znode.parent property in hbase-site.xml, but it still gives the same error. I am using Spark 1.2.0 and HBase 0.98.9. Could you please suggest what else could be done?

On Fri, Mar 13, 2015 at 10:25 PM, Ted Yu yuzhih...@gmail.com wrote:

In HBaseTest.scala:

  val conf = HBaseConfiguration.create()

You can add some logging (for zookeeper.znode.parent, e.g.) to see if the values from hbase-site.xml are picked up correctly. Please use pastebin next time you want to post errors.

Which Spark release are you using? I assume it contains SPARK-1297.

Cheers

On Fri, Mar 13, 2015 at 7:47 PM, HARIPRIYA AYYALASOMAYAJULA aharipriy...@gmail.com wrote:

Hello,

I am running an HBase test case, using the example from the following: https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/HBaseTest.scala

I created a very small HBase table with 5 rows and 2 columns. I have attached a screenshot of the error log. I believe the problem is that the driver program is unable to establish a connection to HBase.
The following is my simple.sbt:

  name := "Simple Project"

  version := "1.0"

  scalaVersion := "2.10.4"

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.2.0",
    "org.apache.hbase" % "hbase" % "0.98.9-hadoop2" % "provided",
    "org.apache.hbase" % "hbase-client" % "0.98.9-hadoop2" % "provided",
    "org.apache.hbase" % "hbase-server" % "0.98.9-hadoop2" % "provided",
    "org.apache.hbase" % "hbase-common" % "0.98.9-hadoop2" % "provided"
  )

I am using a 23-node cluster. I copied hbase-site.xml into the /spark/conf folder and set spark.executor.extraClassPath pointing to the /hbase/ folder in spark-defaults.conf. Also, while submitting the Spark job I am including the required jars:

  spark-submit --class HBaseTest --master yarn-cluster \
    --driver-class-path /opt/hbase/0.98.9/lib/hbase-server-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-protocol-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-client-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/hbase-common-0.98.9-hadoop2.jar:/opt/hbase/0.98.9/lib/htrace-core-2.04.jar \
    /home/priya/usingHBase/Spark/target/scala-2.10/simple-project_2.10-1.0.jar /Priya/sparkhbase-test1

It would be great if you could point out where I am going wrong and what could be done to correct it. Thank you for your time.

--
Regards,
Haripriya Ayyalasomayajula
Graduate Student
Department of Computer Science
University of Houston
Contact: 650-796-7112

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
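Since the suspicion in this thread is that hbase-site.xml never reaches the driver's classpath, that can be verified directly before involving HBase at all. A minimal sketch (standard library only; the object name ClasspathCheck is made up for illustration) that reports whether the file is visible to the JVM when run with the same classpath you pass to spark-submit:

```scala
// Sketch: HBaseConfiguration.create() loads hbase-site.xml as a classpath
// resource, so a plain resource lookup shows whether it will be found.
object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    val url = getClass.getClassLoader.getResource("hbase-site.xml")
    if (url == null)
      println("hbase-site.xml NOT on the classpath; HBase will fall back to defaults")
    else
      println(s"hbase-site.xml found at: $url")
  }
}
```

If this prints the "NOT on the classpath" line under the spark-submit classpath above, the fix is to add the directory containing hbase-site.xml (not the file itself) to --driver-class-path and spark.executor.extraClassPath.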
Re: Problem connecting to HBase
Hello all,

Thank you for your responses. I did try to include the zookeeper.znode.parent property in hbase-site.xml, but it still gives the same error. I am using Spark 1.2.0 and HBase 0.98.9. Could you please suggest what else could be done?

--
Regards,
Haripriya Ayyalasomayajula
Graduate Student
Department of Computer Science
University of Houston
Contact: 650-796-7112
Re: Problem connecting to HBase
  org.apache.hbase % hbase % 0.98.9-hadoop2 % provided,

There is no module in HBase 0.98.9 called hbase, but this would not be the root cause of the error. Most likely hbase-site.xml was not picked up, meaning this is a classpath issue.
Re: Problem connecting to HBase
Hi, there

You may want to check your HBase config. E.g. the value of the following property can be changed to /hbase:

  <property>
    <name>zookeeper.znode.parent</name>
    <value>/hbase-unsecure</value>
  </property>

fightf...@163.com

From: HARIPRIYA AYYALASOMAYAJULA
Date: 2015-03-14 10:47
To: user
Subject: Problem connecting to HBase
[1 attachment: Screen Shot 2015-03-13 at 2.08.27 PM.png (131K)]
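Spelled out, the suggestion above amounts to making the client-side hbase-site.xml match the znode the cluster actually uses (commonly /hbase for stock installs, /hbase-unsecure on some distributions). A sketch of the relevant fragment; the ZooKeeper host names are placeholders:

```xml
<configuration>
  <!-- ZooKeeper ensemble the HBase client should contact -->
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>zk-host-1,zk-host-2,zk-host-3</value>
  </property>
  <!-- Must match the parent znode the HBase cluster registered under -->
  <property>
    <name>zookeeper.znode.parent</name>
    <value>/hbase</value>
  </property>
</configuration>
```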
Re: Problem connecting to HBase
In HBaseTest.scala:

  val conf = HBaseConfiguration.create()

You can add some logging (for zookeeper.znode.parent, e.g.) to see if the values from hbase-site.xml are picked up correctly. Please use pastebin next time you want to post errors.

Which Spark release are you using? I assume it contains SPARK-1297.

Cheers
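Ted's logging suggestion, written out as a sketch against the HBase 0.98 client API (the object name ZnodeCheck is invented here, and the snippet needs the hbase-client/hbase-common jars on the classpath, so it is illustrative rather than standalone):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration

object ZnodeCheck {
  def main(args: Array[String]): Unit = {
    // create() merges hbase-default.xml with any hbase-site.xml found
    // on the classpath.
    val conf = HBaseConfiguration.create()
    // If this prints the default (/hbase) while the cluster registers under
    // a different znode (e.g. /hbase-unsecure), hbase-site.xml was not
    // picked up and the connection will hang or fail.
    println("zookeeper.znode.parent = " + conf.get("zookeeper.znode.parent"))
    println("hbase.zookeeper.quorum = " + conf.get("hbase.zookeeper.quorum"))
  }
}
```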