Hi! Unfortunately, the hbase-spark module is not yet available in an Apache HBase release.

The JIRA tracking the effort to get the hbase-spark module working with branch-1 and into a release is HBASE-14160. In the meantime, I believe the "Spark on HBase" project, which was the start of that effort, is your best option: https://github.com/tmalaska/SparkOnHBase
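
If it helps, here is a rough sketch of a bulkPut modeled on the HBaseBulkPutExample in the hbase-spark example directory you linked. The table name "t1", the column family "cf1", and the sample records are placeholders, and if you go with the older SparkOnHBase project instead, the bulkPut signature differs slightly (the table name is a String and there is an extra autoFlush flag, if I remember correctly):

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.{SparkConf, SparkContext}

object BulkPutSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HBaseBulkPutSketch"))
    try {
      // (rowKey, qualifier, value) triples to write; purely sample data.
      val rdd = sc.parallelize(Seq(
        ("row1", "a", "value1"),
        ("row2", "b", "value2")))

      // HBaseContext ships the HBase configuration out to the executors.
      val hbaseContext = new HBaseContext(sc, HBaseConfiguration.create())

      // bulkPut maps each record to a Put and writes it to the table.
      hbaseContext.bulkPut[(String, String, String)](
        rdd,
        TableName.valueOf("t1"),          // placeholder table name
        record => {
          val put = new Put(Bytes.toBytes(record._1))
          put.addColumn(Bytes.toBytes("cf1"),   // placeholder column family
            Bytes.toBytes(record._2), Bytes.toBytes(record._3))
          put
        })
    } finally {
      sc.stop()
    }
  }
}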
On Sat, Dec 12, 2015 at 8:45 AM, Germain tanguy <[email protected]> wrote:

> Hello everybody,
>
> I would like to use bulkPut, bulkGet, etc., and SparkSQL with HBase.
>
> I have already read the documentation: HBase and Spark
> <https://hbase.apache.org/book.html#spark>.
>
> I also looked at the examples on GitHub:
> hbase/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/example
> <https://github.com/apache/hbase/tree/master/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/example>
>
> I tried bulkGet and reading from HBase with SparkSQL, but I couldn't make
> either work.
>
> Has anyone managed to make this work? If so, can you give me some tips and
> the versions of Scala, Spark, HBase, and Hadoop that you used?
>
> My versions:
>
> - Spark: 1.5.2
> - HBase: 1.1.2
> - Scala: 2.11.4
> - Hadoop: 2.7.1
>
> Extract of my build.sbt:
>
> libraryDependencies ++= Seq(
>   "org.apache.hbase" % "hbase-server" % "1.1.2" excludeAll
>     ExclusionRule(organization = "org.mortbay.jetty"),
>   "org.apache.hbase" % "hbase-common" % "1.1.2" excludeAll
>     ExclusionRule(organization = "javax.servlet"),
>   "org.apache.spark" %% "spark-core" % "1.5.2",
>   "org.apache.spark" %% "spark-sql" % "1.5.2",
>   "org.apache.hbase" % "hbase-spark" % "2.0.0-SNAPSHOT"
> )
>
> Best regards,
>
> Germain Tanguy.

--
Sean
