Hi Masf,

Do try the official HBase-Spark connector:
https://hbase.apache.org/book.html#spark

I think you will have to build the jar from source and run your Spark
program with --jars <path-to-jar> (note that --packages expects Maven
coordinates, not a jar path).
https://spark-packages.org/package/hortonworks-spark/shc says it's not yet
published to Spark Packages or the Maven repo.
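
For reference, the build-and-submit flow might look roughly like this. This is a sketch, not tested commands: shc builds with Maven, but the exact artifact name under target/ and the class name com.yourorg.YourApp are placeholders you'd replace with your own.

```shell
# Build the connector jar from source (shc is a Maven project)
git clone https://github.com/hortonworks-spark/shc.git
cd shc
mvn package -DskipTests

# Submit your Spark app with the locally built jar on the classpath
# (the jar filename below is illustrative -- check target/ for the actual name)
spark-submit \
  --class com.yourorg.YourApp \
  --jars target/shc-1.0.0-1.6-s_2.10.jar \
  your-app-assembly.jar
```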


Thanks
Sudev


On Sun, 29 Jan 2017 at 5:53 PM, Masf <masfwo...@gmail.com> wrote:

I'm trying to build an application where it is necessary to do bulk gets and
bulk loads on HBase.

I think that I could use this component:
https://github.com/hortonworks-spark/shc
*Is it a good option?*

But *I can't import it in my project*; sbt cannot resolve the HBase
connector.
This is my build.sbt:

version := "1.0"
scalaVersion := "2.10.6"

mainClass in assembly := Some("com.location.userTransaction")

assemblyOption in assembly ~= { _.copy(includeScala = false) }

resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"

val sparkVersion = "1.6.0"
val jettyVersion = "8.1.14.v20131031"
val hbaseConnectorVersion = "1.0.0-1.6-s_2.10"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
)
libraryDependencies += "com.hortonworks" % "shc" % hbaseConnectorVersion
libraryDependencies += "org.eclipse.jetty" % "jetty-client" % jettyVersion


-- 


Regards.
Miguel Ángel
