System.setProperty("java.security.krb5.conf",
  config.getJSONObject("auth").getString("krb5"))
val conf = HBaseConfiguration.create()
val zookeeper = config.getString("zookeeper")
val port = config.getString("port")
conf.set(HConstants.ZOOKEEPER_QUORUM, zookeeper)
// `port` was read but never applied; the ZooKeeper client port must be set too:
conf.set(HConstants.ZOOKEEPER_CLIENT_PORT, port)
Dear All,

I saw there is an hbase-spark module in the HBase code, and there is a
JIRA for it: https://issues.apache.org/jira/browse/HBASE-13992
That JIRA says the hbase-spark module code initially came from
https://github.com/cloudera-labs/SparkOnHBase
And in another discussion list it's
The hbase-spark module in the HBase project (which hasn't yet made it
into a release) is FWICT the eventual replacement for both the
Cloudera Labs SparkOnHBase and the Hortonworks SHC.
The code in the hbase-spark module started as an update of the
SparkOnHBase code and then quickly expanded via
Replace the separator with ":" (i.e. use "namespace:tablename" as the table name).
Regards,
Dhaval Modi
On 19 December 2016 at 13:10, Rabin Banerjee <dev.rabin.baner...@gmail.com>
wrote:
> Hi All,
>
> I am trying to save data from Spark into HBase using the saveAsHadoopDataset
> API. Please refer to the code below. The code is working fine
Thanks, it worked!
On Mon, Dec 19, 2016 at 5:55 PM, Dhaval Modi <dhavalmod...@gmail.com> wrote:
Hi All,

I am trying to save data from Spark into HBase using the saveAsHadoopDataset
API. Please refer to the code below. The code is working fine, but the table
is getting stored in the default namespace. How do I set the namespace in the
code below?

wordCounts.foreachRDD { rdd =>
  val conf
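For what it's worth, the fix that resolved this thread (Dhaval's reply below) amounts to qualifying the table name with its namespace, separated by ":". A minimal sketch; the helper name and the namespace/table values are hypothetical, and the HBase job wiring is shown only in comments:

```scala
// HBase routes an unqualified table name to the "default" namespace.
// Prefixing the name with "<namespace>:" targets another namespace.
def qualifiedTableName(namespace: String, table: String): String =
  s"$namespace:$table"

val outputTable = qualifiedTableName("myns", "wordcounts")
println(outputTable) // myns:wordcounts

// In the saveAsHadoopDataset job configuration this value would replace
// the bare table name, e.g.:
//   jobConf.set(TableOutputFormat.OUTPUT_TABLE, outputTable)
```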
View this message in context: http://apache-hbase.679495.n3.nabble.com/Issues-with-Spark-On-Hbase-Connector-tp4082151p4082162.html
Sent from the HBase User mailing list archive at Nabble.com.
Thanks Sachin.
So it won't work with hbase 1.2.0 even if we use your code from shc branch?
FYI

On Sun, Aug 28, 2016 at 6:45 PM, spats <spatil.sud...@gmail.com> wrote:

Regarding the HBase connector by Hortonworks
(https://github.com/hortonworks-spark/shc), it would be great if someone can
answer these:

1. What versions of HBase & Spark are expected? I could not run the examples
provided using Spark 1.6.0 & HBase 1.2.0.
2. I get an error when I run the example provided here; any pointers on what
I am doing wrong? It looks like Spark is not reading hb
1.3, so I'd be interested
in getting feedback.
On Fri, Apr 22, 2016 at 7:01 PM, sudhir patil <spatil.sud...@gmail.com> wrote:
Connecting to kerberized HBase from Spark was fixed in Spark 1.4; I don't
think 1.3 works because of Kerberos issues:
https://issues.apache.org/jira/plugins/servlet/mobile#issue/SPARK-6918
On Apr 23, 2016 5:35 AM, "Sean Busbey" <bus...@cloudera.com> wrote:
The HBase-Downstreamer project has an example that uses the Java API
for Spark Streaming on a secure cluster:
https://github.com/saintstack/hbase-downstreamer#spark-streaming-test-application
https://s.apache.org/apvQ
We'd greatly like a Scala version.
On Fri, Apr 22, 2016 at 4:16 PM, Nkechi Achara <nkach...@googlemail.com> wrote:
Are you attempting to use Spark's Java API or its Scala API? (or python, etc?)
On Fri, Apr 22, 2016 at 2:24 PM, Nkechi Achara <nkach...@googlemail.com> wrote:
Hi,

I am attempting to use both SparkOnHBase and hbase-spark, but I keep
receiving dependency issues, and I am not sure if any of these connectors
are available for Spark 1.3 / Hadoop 2.6 / HBase 1.0.
Has anyone got a newer library I can use, or a concrete example to follow?

Thanks,
Keech
Forwarding you these mails; hope they can help you. You can also take a look
at this post:
http://www.abcn.net/2014/07/lighting-spark-with-hbase-full-edition.html
2016-03-04 3:30 GMT+01:00 Divya Gehlot <divya.htco...@gmail.com>:
> Hi Teng,
>
> Thanks for the link you shared, helpe
Thank you for the reply.

I am having trouble finding the dependency in a Maven repository; the only
one I can find is

<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-spark</artifactId>
  <version>1.2.0-cdh5.7.0</version>
</dependency>

from the Cloudera Maven repository. The dependency specified on this page
could not be resolved:
http://hbase.apache.org/h
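A likely cause of the resolution failure: `1.2.0-cdh5.7.0` is published in Cloudera's repository, not Maven Central, so the pom needs that repository declared. A sketch of the declaration (treat the URL as an assumption to verify against Cloudera's docs):

```xml
<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>
```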
There are some outstanding bug fixes, e.g. HBASE-15333, for hbase-spark
module.
FYI
On Tue, Apr 5, 2016 at 2:36 PM, Nkechi Achara <nkach...@googlemail.com>
wrote:
So hbase-spark is a continuation of the SparkOnHBase project, but within
the HBase project.
There are no significant differences, apart from the fact that SparkOnHBase
is no longer updated.
Depending on the version you are using, it would be more beneficial to use
hbase-spark.

Kay
I have a Cloudera cluster and I am exploring Spark with HBase.
After going through this blog:
http://blog.cloudera.com/blog/2014/11/how-to-do-near-real-time-sessionization-with-spark-streaming-and-apache-hadoop/
I found two options for using Spark with HBase:
Cloudera's Spark on HBase or
Apache
Hello Rachana!

I use HBase with Spark Streaming. My solution is to create a singleton
holding an HConnection object in each JVM and refer to it from
foreachPartition(), creating the table connection (which is cheap, according
to the HBase documentation) for each streaming iteration. I close connections
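The per-JVM singleton described above can be sketched with a plain Scala `object`, whose lazy field is initialized at most once per JVM (i.e. once per Spark executor). The HBase connection itself is stubbed out with a string here, and all names are hypothetical:

```scala
// Hypothetical sketch of the per-JVM connection-singleton pattern.
// A lazy val inside an object is initialized at most once per JVM,
// so every task running on the same executor reuses one connection.
object ConnectionHolder {
  var initCount = 0                 // illustration only: counts initializations
  lazy val connection: String = {   // stands in for a real HConnection
    initCount += 1
    "connected"
  }
}

// Inside foreachPartition each task would read ConnectionHolder.connection;
// only the first access on a given executor pays the setup cost.
val uses = (1 to 3).map(_ => ConnectionHolder.connection)
println(ConnectionHolder.initCount) // 1
```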
Apart from the Phoenix Spark connector, you can also have a look at:
https://github.com/Huawei-Spark/Spark-SQL-on-HBase

On Wed, Mar 9, 2016 at 4:58 PM, Divya Gehlot <divya.htco...@gmail.com>
wrote:

> I agree with Talat, as we couldn't connect directly with HBase.
> Connecting it
--
busbey
Since I cannot get the connection and broadcast it, each API call to get
data from HBase is very expensive. I tried using JavaHBaseContext
(JavaHBaseContext hbaseContext = new JavaHBaseContext(jsc, conf)) via the
hbase-spark library in CDH 5.5, but I cannot import the library from Maven.
Has anyone been able to successfully resolve this issue?
Hi,

I am getting an error when I try to connect to a Hive table (created
through HBaseIntegration) from Spark.

Steps I followed:

*Hive table creation code*:

CREATE EXTERNAL TABLE IF NOT EXISTS TEST(NAME STRING, AGE INT)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH