Any update on this?
On Tuesday, 16 August 2016, Aneela Saleem wrote:
> Thanks Steve,
>
> I have gone through its documentation, but I did not get any idea how to
> install it. Can you help me?
>
> On Mon, Aug 15, 2016 at 4:23 PM, Steve Loughran wrote:
>
> On 15 Aug 2016, at 08:29, Aneela Saleem wrote:
>
Thanks Jacek!
I have already set the hbase.security.authentication property to kerberos,
since HBase with Kerberos is working fine.
I tested again after correcting the typo but got the same error. Following is
the code, please have a look:
System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
Hi Aneela,
My (little to no) understanding of how to make it work is to set the
hbase.security.authentication property to kerberos (see [1]).
Spark on YARN uses it to get the tokens for Hive, HBase et al. (see
[2]). It happens when the Client starts a conversation with the YARN RM (see [3]).
You should not
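For reference, the property mentioned above is normally read from an
hbase-site.xml on the client classpath. A minimal sketch of the relevant
entries follows; the exact file location and any additional properties depend
on your distribution, so treat this as illustrative only:

```xml
<!-- hbase-site.xml (illustrative sketch): the entries the Kerberos
     integration relies on. Adjust to your cluster's actual setup. -->
<configuration>
  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>
</configuration>
```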
Thanks for your response, Jacek!
Here is the code showing how Spark accesses HBase:
System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
System.setProperty("java.security.auth.login.config",
"/etc/hbase/conf/zk-jaas.conf");
val hconf = HBaseConfiguration.create()
val tableName = "emp"
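Since the snippet points java.security.auth.login.config at
/etc/hbase/conf/zk-jaas.conf, it may help to show the shape that file usually
takes for a Kerberos ZooKeeper login. This is a sketch only; the principal and
keytab path below are placeholders, not values from this thread:

```
// zk-jaas.conf (illustrative sketch): JAAS login section used by the
// ZooKeeper client inside the HBase client libraries.
// keyTab and principal are placeholders -- substitute your own.
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  useTicketCache=false
  keyTab="/path/to/your.keytab"
  principal="your-principal@YOUR.REALM";
};
```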
Hi,
How do you access HBase? What's the version of Spark?
(I don't see spark packages in the stack trace)
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On
Hi all,
I'm trying to run a Spark job that accesses HBase with security enabled.
When I run the following command:
/usr/local/spark-2/bin/spark-submit --keytab /etc/hadoop/conf/spark.keytab
--principal spark/hadoop-master@platalyticsrealm --class
com.platalytics.example.spark.App --master yarn
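The command above is cut off in the archive. For context, an invocation of
this shape might look like the following sketch. The application jar path and
the HBase jar names are placeholders (not from the thread); the point, echoing
the question about HBase classes missing from the stack trace, is that the
HBase client jars and hbase-site.xml must be visible to both the driver and
the executors:

```shell
# Illustrative sketch only: jar paths and names below are placeholders.
# --jars ships the HBase client classes to driver and executors;
# --files ships hbase-site.xml so hbase.security.authentication=kerberos
# is seen when delegation tokens are requested.
/usr/local/spark-2/bin/spark-submit \
  --keytab /etc/hadoop/conf/spark.keytab \
  --principal spark/hadoop-master@platalyticsrealm \
  --class com.platalytics.example.spark.App \
  --master yarn \
  --jars /usr/local/hbase/lib/hbase-client.jar,/usr/local/hbase/lib/hbase-common.jar \
  --files /etc/hbase/conf/hbase-site.xml \
  /path/to/app.jar
```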