[
https://issues.apache.org/jira/browse/HBASE-1697?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13280832#comment-13280832
]
Laxman commented on HBASE-1697:
-------------------------------
Thanks for the info, Andrew. I'm discussing this issue with Eugene.
(ZOOKEEPER-1467)
We got stuck on another problem in HBase client authentication.
The client is not able to establish a connection with the HBase server.
The exception we got:
{noformat}
2012-05-22 09:42:22,627 WARN org.apache.hadoop.ipc.SecureClient: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2012-05-22 09:42:22,627 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:testuser (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2012-05-22 09:42:22,630 DEBUG org.apache.hadoop.ipc.SecureClient: closing ipc connection to HOST-10-18-40-19/10.18.40.19:60020: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection$1.run(SecureClient.java:227)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
    at org.apache.hadoop.hbase.security.User.call(User.java:586)
    at org.apache.hadoop.hbase.security.User.access$700(User.java:50)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:440)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.handleSaslConnectionFailure(SecureClient.java:194)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:274)
    at org.apache.hadoop.hbase.ipc.SecureClient.getConnection(SecureClient.java:485)
    at org.apache.hadoop.hbase.ipc.SecureClient.getConnection(SecureClient.java:69)
    at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:897)
    at org.apache.hadoop.hbase.ipc.SecureRpcEngine$Invoker.invoke(SecureRpcEngine.java:164)
    at $Proxy6.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.hbase.ipc.SecureRpcEngine.getProxy(SecureRpcEngine.java:208)
    at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:303)
    at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:280)
    at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:332)
    at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:236)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1284)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1240)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1227)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:936)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:832)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:801)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:933)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:836)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:801)
    at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
    at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
    at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
    at hbase.test.Hbasetest.main(Hbasetest.java:37)
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:194)
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:138)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupSaslConnection(SecureClient.java:176)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.access$500(SecureClient.java:84)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection$2.run(SecureClient.java:267)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection$2.run(SecureClient.java:264)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
    at org.apache.hadoop.hbase.security.User.call(User.java:586)
    at org.apache.hadoop.hbase.security.User.access$700(User.java:50)
    at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:440)
    at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:263)
    ... 23 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:130)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:106)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:172)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:209)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:195)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:162)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:175)
    ... 40 more
2012-05-22 09:42:22,636 DEBUG org.apache.hadoop.ipc.SecureClient: IPC Client (1778276127) connection to HOST-10-18-40-19/10.18.40.19:60020 from testuser: closed
2012-05-22 09:42:22,638 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: locateRegionInMeta parentTable=-ROOT-, metaLocation={region=-ROOT-,,0.70236052, hostname=HOST-10-18-40-19, port=60020}, attempt=0 of 120 failed; retrying after sleep of 1000 because: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2012-05-22 09:42:22,640 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@6ecf829d; serverName=HOST-10-18-40-19,60020,1337574445438
2012-05-22 09:42:23,641 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@6ecf829d; serverName=HOST-10-18-40-19,60020,1337574445438
2012-05-22 09:42:23,642 DEBUG org.apache.hadoop.ipc.SecureClient: RPC Server Kerberos principal name for protocol=org.apache.hadoop.hbase.ipc.HRegionInterface is hbase/[email protected]
{noformat}
Other details:
HBase version: 0.94.0
Hadoop version: 0.23.1
Kerberos version: 1.10.1
Java version: 1.6.0_31, 64 bit
Linux version: SuSE 11.1 [Kernel version: 2.6.32.12-0.7-default x86_64 GNU/Linux]
We have gone through the solutions available at
http://docs.oracle.com/javase/1.5.0/docs/guide/security/jgss/tutorials/Troubleshooting.html
https://ccp.cloudera.com/display/CDHDOC/Appendix+A+-+Troubleshooting#AppendixA-Troubleshooting-Problem2%3AJavaisunabletoreadtheKerberoscredentialscachecreatedbyversionsofMITKerberos1.8.1orhigher.
But none of them seems to work. Any clues?
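For reference, one thing worth double-checking is the client-side JAAS login configuration. Below is a sketch of a typical Krb5LoginModule entry (the entry name, keytab path, and principal are placeholders, not values from this cluster). The second Cloudera link above describes the known incompatibility where Java 6 cannot read the ticket cache written by MIT Kerberos 1.8.1+; running kinit -R after kinit is the workaround documented there, and -Dsun.security.krb5.debug=true on the client JVM shows which cache and credentials the JGSS layer actually sees.

```
/* Hypothetical jaas.conf entry for the HBase client. */
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  renewTGT=true
  doNotPrompt=true
  useKeyTab=true
  keyTab="/etc/security/keytabs/testuser.keytab"
  principal="testuser@EXAMPLE.COM"
  debug=true;
};
```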
> Discretionary access control
> ----------------------------
>
> Key: HBASE-1697
> URL: https://issues.apache.org/jira/browse/HBASE-1697
> Project: HBase
> Issue Type: Improvement
> Components: security
> Reporter: Andrew Purtell
> Assignee: Andrew Purtell
>
> Consider implementing discretionary access control for HBase.
> Access control has three aspects: authentication, authorization and audit.
> - Authentication: Access is controlled by insisting on an authentication
> procedure to establish the identity of the user. The authentication procedure
> should minimally require a non-plaintext authentication factor (e.g.
> encrypted password with salt) and should ideally or at least optionally
> provide cryptographically strong confidence via public key certification.
> - Authorization: Access is controlled by specifying rights to resources via
> an access control list (ACL). An ACL is a list of permissions attached to an
> object. The list specifies who or what is allowed to access the object and
> what operations are allowed to be performed on the object, e.g. create,
> update, read, or delete.
> - Audit: Important actions taken by subjects should be logged for
> accountability, a chronological record which enables the full reconstruction
> and examination of a sequence of events, e.g. schema changes or data
> mutations. Logging activity should be protected from all subjects except for
> a restricted set with administrative privilege, perhaps to only a single
> super-user.
> Discretionary access control means the access policy for an object is
> determined by the owner of the object. Every object in the system must have a
> valid owner. Owners can assign access rights and permissions to other users.
> The initial owner of an object is the subject who created it. If subjects are
> deleted from a system, ownership of objects owned by them should revert to
> some super-user or otherwise valid default.
> HBase can enforce access policy at table, column family, or cell granularity.
> Cell granularity does not make much sense. An implementation which controls
> access at both the table and column family levels is recommended, though a
> first cut could consider control at the table level only. The initial set of
> permissions can be: Create (table schema or column family), update (table
> schema or column family), read (column family), delete (table or column
> family), execute (filters), and transfer ownership. The subject identities
> and access tokens could be stored in a new administrative table. ACLs on
> tables and column families can be stored in META.
> Access other than read access to catalog and administrative tables should be
> restricted to a set of administrative users or perhaps a single super-user. A
> data mutation on a user table by a subject without administrative or
> superuser privilege which results in a table split is an implicit temporary
> privilege elevation where the regionserver or master updates the catalog
> tables as necessary to support the split.
> Audit logging should be configurable on a per-table basis to avoid this
> overhead where it is not wanted.
> Consider supporting external authentication and subject identification
> mechanisms with Java library support: RADIUS/TACACS, Kerberos, LDAP.
> Consider logging audit trails to an HBase table (bigtable type schemas are
> natural for this) and optionally external logging options with Java library
> support -- syslog, etc., or maybe commons-logging is sufficient and punt to
> administrator to set up appropriate commons-logging/log4j configurations for
> their needs.
> If HBASE-1002 is considered, and the option to support filtering via upload
> of (perhaps complex) bytecode produced by some little language compiler is
> implemented, the execute privilege could be extended in a manner similar to
> how stored procedures in SQL land execute either with the privilege of the
> current user or the (table/procedure) creator.
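The ownership and grant rules in the description above can be modeled roughly as follows. This is an illustrative sketch only; the class, the permission enum, and all method names are invented for this comment and are not the eventual HBase access control API. It shows the proposed permission set (create, update, read, delete, execute, transfer ownership), the rule that the creator is the initial owner, and that only the owner may grant rights or transfer ownership.

```java
import java.util.EnumSet;
import java.util.HashMap;
import java.util.Map;

// Hypothetical table-level ACL; the issue suggests persisting such
// entries in META alongside the table and column family descriptors.
public class AclSketch {
    public enum Permission { CREATE, UPDATE, READ, DELETE, EXECUTE, TRANSFER_OWNERSHIP }

    // user -> permissions granted on this one object (table)
    private final Map<String, EnumSet<Permission>> grants =
        new HashMap<String, EnumSet<Permission>>();
    private String owner;

    public AclSketch(String creator) {
        // The initial owner of an object is the subject who created it.
        this.owner = creator;
        grants.put(creator, EnumSet.allOf(Permission.class));
    }

    /** Only the owner can assign access rights to other users. */
    public void grant(String grantor, String grantee, Permission p) {
        if (!grantor.equals(owner)) {
            throw new SecurityException(grantor + " is not the owner");
        }
        EnumSet<Permission> set = grants.get(grantee);
        if (set == null) {
            set = EnumSet.noneOf(Permission.class);
            grants.put(grantee, set);
        }
        set.add(p);
    }

    /** Transfer ownership; the new owner gets full rights. */
    public void transferOwnership(String grantor, String newOwner) {
        if (!grantor.equals(owner)) {
            throw new SecurityException(grantor + " is not the owner");
        }
        owner = newOwner;
        grants.put(newOwner, EnumSet.allOf(Permission.class));
    }

    public boolean isAllowed(String user, Permission p) {
        EnumSet<Permission> set = grants.get(user);
        return set != null && set.contains(p);
    }

    public static void main(String[] args) {
        AclSketch acl = new AclSketch("alice");
        acl.grant("alice", "bob", Permission.READ);
        System.out.println(acl.isAllowed("bob", Permission.READ));   // true
        System.out.println(acl.isAllowed("bob", Permission.DELETE)); // false
    }
}
```

A real enforcement point would additionally need the superuser carve-outs described above (catalog tables, ownership reversion when a subject is deleted), which this sketch deliberately omits.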
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators:
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira