Attached is the hive-site.xml configuration file.

________________________________
From: Vivek Shrivastava <vivshrivast...@gmail.com>
Sent: Monday, January 30, 2017 4:10:42 PM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

If this is working, then your Kerberos setup is OK. I suspect the problem is in the 
HiveServer2 configuration. What is the authentication and security setup in the Hive 
config? Please see if you can attach it.
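
For reference, a quick shell-level way to pull the relevant properties out of 
hive-site.xml (the conf path is an assumption; adjust it to your install):

grep -B1 -A2 -E 'hive.server2.authentication|kerberos.(principal|keytab)' /etc/hive/conf/hive-site.xml

For Kerberos, hive.server2.authentication should be KERBEROS, and the 
kerberos.principal / kerberos.keytab properties should point at a valid hive service 
principal and a keytab readable by the HiveServer2 process.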

On Mon, Jan 30, 2017 at 2:33 PM, Ricardo Fajardo 
<ricardo.faja...@autodesk.com> wrote:

[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ hadoop fs -ls
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Found 20 items
drwxr-xr-x   - cloudera cloudera          0 2016-06-13 17:51 checkpoint
-rw-r--r--   1 cloudera cloudera       3249 2016-05-11 16:19 hadoop.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:15 hadoop2.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-02 16:30 hadoop3.txt
drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:37 gives
drwxr-xr-x   - cloudera cloudera          0 2016-06-16 16:06 out1
-rw-r--r--   1 cloudera cloudera       3868 2016-06-15 08:39 post.small0.xml
drwxr-xr-x   - cloudera cloudera          0 2016-07-14 17:01 tCount1
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 15:57 test1
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:57 test10
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 17:33 test12
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:02 test2
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:24 test3
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:27 test4
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:32 test5
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:37 test6
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:49 test7
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:51 test8
drwxr-xr-x   - cloudera cloudera          0 2016-06-21 16:54 test9
-rw-r--r--   1 cloudera cloudera    8481022 2016-06-08 21:51 train.tsv
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ echo $HADOOP_OPTS
-Dsun.security.krb5.debug=true
[cloudera@quickstart bin]$


________________________________
From: Vivek Shrivastava 
<vivshrivast...@gmail.com>
Sent: Monday, January 30, 2017 2:28:53 PM

To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

If you are using AES256, then please update the Java unlimited strength (JCE) policy 
jar files. What is the output of the hadoop ls command after exporting the environment 
variable below?

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /
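
If you are on an Oracle JDK, a rough way to check whether the unlimited strength 
policy is actually installed (paths assume the usual jre/lib/security layout):

ls -l $JAVA_HOME/jre/lib/security/*policy.jar
unzip -p $JAVA_HOME/jre/lib/security/local_policy.jar default_local.policy
# "CryptoAllPermission" in that output means unlimited strength is installed;
# otherwise AES is capped at 128 bits and aes256-cts tickets will fail.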

On Mon, Jan 30, 2017 at 2:21 PM, Ricardo Fajardo 
<ricardo.faja...@autodesk.com> wrote:

I did the changes but I am getting the same error.

Klist:

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fa...@ads.autodesk.com

Valid starting     Expires            Service principal
01/30/17 11:56:20  01/30/17 21:56:24  
krbtgt/ads.autodesk....@ads.autodesk.com
renew until 01/31/17 11:56:20, Flags: FPRIA
Etype (skey, tkt): aes256-cts-hmac-sha1-96, arcfour-hmac


Log:

[cloudera@quickstart bin]$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$
[cloudera@quickstart bin]$ ./beeline -u 
"jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar"
/home/cloudera/workspace/hive/bin/hive: line 99: [: 
/home/cloudera/workspace/hive/lib/hive-exec-2.2.0-SNAPSHOT-core.jar: binary 
operator expected
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/home/cloudera/workspace/hive/lib/benchmarks.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/home/cloudera/workspace/hive/lib/hive-jdbc-2.2.0-SNAPSHOT-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/home/cloudera/workspace/hive/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/home/cloudera/workspace/hive/lib/spark-examples-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Connecting to 
jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
17/01/30 12:08:59 [main]: ERROR transport.TSaslTransport: SASL negotiation 
failure
javax.security.sasl.SaslException: GSS initiate failed
at 
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
 ~[?:1.8.0_73]
at 
org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
 ~[benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) 
[benchmarks.jar:2.2.0-SNAPSHOT]
at 
org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 [benchmarks.jar:2.2.0-SNAPSHOT]
at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
 [benchmarks.jar:2.2.0-SNAPSHOT]
at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
 [benchmarks.jar:2.2.0-SNAPSHOT]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_73]
at 
javax.security.auth.Subject.doAs(Subject.java:422)
 [?:1.8.0_73]
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
 [benchmarks.jar:2.2.0-SNAPSHOT]
at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
 [benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) 
[hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) 
[hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) 
[hive-jdbc-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_73]
at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_73]
at 
org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at 
org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
 [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_73]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_73]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
at 
org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
 [hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:797) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:885) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) 
[hive-beeline-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_73]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_73]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_73]
at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_73]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) 
[benchmarks.jar:2.2.0-SNAPSHOT]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 
[benchmarks.jar:2.2.0-SNAPSHOT]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism 
level: Failed to find any Kerberos tgt)
at 
sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
 ~[?:1.8.0_73]
at 
sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
 ~[?:1.8.0_73]
at 
sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
 ~[?:1.8.0_73]
at 
sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) 
~[?:1.8.0_73]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) 
~[?:1.8.0_73]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) 
~[?:1.8.0_73]
at 
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
 ~[?:1.8.0_73]
... 35 more
17/01/30 12:08:59 [main]: WARN jdbc.HiveConnection: Failed to connect to 
localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar:
 GSS initiate failed (state=08S01,code=0)
Beeline version 2.2.0-SNAPSHOT by Apache Hive
beeline>



________________________________
From: Vivek Shrivastava 
<vivshrivast...@gmail.com>
Sent: Monday, January 30, 2017 11:34:27 AM

To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

You can comment both default_tkt_enctypes and default_tgs_enctypes out; the default 
value will then become aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 
arcfour-hmac-md5 camellia256-cts-cmac camellia128-cts-cmac des-cbc-crc des-cbc-md5 
des-cbc-md4 (a sketch of commenting them out with sed follows the steps below).
Then do
kdestroy
kinit
klist -fev
your beeline command
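
To comment those two lines out, a minimal sketch (assumes GNU sed and the default 
/etc/krb5.conf; a .bak backup is written first):

sudo sed -i.bak -e 's/^\(\s*default_t[kg][ts]_enctypes\)/#\1/' /etc/krb5.conf
grep enctypes /etc/krb5.conf   # both lines should now start with '#'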

If it still does not work, then paste the output of:

export HADOOP_OPTS="-Dsun.security.krb5.debug=true"
hadoop fs -ls /



On Mon, Jan 30, 2017 at 11:11 AM, Ricardo Fajardo 
<ricardo.faja...@autodesk.com> wrote:

I don't have any particular reason for selecting the arcfour encryption type. If 
changing it will make this work, I can do that.

Values from krb5.conf:

[libdefaults]
        default_realm = ADS.AUTODESK.COM
        krb4_config = /etc/krb.conf
        krb4_realms = /etc/krb.realms
        kdc_timesync = 1
        ccache_type = 4
        forwardable = true
        proxiable = true
        v4_instance_resolve = false
        v4_name_convert = {
                host = {
                        rcmd = host
                        ftp = ftp
                }
                plain = {
                        something = something-else
                }
        }
        fcc-mit-ticketflags = true
        default_tkt_enctypes = rc4-hmac des-cbc-crc des-cbc-md5 aes256-cts
        default_tgs_enctypes = rc4-hmac des-cbc-crc des-cbc-md5 aes256-cts

[realms]

        ADS.AUTODESK.COM = {
                kdc = krb.ads.autodesk.com:88
                admin_server = krb.ads.autodesk.com
                default_domain = ads.autodesk.com
                database_module = openldap_ldapconf
                master_key_type = aes256-cts
                supported_enctypes = aes256-cts:normal aes128-cts:normal 
des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal 
des-cbc-md5:normal des-cbc-crc:normal
                default_principal_flags = +preauth
        }

Thanks so much for your help,
Richard.
________________________________
From: Vivek Shrivastava 
<vivshrivast...@gmail.com>
Sent: Monday, January 30, 2017 11:01:24 AM

To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Any particular reason for selecting the arcfour encryption type? Could you please 
post the default values (e.g. the enctype settings) from your krb5.conf.
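
For example, a quick way to print just that section (path assumption: /etc/krb5.conf):

awk '/^\[libdefaults\]/{f=1;next} /^\[/{f=0} f' /etc/krb5.conf
# prints the [libdefaults] body and stops at the next [section] header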

On Mon, Jan 30, 2017 at 10:57 AM, Ricardo Fajardo 
<ricardo.faja...@autodesk.com> wrote:

1. klist -fe

[cloudera@quickstart bin]$ klist -fe
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: t_fa...@ads.autodesk.com

Valid starting     Expires            Service principal
01/30/17 10:52:37  01/30/17 20:52:43  
krbtgt/ads.autodesk....@ads.autodesk.com
renew until 01/31/17 10:52:37, Flags: FPRIA
Etype (skey, tkt): arcfour-hmac, arcfour-hmac
[cloudera@quickstart bin]$

2. relevant entries from HiveServer2 log


beeline> !connect 
jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar
!connect 
jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;hive.server2.proxy.user=t_fajar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/home/cloudera/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/home/cloudera/.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to 
jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar
17/01/27 16:16:36 INFO Utils: Supplied authorities: localhost:10000
17/01/27 16:16:36 INFO Utils: Resolved authority: localhost:10000
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field 
org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with 
annotation 
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
 value=[Rate of successful kerberos logins and latency (milliseconds)], about=, 
type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field 
org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with 
annotation 
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
 value=[Rate of failed kerberos logins and latency (milliseconds)], about=, 
type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MutableMetricsFactory: field 
org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with 
annotation 
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time,
 value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
17/01/27 16:16:36 DEBUG MetricsSystemImpl: UgiMetrics, User and group related 
metrics
17/01/27 16:16:37 DEBUG Shell: setsid exited with exit code 0
17/01/27 16:16:37 DEBUG Groups:  Creating new Groups object
17/01/27 16:16:37 DEBUG NativeCodeLoader: Trying to load the custom-built 
native-hadoop library...
17/01/27 16:16:37 DEBUG NativeCodeLoader: Failed to load native-hadoop with 
error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/01/27 16:16:37 DEBUG NativeCodeLoader: 
java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/01/27 16:16:37 WARN NativeCodeLoader: Unable to load native-hadoop library 
for your platform... using builtin-java classes where applicable
17/01/27 16:16:37 DEBUG PerformanceAdvisory: Falling back to shell based
17/01/27 16:16:37 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping 
impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
17/01/27 16:16:38 DEBUG Groups: Group mapping 
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; 
cacheTimeout=300000; warningDeltaMs=5000
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login
17/01/27 16:16:38 DEBUG UserGroupInformation: hadoop login commit
17/01/27 16:16:38 DEBUG UserGroupInformation: using local user:UnixPrincipal: 
cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: Using user: "UnixPrincipal: 
cloudera" with name cloudera
17/01/27 16:16:38 DEBUG UserGroupInformation: User entry: "cloudera"
17/01/27 16:16:56 DEBUG UserGroupInformation: UGI loginUser:cloudera 
(auth:SIMPLE)
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Current authMethod = SIMPLE
17/01/27 16:16:56 DEBUG HadoopThriftAuthBridge: Setting UGI conf as passed-in 
authMethod of kerberos != current.
17/01/30 10:24:45 DEBUG UserGroupInformation: PrivilegedAction as:cloudera 
(auth:SIMPLE) 
from:org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Client.createClientTransport(HadoopThriftAuthBridge.java:208)
17/01/30 10:55:02 DEBUG UserGroupInformation: PrivilegedAction as:cloudera 
(auth:SIMPLE) 
from:org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
17/01/30 10:55:02 DEBUG TSaslTransport: opening transport 
org.apache.thrift.transport.TSaslClientTransport@1119f7c5
17/01/30 10:55:02 ERROR TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
at 
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
 ~[?:1.7.0_67]
at 
org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
 ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) 
[libthrift-0.9.3.jar:0.9.3]
at 
org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
 [libthrift-0.9.3.jar:0.9.3]
at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
 [classes/:?]
at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:1)
 [classes/:?]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_67]
at 
javax.security.auth.Subject.doAs(Subject.java:415)
 [?:1.7.0_67]
at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
 [hadoop-common-2.7.2.jar:?]
at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
 [classes/:?]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:227) 
[classes/:?]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:182) 
[classes/:?]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [classes/:?]
at java.sql.DriverManager.getConnection(DriverManager.java:571) [?:1.7.0_67]
at java.sql.DriverManager.getConnection(DriverManager.java:187) [?:1.7.0_67]
at 
org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) 
[classes/:?]
at 
org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
 [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1524) [classes/:?]
at org.apache.hive.beeline.Commands.connect(Commands.java:1419) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_67]
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
~[?:1.7.0_67]
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.7.0_67]
at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_67]
at 
org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
 [classes/:?]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1127) 
[classes/:?]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1166) [classes/:?]
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:999) [classes/:?]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:909) [classes/:?]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:511) 
[classes/:?]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:494) [classes/:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism 
level: Failed to find any Kerberos tgt)
at 
sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
 ~[?:1.7.0_67]
at 
sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
 ~[?:1.7.0_67]
at 
sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
 ~[?:1.7.0_67]
at 
sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223) 
~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) 
~[?:1.7.0_67]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) 
~[?:1.7.0_67]
at 
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
 ~[?:1.7.0_67]
... 29 more
17/01/30 10:55:02 DEBUG TSaslTransport: CLIENT: Writing message with status BAD 
and payload length 19
17/01/30 10:55:02 WARN HiveConnection: Failed to connect to localhost:10000
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: 
jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar:
 GSS initiate failed (state=08S01,code=0)
beeline>


________________________________
From: Vivek Shrivastava 
<vivshrivast...@gmail.com>
Sent: Monday, January 30, 2017 10:48:35 AM
To: user@hive.apache.org
Subject: Re: Pls Help me - Hive Kerberos Issue

Please paste the output of
1. klist -fe
2. relevant entries from HiveServer2 log

On Mon, Jan 30, 2017 at 10:11 AM, Ricardo Fajardo 
<ricardo.faja...@autodesk.com> wrote:

I could not resolve the problem.


I have debugged the code, and I found the following:


1. In the org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge class, at line 208, 
the code calls UserGroupInformation.getCurrentUser(). This method always returns the 
operating-system user, but I need to authenticate the user set in the property 
hive.server2.proxy.user=yourid, because I have a token for that one.


2. I have found that hive.server2.proxy.user is handled in the openSession() method 
of the org.apache.hive.jdbc.HiveConnection class, but this code is never executed.


3. In the org.apache.hive.service.auth.HiveAuthFactory class, the method 
getAuthTransFactory() contains this code:

....

      if (authTypeStr.equalsIgnoreCase(AuthTypes.KERBEROS.getAuthName())) {
        // no-op
....


Does this mean that Kerberos authentication is not implemented?



Can anyone please help me?


Thanks,

Richard.

________________________________
From: Dulam, Naresh 
<naresh.du...@bankofamerica.com>
Sent: Thursday, January 26, 2017 8:41:48 AM
To: user@hive.apache.org
Subject: RE: Pls Help me - Hive Kerberos Issue


kinit -k -t your.keytab you...@my-realm.com

# Connect using the following JDBC connection string
# 
jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_h...@my-realm.com;hive.server2.proxy.user=yourid
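
Put together, a hedged end-to-end sketch (realm, host, keytab, and user id are 
placeholders, not real values):

kinit -k -t your.keytab you...@my-realm.com
beeline -u "jdbc:hive2://myHost.myOrg.com:10000/default;principal=hive/_HOST@MY-REALM.COM;hive.server2.proxy.user=yourid"

Note that the principal in the URL is the HiveServer2 service principal, not your own 
user; _HOST is typically substituted with the server's hostname.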






From: Ricardo Fajardo 
[mailto:ricardo.faja...@autodesk.com]
Sent: Thursday, January 26, 2017 1:37 AM
To: user@hive.apache.org
Subject: Pls Help me - Hive Kerberos Issue

Hello,



Please, I need your help with Kerberos authentication in Hive.



I am following this guide:

https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cdh_sg_hiveserver2_security.html#topic_9_1_1

But I am getting this error:

Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism 
level: Failed to find any Kerberos tgt)


I have a remote Kerberos server, and I can obtain a ticket with kinit for my user. I 
created a keytab file with my password for my user. Please tell me if that is correct.

On the other hand, when I am debugging the Hive code, the operating-system user is 
authenticated, but I need to authenticate my Kerberos user. Can you tell me how I can 
achieve that? How can I store my tickets where Hive can load them? Or how can I verify 
where Hive is searching for the tickets, and what Hive is actually reading?
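
For reference, a few shell-level checks that show which ticket cache the tools will 
pick up (assumes MIT Kerberos defaults; the service principal below is a placeholder):

echo ${KRB5CCNAME:-unset}   # if unset, the default FILE:/tmp/krb5cc_$(id -u) applies
klist                       # shows the cache that was actually found and its tickets
KRB5_TRACE=/dev/stderr kvno hive/your.hs2.host@YOUR-REALM   # traces the library's lookup (MIT krb5 1.9+)

The JVM side of the same question is what -Dsun.security.krb5.debug=true prints.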

Thanks so much for your help.

Best regards,
Richard.


________________________________





HiveMetaStore:

from /hive/metastore

mvn datanucleus:enhance
mvn datanucleus:enhance-check

Connect to MySQL:

mysql -u hive2 -p
>use metastore3
>show tables;
>select * from VERSION;

Eclipse:

HiveMetaStore (1)

VM Arguments:

-Dhive.log.dir=/home/cloudera/workspace/hive/target/log 
-Dhive.log.file=hive-metastore.log -Dhive.log.threshold=DEBUG 
-Dhadoop.root.logger=DEBUG,console

Classpath:

HIVE_HADOOP_CLASSPATH - /usr/lib/hive/lib (add all the *.jar files)


Client: Beeline


########################################################################################################################

2016-09-14 08:21:30,063 WARN  [main] DataNucleus.Query 
(Log4JLogger.java:warn(106)) - Query for candidates of 
org.apache.hadoop.hive.metastore.model.MVersionTable and subclasses resulted in 
no possible candidates
Persistent class "org.apache.hadoop.hive.metastore.model.MVersionTable" has no 
table in the database, but the operation requires it. Please check the 
specification of the MetaData for this class.
org.datanucleus.store.rdbms.exceptions.NoTableManagedException: Persistent 
class "org.apache.hadoop.hive.metastore.model.MVersionTable" has no table in 
the database, but the operation requires it. Please check the specification of 
the MetaData for this class.
        at 
org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:694)
        at 
org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:425)
        at 
org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:864)
        at 
org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:346)
        at org.datanucleus.store.query.Query.executeQuery(Query.java:1805)
        at org.datanucleus.store.query.Query.executeWithArray(Query.java:1733)
        at org.datanucleus.store.query.Query.execute(Query.java:1715)
        at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:371)
        at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:213)
        at 
org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7836)
        at 
org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(Obje
        
        
        
        
mysql> desc  VERSION;
+-----------------+--------------+------+-----+---------+-------+
| Field           | Type         | Null | Key | Default | Extra |
+-----------------+--------------+------+-----+---------+-------+
| VER_ID          | bigint(20)   | NO   | PRI | NULL    |       |
| SCHEMA_VERSION  | varchar(127) | NO   |     | NULL    |       |
| VERSION_COMMENT | varchar(255) | YES  |     | NULL    |       |
+-----------------+--------------+------+-----+---------+-------+
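
The NoTableManagedException above usually means the metastore schema was never 
created in that database. A hedged way to check or initialize it with the schematool 
that ships with Hive (dbType assumption: mysql):

$HIVE_HOME/bin/schematool -dbType mysql -info        # shows the schema version Hive sees
$HIVE_HOME/bin/schematool -dbType mysql -initSchema  # only run against an empty database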


CREATE TABLE IF NOT EXISTS `BUCKETING_COLS` (
  `SD_ID` bigint(20) NOT NULL,
  `BUCKET_COL_NAME` varchar(256) CHARACTER SET latin1 COLLATE latin1_bin 
DEFAULT NULL,
  `INTEGER_IDX` int(11) NOT NULL,
  PRIMARY KEY (`SD_ID`,`INTEGER_IDX`),
  KEY `BUCKETING_COLS_N49` (`SD_ID`),
  CONSTRAINT `BUCKETING_COLS_FK1` FOREIGN KEY (`SD_ID`) REFERENCES `SDS` 
(`SD_ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;





-Dhive.log.dir=/home/cloudera/workspace/hive/target/log 
-Dhive.log.file=hive-server2.log -Dhive.log.threshold=DEBUG 
-Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log 
-Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= 
-Dhadoop.root.logger=DEBUG,console 
-Djava.library.path=/usr/lib/hadoop/lib/native 
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true 
-Dhadoop.security.logger=DEBUG,NullAppender

-Dhive.log.dir=/home/cloudera/workspace/hive/target/log 
-Dhive.log.file=hive-metastore.log -Dhive.log.threshold=DEBUG 
-Dhadoop.root.logger=DEBUG,console

-Dhive.log.dir=/home/cloudera/workspace/hive/target/log 
-Dhive.log.file=hive-metastore.log -Dhive.log.threshold=DEBUG 
-Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log 
-Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= 
-Dhadoop.root.logger=DEBUG,console 
-Djava.library.path=/usr/lib/hadoop/lib/native 
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true 
-Dhadoop.security.logger=DEBUG,NullAppender


spark     4241     1  0 Sep14 ?        00:01:06 
/usr/java/jdk1.7.0_67-cloudera/bin/java -cp 
/etc/spark/conf/:/usr/lib/spark/lib/spark-assembly-1.5.0-cdh5.5.0-hadoop2.6.0-cdh5.5.0.jar:/etc/hadoop/conf/:/usr/lib/spark/lib/spark-assembly.jar:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/*:/usr/lib/hadoop-mapreduce/lib/*:/usr/lib/hadoop-mapreduce/*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/*:/usr/lib/hive/lib/*:/usr/lib/flume-ng/lib/*:/usr/lib/paquet/lib/*:/usr/lib/avro/lib/*
 -Dspark.history.fs.logDirectory=hdfs:///user/spark/applicationHistory 
-Dspark.history.ui.port=18088 -Xms1g -Xmx1g -XX:MaxPermSize=256m 
org.apache.spark.deploy.history.HistoryServer
root     12254     1  0 Sep14 pts/0    00:00:04 gedit 
/home/cloudera/workspace/hive/metastore/scripts/upgrade/mysql/hive-schema-0.10.0.mysql.sql

hive     25625     1  0 07:46 ?        00:00:10 
/usr/java/jdk1.7.0_67-cloudera/bin/java -Xmx256m -Dhive.log.dir=/var/log/hive 
-Dhive.log.file=hive-metastore.log -Dhive.log.threshold=DEBUG -Xdebug 
-Xrunjdwp:transport=dt_socket,address=58001,server=y,suspend=n 
-Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log 
-Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= 
-Dhadoop.root.logger=DEBUG,console 
-Djava.library.path=/usr/lib/hadoop/lib/native 
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true 
-Dhadoop.security.logger=DEBUG,NullAppender org.apache.hadoop.util.RunJar 
/usr/lib/hive/lib/hive-service-1.1.0-cdh5.5.0.jar 
org.apache.hadoop.hive.metastore.HiveMetaStore

hive     26603     1  1 08:18 ?        00:00:06 
/usr/java/jdk1.7.0_67-cloudera/bin/java -Xmx256m -Dhive.log.dir=/var/log/hive 
-Dhive.log.file=hive-server2.log -Dhive.log.threshold=INFO 
-Dhadoop.log.dir=/usr/lib/hadoop/logs -Dhadoop.log.file=hadoop.log 
-Dhadoop.home.dir=/usr/lib/hadoop -Dhadoop.id.str= 
-Dhadoop.root.logger=INFO,console 
-Djava.library.path=/usr/lib/hadoop/lib/native 
-Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true 
-Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar 
/usr/lib/hive/lib/hive-service-1.1.0-cdh5.5.0.jar 
org.apache.hive.service.server.HiveServer2



Allows creating a configured SSL-wrapped TServerSocket bound to the specified port.
And maybe in the declaration of the InetSocketAddress variable.


beeline -u 
"jdbc:hive2://localhost:10000/default;principal=krbtgt/ads.autodesk....@ads.autodesk.com;hive.server2.proxy.user=t_fajar"

Following is an example of the keytab file creation process using MIT Kerberos:

  > ktutil
  ktutil:  addent -password -p usern...@ads.iu.edu -k 1 -e rc4-hmac
  Password for usern...@ads.iu.edu: [enter your password]
  ktutil:  addent -password -p usern...@ads.iu.edu -k 1 -e aes256-cts
  Password for usern...@ads.iu.edu: [enter your password]
  ktutil:  wkt username.keytab
  ktutil:  quit

klist -k /home/cloudera/username.keytab
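
A hedged follow-up to verify the keytab actually works before pointing anything at it 
(reusing the placeholder principal from the example above):

kinit -k -t /home/cloudera/username.keytab usern...@ads.iu.edu
klist -fe    # should now show a fresh TGT obtained from the keytab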


#####################

To run Beeline locally, copy all the *.jar files into the HIVE_HOME/lib folder.


hive/quickstart.cloud...@ads.autodesk.com

-u 
"jdbc:hive2://localhost:10000/default;principal=hive/_h...@ads.autodesk.com;hive.server2.proxy.user=t_fajar"

jdbc:hive2://fastaccess.api.autodesk.com:10008/default;ssl=true;sslTrustStore=/home/cloudera/AdskCA-truststore.jks
hs2SvcUser: svc_p_adpcompute
hs2SvcUserPassword: eY8#xS8@

Props = {javax.security.sasl.server.authentication=true, 
javax.security.sasl.qop=auth-conf,auth-int,auth}
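
If HiveServer2 enforces a SASL QOP (the hive.server2.thrift.sasl.qop property), the 
JDBC client has to request a matching level; a hedged example using the saslQop URL 
parameter (values: auth, auth-int, auth-conf):

beeline -u "jdbc:hive2://localhost:10000/default;principal=hive/_HOST@ADS.AUTODESK.COM;saslQop=auth-conf"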

Reply via email to