For the benefit of anyone who comes across this error in the future, it was solved 
by adding hive.metastore.sasl.enabled and hive.metastore.kerberos.principal to 
hive-site.xml on the client side, e.g. in $SPARK_HOME/conf.
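
For reference, here is a minimal sketch of the client-side hive-site.xml, 
assuming the client uses the same service principal that the metastore itself 
runs as; the metastore URI/hostname below is a placeholder for illustration and 
was not part of the original thread:

<configuration>
  <property>
    <!-- placeholder host; 9083 is the default metastore port -->
    <name>hive.metastore.uris</name>
    <value>thrift://metastore.example.net:9083</value>
  </property>
  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <!-- placeholder; use the metastore's own service principal -->
    <name>hive.metastore.kerberos.principal</name>
    <value>hive/_HOST@EXAMPLE.NET</value>
  </property>
</configuration>

The client also needs a valid Kerberos ticket (e.g. obtained via kinit) before 
it connects.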


> On 8 Jan 2024, at 16:18, Austin Hackett <hacketta...@me.com> wrote:
> 
> Hi List
>  
> I'm having an issue where Hive Metastore operations (e.g. show databases) are 
> failing with "org.apache.thrift.transport.TTransportException: Invalid status 
> -128" errors when I enable SASL.
>  
> I am a bit stuck on how to go about troubleshooting this further, and any 
> pointers would be greatly appreciated...
>  
> Full details as follows:
>  
> - Ubuntu 22.04 & OpenJDK 8u342
> - Unpacked Hive 3.1.3 binary release 
> (https://dlcdn.apache.org/hive/hive-3.1.3/apache-hive-3.1.3-bin.tar.gz) to 
> /opt/hive
> - Unpacked Hadoop 3.1.0 binary release 
> (https://archive.apache.org/dist/hadoop/common/hadoop-3.1.0/hadoop-3.1.0.tar.gz)
>  to /opt/hadoop
> - Created /opt/hive/conf/metastore-site.xml (see below for contents) and 
> copied hdfs-site.xml and core-site.xml from the target HDFS cluster to 
> /opt/hive/conf
> - export HADOOP_HOME=/opt/hadoop
> - export HIVE_HOME=/opt/hive
> - Successfully started the metastore, i.e. hive --service metastore
> - Use a Hive Metastore client to "show databases" and get an error (see below 
> for the associated errors in the HMS log). I get the same error with 
> spark-shell running in local mode and the Python hive-metastore-client 
> (https://pypi.org/project/hive-metastore-client/)
>  
>  
> metastore-site.xml
> ==================
> <configuration>
>   <property>
>     <name>metastore.warehouse.dir</name>
>     <value>/user/hive/warehouse</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionDriverName</name>
>     <value>org.postgresql.Driver</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionURL</name>
>     <value>jdbc:postgresql://postgres.example.net:5432/metastore_db</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionUserName</name>
>     <value>hive</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionPassword</name>
>     <value>password</value>
>   </property>
>   <property>
>     <name>metastore.kerberos.principal</name>
>     <value>hive/_h...@example.net</value>
>   </property>
>   <property>
>     <name>metastore.kerberos.keytab.file</name>
>     <value>/etc/security/keytabs/hive.keytab</value>
>   </property>
>   <property>
>     <name>hive.metastore.sasl.enabled</name>
>     <value>true</value>
>   </property>
> </configuration>
> ==================
>  
> HMS log shows that it is able to authenticate using the specified keytab and 
> principal (and I have also checked this manually via the kinit command):
>  
> ====
> 2024-01-08T13:12:33,463  WARN [main] security.HadoopThriftAuthBridge: 
> Client-facing principal not set. Using server-side setting: 
> hive/_h...@example.net
> 2024-01-08T13:12:33,464  INFO [main] security.HadoopThriftAuthBridge: Logging 
> in via CLIENT based principal
> 2024-01-08T13:12:33,471 DEBUG [main] security.UserGroupInformation: Hadoop 
> login
> 2024-01-08T13:12:33,472 DEBUG [main] security.UserGroupInformation: hadoop 
> login commit
> 2024-01-08T13:12:33,472 DEBUG [main] security.UserGroupInformation: Using 
> kerberos user: hive/metstore.example....@example.net
> 2024-01-08T13:12:33,472 DEBUG [main] security.UserGroupInformation: Using 
> user: "hive/metstore.example....@example.net" with name: 
> hive/metstore.example....@example.net
> 2024-01-08T13:12:33,472 DEBUG [main] security.UserGroupInformation: User 
> entry: "hive/metstore.example....@example.net"
> 2024-01-08T13:12:33,472  INFO [main] security.UserGroupInformation: Login 
> successful for user hive/metstore.example....@example.net using keytab file 
> hive.keytab. Keytab auto renewal enabled : false
> 2024-01-08T13:12:33,472  INFO [main] security.HadoopThriftAuthBridge: Logging 
> in via SERVER based principal
> 2024-01-08T13:12:33,480 DEBUG [main] security.UserGroupInformation: Hadoop 
> login
> 2024-01-08T13:12:33,480 DEBUG [main] security.UserGroupInformation: hadoop 
> login commit
> 2024-01-08T13:12:33,480 DEBUG [main] security.UserGroupInformation: Using 
> kerberos user: hive/metstore.example....@example.net
> 2024-01-08T13:12:33,480 DEBUG [main] security.UserGroupInformation: Using 
> user: "hive/metstore.example....@example.net" with name: 
> hive/metstore.example....@example.net
> 2024-01-08T13:12:33,480 DEBUG [main] security.UserGroupInformation: User 
> entry: "hive/metstore.example....@example.net"
> 2024-01-08T13:12:33,480  INFO [main] security.UserGroupInformation: Login 
> successful for user hive/metstore.example....@example.net using keytab file 
> hive.keytab. Keytab auto renewal enabled : false
> ====
>  
> However, when I attempt to "show databases":
>  
> ====
> 2024-01-08T13:59:08,068 DEBUG [pool-6-thread-1] 
> security.UserGroupInformation: PrivilegedAction [as: 
> hive/metstore.example....@example.net 
> (auth:KERBEROS)][action:org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1@1e655c9]
> java.lang.Exception: null
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1873)
>  [hadoop-common-3.3.6.jar:?]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:691)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  [?:?]
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  [?:?]
>         at java.lang.Thread.run(Thread.java:829) [?:?]
> 2024-01-08T13:59:08,072 DEBUG [pool-6-thread-1] 
> transport.TSaslServerTransport: transport map does not contain key
> 2024-01-08T13:59:08,074 DEBUG [pool-6-thread-1] transport.TSaslTransport: 
> opening transport org.apache.thrift.transport.TSaslServerTransport@513e2d00
> 2024-01-08T13:59:08,215 DEBUG [pool-6-thread-1] transport.TSaslTransport: 
> SERVER: Writing message with status ERROR and payload length 19
> 2024-01-08T13:59:08,215 DEBUG [pool-6-thread-1] 
> transport.TSaslServerTransport: failed to open server transport
> org.apache.thrift.transport.TTransportException: Invalid status -128
>         at 
> org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) 
> ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:694)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:691)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
>         at javax.security.auth.Subject.doAs(Subject.java:361) [?:?]
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876)
>  [hadoop-common-3.3.6.jar:?]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:691)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  [?:?]
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  [?:?]
>         at java.lang.Thread.run(Thread.java:829) [?:?]
> 2024-01-08T13:59:08,216 ERROR [pool-6-thread-1] server.TThreadPoolServer: 
> Error occurred during processing of message.
> java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: 
> Invalid status -128
>         at 
> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:694)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:691)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at java.security.AccessController.doPrivileged(Native Method) ~[?:?]
>         at javax.security.auth.Subject.doAs(Subject.java:361) ~[?:?]
>         at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876)
>  ~[hadoop-common-3.3.6.jar:?]
>         at 
> org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:691)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
>  [hive-exec-3.1.3.jar:3.1.3]
>         at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  [?:?]
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  [?:?]
>         at java.lang.Thread.run(Thread.java:829) [?:?]
> Caused by: org.apache.thrift.transport.TTransportException: Invalid status 
> -128
>         at 
> org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) 
> ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         at 
> org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
>  ~[hive-exec-3.1.3.jar:3.1.3]
>         ... 10 more
> ====
