Shaik,
  I think this document might help.[1]  Essentially, you need to add
credentials to your workflow and then specify that the Hive action should
use those credentials.  Their example shows a Pig action, but a Hive one
should be similar.

[1] - https://oozie.apache.org/docs/4.0.0/DG_UnifiedCredentialsModule.html
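As a rough sketch of what that doc describes, a workflow with a HiveServer2
credential might look like the following. Note this is only an illustration:
the action name, script name, and host/principal values are assumptions based
on the JDBC URL in your log, and the hive2 credential/action types require an
Oozie build that ships them, so adjust everything to your cluster:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.4" name="hive2-cred-example">

  <!-- Declare the credential once; Oozie obtains the delegation
       token before launching any action that references it. -->
  <credentials>
    <credential name="hs2_creds" type="hive2">
      <property>
        <name>hive2.jdbc.url</name>
        <value>jdbc:hive2://node1/power_analytics</value>
      </property>
      <property>
        <name>hive2.server.principal</name>
        <value>hive/[email protected]</value>
      </property>
    </credential>
  </credentials>

  <start to="hive-query"/>

  <!-- The cred attribute ties this action to the credential above. -->
  <action name="hive-query" cred="hs2_creds">
    <hive2 xmlns="uri:oozie:hive2-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <jdbc-url>jdbc:hive2://node1/power_analytics</jdbc-url>
      <script>my_query.hql</script>
    </hive2>
    <ok to="end"/>
    <error to="fail"/>
  </action>

  <kill name="fail">
    <message>Hive2 action failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```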

On Mon, Apr 20, 2015 at 10:31 PM, Shaik M <[email protected]> wrote:

> Hi,
>
> I have recently enabled Hadoop security; we are mostly running Shell
> Actions. All Hive-related tasks are failing. Please let me know how to get
> the delegation token in an Oozie ssh action.
>
> I am getting the following error after the hive2 URL modification:
>
> 2015-04-20 17:17:13,949  INFO (org.apache.hive.jdbc.HiveConnection:189)
> [main] - Will try to open client transport with JDBC Uri:
> jdbc:hive2://node1/power_analytics;principal=hive/[email protected]
>
> 2015-04-20 17:17:13,959 ERROR
> (org.apache.thrift.transport.TSaslTransport:296) [main] - SASL negotiation
> failure
>
> javax.security.sasl.SaslException: GSS initiate failed [Caused by
> GSSException: No valid credentials provided (Mechanism level: Failed to
> find any Kerberos tgt)]
>         at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
>         at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
>         at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
>         at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
>         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
>         at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
>
>
> Regards,
>
> Shaik
>
