Hi,

The best way is to split up the shell script and use FS actions
<https://oozie.apache.org/docs/4.2.0/WorkflowFunctionalSpec.html#a3.2.4_Fs_HDFS_action>,
Hive actions
<https://oozie.apache.org/docs/4.2.0/DG_HiveActionExtension.html>, and
other specific actions in the workflow.
This way you can define the credentials
<https://oozie.apache.org/docs/4.2.0/DG_ActionAuthentication.html> and let
Oozie handle the authentication for you.
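
For illustration, here is a minimal workflow sketch along those lines. The
credential type, metastore URI, principal, and paths are placeholders for
your cluster's values (the hcat credential type covers the Hive metastore,
as described in the action authentication docs linked above):

    <workflow-app xmlns="uri:oozie:workflow:0.5" name="split-shell-wf">
      <credentials>
        <!-- Illustrative Hive metastore credential; URI and principal
             are placeholders for your cluster's values -->
        <credential name="hive_creds" type="hcat">
          <property>
            <name>hcat.metastore.uri</name>
            <value>thrift://metastore-host:9083</value>
          </property>
          <property>
            <name>hcat.metastore.principal</name>
            <value>hive/_HOST@EXAMPLE.COM</value>
          </property>
        </credential>
      </credentials>
      <start to="move-input"/>
      <!-- Fs action: Oozie talks to HDFS itself, so no kinit or
           keytab handling is needed in the workflow -->
      <action name="move-input">
        <fs>
          <move source="${nameNode}/user/abc/d.txt"
                target="${nameNode}/user/abc/input/d.txt"/>
        </fs>
        <ok to="run-query"/>
        <error to="fail"/>
      </action>
      <!-- Hive action: references the credential defined above -->
      <action name="run-query" cred="hive_creds">
        <hive xmlns="uri:oozie:hive-action:0.2">
          <job-tracker>${jobTracker}</job-tracker>
          <name-node>${nameNode}</name-node>
          <script>query.q</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
      </action>
      <kill name="fail">
        <message>Action failed</message>
      </kill>
      <end name="end"/>
    </workflow-app>

The Fs action runs with Oozie's own HDFS credentials, so no keytab handling
is needed there; only the Hive action has to reference the credential.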
If you want to do it all in a shell script, you will have to make sure the
keytab is accessible on every machine in the cluster and handle
authentication yourself from within the script.
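
For example, the shell script could do something like this (a minimal
sketch; the keytab name, principal, and paths are illustrative, and the
keytab has to be made available to the action, e.g. via the shell action's
<file> element):

    #!/bin/sh
    # Obtain a Kerberos ticket from the keytab shipped with the action;
    # the principal and keytab name below are placeholders
    kinit -kt abc.keytab abc@EXAMPLE.COM

    # Plain hadoop fs commands work once the ticket cache is populated
    hadoop fs -get /user/abc/d.txt .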

BRs
gp



On Thu, Nov 24, 2016 at 10:13 PM, Aniruddh Sharma <asharma...@gmail.com>
wrote:

> Hello
>
> I know that if one has to execute a Hive action from a shell script, one
> can do something like this:
>  hive -e "SET mapreduce.job.credentials.binary=$HADOOP_TOKEN_FILE_LOCATION; select * from test"
>
>
>  My requirement is to execute HDFS fs actions from a shell script, for
> example "hadoop fs -get /user/abc/d.txt".
>
> But it fails because of Kerberos. How can I use HADOOP_TOKEN_FILE_LOCATION
> to authenticate for HDFS file actions?
>
> Thanks and Regards
> Aniruddh
>



-- 
Peter Cseh
Software Engineer
<http://www.cloudera.com>
