Re: how to execute HDFS file actions from shell action on Kerberized Cluster

2016-11-25 Thread Peter Cseh
Hi,

The best way is to split up the shell script and use FS actions, Hive actions
and other specific actions in the workflow.
This way you can define the credentials and let Oozie handle the
authentication for you.
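For illustration, the split-up approach could look roughly like the workflow sketch below: an FS action (which runs with Oozie's own credentials) followed by a Hive action that references a `hcat` credential. The hostname, principal, and paths are hypothetical placeholders, not values from this thread; adapt them to your cluster.

```xml
<!-- Minimal sketch only: hostname, principal and paths are placeholders -->
<workflow-app xmlns="uri:oozie:workflow:0.5" name="split-shell-wf">
  <credentials>
    <credential name="hcat_cred" type="hcat">
      <property>
        <name>hcat.metastore.uri</name>
        <value>thrift://metastore-host.example.com:9083</value>
      </property>
      <property>
        <name>hcat.metastore.principal</name>
        <value>hive/_HOST@EXAMPLE.COM</value>
      </property>
    </credential>
  </credentials>

  <start to="prepare-dir"/>

  <!-- FS actions need no extra credential; Oozie authenticates for you -->
  <action name="prepare-dir">
    <fs>
      <mkdir path="${nameNode}/user/abc/output"/>
    </fs>
    <ok to="run-query"/>
    <error to="fail"/>
  </action>

  <!-- The Hive action references the credential defined above -->
  <action name="run-query" cred="hcat_cred">
    <hive xmlns="uri:oozie:hive-action:0.5">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>query.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>

  <kill name="fail">
    <message>Action failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```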
If you want to do it all in a shell script, you will have to make sure the
keytab is accessible on all machines in the cluster and handle
authentication from the shell script yourself.
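A minimal sketch of that do-it-yourself path is below: authenticate with kinit against a keytab, then run the HDFS command. The keytab path and principal are hypothetical, and the keytab must already be present on every node that may run the action. A DRY_RUN guard (my addition, not something Oozie provides) is on by default so the script only prints the commands; set DRY_RUN=0 on the cluster to actually execute them.

```shell
#!/bin/sh
# Hypothetical placeholders -- adjust to your cluster.
KEYTAB="${KEYTAB:-/etc/security/keytabs/abc.keytab}"
PRINCIPAL="${PRINCIPAL:-abc@EXAMPLE.COM}"

# With DRY_RUN=1 (default) commands are printed, not executed,
# so the script can be sanity-checked off-cluster.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$*"
  else
    "$@"
  fi
}

# Obtain a Kerberos ticket from the keytab, then run the HDFS command.
run kinit -kt "$KEYTAB" "$PRINCIPAL"
run hadoop fs -get /user/abc/d.txt
```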

BRs
gp



On Thu, Nov 24, 2016 at 10:13 PM, Aniruddh Sharma 
wrote:

> Hello
>
> I know that if one has to execute a Hive action from shell, one can do
> something like this:
>  hive -e "SET mapreduce.job.credentials.binary=$HADOOP_TOKEN_FILE_LOCATION;
> select * from test"
>
>
>  My requirement is to execute HDFS fs actions from shell,
> for example "hadoop fs -get /user/abc/d.txt"
>
> But it fails because of Kerberos. How can I use HADOOP_TOKEN_FILE_LOCATION
> to authenticate HDFS file actions?
>
> Thanks and Regards
> Aniruddh
>



-- 
Peter Cseh
Software Engineer



how to execute HDFS file actions from shell action on Kerberized Cluster

2016-11-24 Thread Aniruddh Sharma
Hello

I know that if one has to execute a Hive action from shell, one can do
something like this:
 hive -e "SET mapreduce.job.credentials.binary=$HADOOP_TOKEN_FILE_LOCATION;
select * from test"


 My requirement is to execute HDFS fs actions from shell,
for example "hadoop fs -get /user/abc/d.txt"

But it fails because of Kerberos. How can I use HADOOP_TOKEN_FILE_LOCATION
to authenticate HDFS file actions?

Thanks and Regards
Aniruddh