Resolved: from the client machine I need to update ./etc/hadoop/core-site.xml
with site-specific info, e.g.:
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
hdfs-site.xml also needs:
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfs/_HOST@your-realmname</value>
</property>
<property>
  <name>dfs.namenode.kerberos.internal.spnego.principal</name>
  <value>HTTP/_HOST@your-realmname</value>
</property>
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>hdfs/_HOST@your-realmname</value>
</property>
Thanks.
Sophia
On Sat, Sep 27, 2014 at 8:13 PM, Liu, Yi A <[email protected]> wrote:
> You should configure "hadoop.security.authentication" to Kerberos in your
> core-site.xml. Please refer to
> http://hadoop.apache.org/docs/r2.5.1/hadoop-project-dist/hadoop-common/SecureMode.html
>
>
> Regards,
> Yi Liu
>
>
> -----Original Message-----
> From: Xiaohua Chen [mailto:[email protected]]
> Sent: Saturday, September 27, 2014 11:51 AM
> To: [email protected]
> Subject: From java application : how to access kerberosed hadoop HDFS ?
>
> Hi ,
>
> We recently added Kerberos to our CDH4 cluster, and our Java application
> hit this error:
> Caused by: org.apache.hadoop.security.AccessControlException:
> Authorization (hadoop.security.authorization) is enabled but authentication
> (hadoop.security.authentication) is configured as simple. Please configure
> another method like kerberos or digest.
>
> The above error is caused by below code:
>
> mFileSystem = FileSystem.get(uri,new Configuration(), loginUser);
>
> FileStatus[] statusArray = mFileSystem.listStatus(pathName);
>
>
> I am new to this area. Can you shed some light on how to configure the
> authentication method to Kerberos to avoid the above error?
>
> Thanks and regards,
>
> Sophia