Hi Sailaja,

I set these configuration parameters, but I still get the following error on
the Ranger UI as well as in the ranger_admin.log file:


Unable to retrieve any files using given parameters, You can still save the
repository and start creating policies, but you would not be able to use
autocomplete for resource names. Check ranger_admin.log for more info.

org.apache.ranger.plugin.client.HadoopException: Unable to login to Hadoop
environment [hdfs].
Unable to login to Hadoop environment [hdfs].
Unable to decrypt password due to error.
Input length must be multiple of 8 when decrypting with padded cipher.
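(A note on what the last line means: PBEWithMD5AndDES, the cipher visible in the stack trace quoted further down this thread, is DES-based with an 8-byte block size, so the stored password must hex-decode to a whole number of 8-byte blocks before a padded cipher can decrypt it. A plausible cause of this error is a password value that was never actually encrypted. The sketch below is a rough sanity check, not Ranger code; the function name and the hex-string assumption are illustrative only.)

```python
import binascii

def looks_like_padded_des_ciphertext(stored: str) -> bool:
    """Return True if 'stored' hex-decodes to a whole number of 8-byte DES
    blocks -- a necessary (not sufficient) condition for a padded DES cipher
    to decrypt it without an IllegalBlockSizeException."""
    try:
        raw = binascii.unhexlify(stored.strip())
    except (binascii.Error, ValueError):
        return False  # not even valid hex -- e.g. a plaintext password stored as-is
    return len(raw) > 0 and len(raw) % 8 == 0

# A plaintext value like "admin" is not valid hex, so it fails the check:
print(looks_like_padded_des_ciphertext("admin"))             # False
# 16 hex chars = 8 bytes = exactly one DES block, so this passes:
print(looks_like_padded_des_ciphertext("0123456789abcdef"))  # True
```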

Following are the configurations:

Service Name ==> hdfs
username ==> admin
password ==> admin
Namenode URL ==> hdfs://hadoop-master:8020
Authorization Enabled ==> true
Authentication Type ==> kerberos
hadoop.security.auth_to_local ==>
        RULE:[2:$1@$0]([nd]n@.*realm)s/.*/hdfs/
        RULE:[2:$1@$0](hbase@.*realm)s/.*/hbase/
        RULE:[2:$1@$0](mapred@.*realm)s/.*/mapred/
        RULE:[2:$1@$0](yarn@.*realm)s/@.*/yarn/
        DEFAULT
dfs.datanode.kerberos.principal ==> dn/_HOST@platalyticsrealm
dfs.namenode.kerberos.principal ==> nn/_HOST@platalyticsrealm
dfs.secondary.namenode.kerberos.principal ==> nn/_HOST@platalyticsrealm
RPC Protection Type ==> authentication
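(For anyone reading along, a RULE:[2:$1@$0](regex)s/pattern/replacement/ entry applies only to two-component principals: it builds the string "$1@$0" (first component @ realm), tests it against the regex, and then applies a sed-style, first-match substitution. The sketch below is a very simplified emulation for illustration only; the authoritative logic is Hadoop's HadoopKerberosName class, which also supports flags this sketch ignores.)

```python
import re
from typing import Optional

def apply_rule(principal: str, match_re: str, sub_src: str, sub_dst: str) -> Optional[str]:
    """Simplified emulation of a Hadoop auth_to_local rule of the form
    RULE:[2:$1@$0](match_re)s/sub_src/sub_dst/. Returns the mapped short
    name, or None if the rule does not apply to this principal."""
    m = re.fullmatch(r"([^/@]+)/([^/@]+)@(.+)", principal)
    if m is None:
        return None  # [2:...] applies only to two-component principals
    user, _host, realm = m.groups()
    candidate = f"{user}@{realm}"  # this is what the "$1@$0" format builds
    if re.fullmatch(match_re, candidate) is None:
        return None  # the rule's regex does not match, so it does not fire
    # sed-style s/.../.../ without the g flag replaces the first match only
    return re.sub(sub_src, sub_dst, candidate, count=1)

# The first rule above, ([nd]n@.*realm)s/.*/hdfs/, maps both namenode and
# datanode principals to the hdfs local user:
print(apply_rule("nn/hadoop-master@platalyticsrealm", r"[nd]n@.*realm", r".*", "hdfs"))  # hdfs
print(apply_rule("dn/hadoop-master@platalyticsrealm", r"[nd]n@.*realm", r".*", "hdfs"))  # hdfs
```

One thing worth double-checking against the real HadoopKerberosName behavior: under this reading, the yarn rule's s/@.*/yarn/ would turn "yarn@platalyticsrealm" into "yarnyarn" rather than "yarn"; s/.*/yarn/, matching the other rules, may be what was intended.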

On Fri, Aug 12, 2016 at 3:11 AM, Sailaja Polavarapu <
spolavar...@hortonworks.com> wrote:

> Hi Aneela,
>  As far as I know, the following properties should be the same as the ones
> configured under the HDFS configuration, and should not be empty:
> hadoop.security.auth_to_local ==> empty
> dfs.datanode.kerberos.principal ==> empty
> dfs.namenode.kerberos.principal ==> empty
> dfs.secondary.namenode.kerberos.principal ==> empty
> RPC Protection Type ==> privacy
>
>
> From: Aneela Saleem <ane...@platalytics.com>
> Reply-To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
> Date: Thursday, August 11, 2016 at 12:01 PM
>
> To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
> Subject: Re: Ranger-0.6 HDFS authentication failed in secure mode
>
> Hi,
>
> When I test the connection, the following error is shown:
>
> Unable to retrieve any files using given parameters, You can still save
> the repository and start creating policies, but you would not be able to
> use autocomplete for resource names. Check ranger_admin.log for more info.
>
> org.apache.ranger.plugin.client.HadoopException: Unable to login to
> Hadoop environment [hdfs].
> Unable to login to Hadoop environment [hdfs].
> Unable to decrypt password due to error.
> Input length must be multiple of 8 when decrypting with padded cipher.
>
> Here are the configurations of my repository:
>
> Service Name ==> hdfs
> username ==> admin
> password ==> admin
> Namenode URL ==> hdfs://192.168.23.206:8020
> Authorization Enabled ==> true
> Authentication Type ==> kerberos
> hadoop.security.auth_to_local ==> empty
> dfs.datanode.kerberos.principal ==> empty
> dfs.namenode.kerberos.principal ==> empty
> dfs.secondary.namenode.kerberos.principal ==> empty
> RPC Protection Type ==> privacy
>
> In Ranger 0.6 there is no xa_portal.log file. The ranger-admin.log file
> shows no errors when I start Ranger Admin.
>
>
>
> On Thu, Aug 11, 2016 at 11:15 PM, Velmurugan Periasamy <
> vperias...@hortonworks.com> wrote:
>
>> The error you posted seems to be related to the test connection failing,
>> not a policy download issue. @Sailaja – can you please chime in on the
>> decrypt password issue?
>>
>> Can you please share 1) your HDFS repository configuration and 2) any
>> errors in the Ranger log during the policy download from the HDFS plugin?
>>
>> Thanks,
>> Vel
>>
>> From: Aneela Saleem <ane...@platalytics.com>
>> Reply-To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>> Date: Thursday, August 11, 2016 at 11:32 PM
>> To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>> Subject: Re: Ranger-0.6 HDFS authentication failed in secure mode
>>
>> Hi Folks!
>>
>> I have tried different options, like kinit with the nn/hadoop-master
>> principal and then enabling the HDFS plugin and starting Hadoop, but I am
>> still facing the same issue. Any help with the above would be appreciated.
>>
>> Thanks
>>
>> On Mon, Aug 8, 2016 at 8:47 PM, Aneela Saleem <ane...@platalytics.com>
>> wrote:
>>
>>> Madhan!
>>>
>>> I can see the following exception in the ranger-admin.log file:
>>>
>>> 2016-08-08 17:42:43,501 [timed-executor-pool-0] ERROR apache.ranger.services.hdfs.client.HdfsResourceMgr (HdfsResourceMgr.java:49) - <== HdfsResourceMgr.testConnection Error: Unable to login to Hadoop environment [hdfs]
>>> org.apache.ranger.plugin.client.HadoopException: Unable to login to Hadoop environment [hdfs]
>>>         at org.apache.ranger.plugin.client.BaseClient.login(BaseClient.java:136)
>>>         at org.apache.ranger.plugin.client.BaseClient.<init>(BaseClient.java:59)
>>>         at org.apache.ranger.services.hdfs.client.HdfsClient.<init>(HdfsClient.java:52)
>>>         at org.apache.ranger.services.hdfs.client.HdfsClient.connectionTest(HdfsClient.java:221)
>>>         at org.apache.ranger.services.hdfs.client.HdfsResourceMgr.connectionTest(HdfsResourceMgr.java:47)
>>>         at org.apache.ranger.services.hdfs.RangerServiceHdfs.validateConfig(RangerServiceHdfs.java:58)
>>>         at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:560)
>>>         at org.apache.ranger.biz.ServiceMgr$ValidateCallable.actualCall(ServiceMgr.java:547)
>>>         at org.apache.ranger.biz.ServiceMgr$TimedCallable.call(ServiceMgr.java:508)
>>>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>>         at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.io.IOException: Unable to decrypt password due to error
>>>         at org.apache.ranger.plugin.util.PasswordUtils.decryptPassword(PasswordUtils.java:128)
>>>         at org.apache.ranger.plugin.client.BaseClient.login(BaseClient.java:113)
>>>         ... 12 more
>>> Caused by: javax.crypto.IllegalBlockSizeException: Input length must be multiple of 8 when decrypting with padded cipher
>>>         at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:750)
>>>         at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:676)
>>>         at com.sun.crypto.provider.PBECipherCore.doFinal(PBECipherCore.java:422)
>>>         at com.sun.crypto.provider.PBEWithMD5AndDESCipher.engineDoFinal(PBEWithMD5AndDESCipher.java:316)
>>>         at javax.crypto.Cipher.doFinal(Cipher.java:2131)
>>>         at org.apache.ranger.plugin.util.PasswordUtils.decryptPassword(PasswordUtils.java:112)
>>>         ... 13 more
>>>
>>>
>>>
>>>
>>>
>>> On Mon, Aug 8, 2016 at 8:16 PM, Madhan Neethiraj <mad...@apache.org>
>>> wrote:
>>>
>>>> Aneela,
>>>>
>>>>
>>>>
>>>> Do you see any errors reported in the Ranger Admin log file
>>>> xa_portal.log for the download request from the HDFS plugin?
>>>>
>>>>
>>>>
>>>> Thanks,
>>>>
>>>> Madhan
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> From: Aneela Saleem <ane...@platalytics.com>
>>>> Reply-To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>>>> Date: Monday, August 8, 2016 at 6:05 AM
>>>> To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>>>> Subject: Ranger-0.6 HDFS authentication failed in secure mode
>>>>
>>>>
>>>>
>>>> Hi all,
>>>>
>>>>
>>>>
>>>> I have installed Ranger 0.6 and successfully set up the usersync
>>>> process. Now I'm trying to enable the HDFS plugin on a Kerberized Hadoop
>>>> cluster. When I restart Hadoop after enabling the plugin, I get the
>>>> following error:
>>>>
>>>>
>>>>
>>>> 2016-08-08 17:56:55,675 ERROR org.apache.ranger.admin.client.RangerAdminRESTClient: Error getting policies. secureMode=true, user=nn/hadoop-master@platalyticsrealm (auth:KERBEROS), response={"httpStatusCode":401,"statusCode":401,"msgDesc":"Authentication Failed"}, serviceName=hdfs
>>>> 2016-08-08 17:56:55,675 ERROR org.apache.ranger.plugin.util.PolicyRefresher: PolicyRefresher(serviceName=hdfs): failed to refresh policies. Will continue to use last known version of policies (-1)
>>>> java.lang.Exception: Authentication Failed
>>>> at org.apache.ranger.admin.client.RangerAdminRESTClient.getServicePoliciesIfUpdated(RangerAdminRESTClient.java:126)
>>>> at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicyfromPolicyAdmin(PolicyRefresher.java:217)
>>>> at org.apache.ranger.plugin.util.PolicyRefresher.loadPolicy(PolicyRefresher.java:185)
>>>> at org.apache.ranger.plugin.util.PolicyRefresher.run(PolicyRefresher.java:158)
>>>> 2016-08-08 17:56:55,676 WARN org.apache.ranger.plugin.util.PolicyRefresher: cache file does not exist or not readable '/etc/ranger/hdfs/policycache/hdfs_hdfs.json'
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> Although I have a running Kerberized Hadoop cluster, and the
>>>> nn/hadoop-master@platalyticsrealm user authenticates successfully
>>>> within Hadoop, why does authentication fail here?
>>>>
>>>
>>>
>>
>
