[ https://issues.apache.org/jira/browse/HADOOP-15995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16715560#comment-16715560 ]

Larry McCay commented on HADOOP-15995:
--------------------------------------

I see, [~lukmajercak]. Could we instead make a change that introduces the notion 
of provider-qualified properties, where the caller needs to know which provider 
it is accessing and looks up properties more closely resembling:

hadoop.security.group.mapping.provider.a.ldap.bind.password
hadoop.security.group.mapping.provider.b.ldap.bind.password

That would keep the single-LdapGroupsMapping case and existing deployments whole 
while accommodating the extended semantics that you need. Note: I have not yet 
dug into which component would need that provider context or how it would be 
acquired.
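
As a rough illustration of that idea (a sketch only: the provider-qualified key 
names and the helper below are hypothetical and not existing Hadoop APIs, though 
Configuration.getPassword is real), the lookup could be keyed by a per-provider 
label:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;

    public class ProviderQualifiedBindPassword {
      // Hypothetical: "a" or "b" would identify the LdapGroupsMapping instance
      // that CompositeGroupsMapping is currently delegating to.
      static char[] getBindPassword(Configuration conf, String providerLabel)
          throws IOException {
        String key = "hadoop.security.group.mapping.provider."
            + providerLabel + ".ldap.bind.password";
        // Configuration.getPassword consults any configured CredentialProviders
        // and falls back to the clear-text config value if no alias matches.
        return conf.getPassword(key);
      }
    }

This only shows where a provider label would have to flow in; as noted above, 
how LdapGroupsMapping would learn that label is still the open question.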

> LdapGroupsMapping should use the bind.password config value as credential 
> alias
> -------------------------------------------------------------------------------
>
>                 Key: HADOOP-15995
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15995
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: common
>            Reporter: Lukas Majercak
>            Assignee: Lukas Majercak
>            Priority: Major
>         Attachments: HADOOP-15995.001.patch
>
>
> Currently, the property name hadoop.security.group.mapping.ldap.bind.password 
> is used as the alias for looking up the password from CredentialProviders. 
> This is a significant problem: when multiple LdapGroupsMapping providers are 
> configured through CompositeGroupsMapping, they all end up with the same 
> alias and cannot be distinguished. The proposal is to use the value of the 
> property as the alias instead, which would fix this issue.
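
For context on how that proposal would read in code (a sketch against Hadoop's 
Configuration.getPassword API; the class and method names below are illustrative 
and not the actual LdapGroupsMapping or patch code):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;

    public class BindPasswordAliasSketch {
      static final String BIND_PASSWORD_KEY =
          "hadoop.security.group.mapping.ldap.bind.password";

      // Current behavior: the property *name* is the credential alias, so every
      // LdapGroupsMapping instance composed via CompositeGroupsMapping resolves
      // to the same credential entry.
      static char[] currentLookup(Configuration conf) throws IOException {
        return conf.getPassword(BIND_PASSWORD_KEY);
      }

      // Proposed behavior: the property *value* is treated as the alias, so
      // each provider instance can point at a distinct credential entry.
      static char[] proposedLookup(Configuration conf) throws IOException {
        String alias = conf.get(BIND_PASSWORD_KEY, "");
        return alias.isEmpty() ? null : conf.getPassword(alias);
      }
    }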


