[ https://issues.apache.org/jira/browse/HADOOP-18594?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17677893#comment-17677893 ]

ASF GitHub Bot commented on HADOOP-18594:
-----------------------------------------

hadoop-yetus commented on PR #5304:
URL: https://github.com/apache/hadoop/pull/5304#issuecomment-1385862039

   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m 27s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  1s |  |  detect-secrets was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | -1 :x: |  test4tests  |   0m  0s |  |  The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  17m  6s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  33m 49s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  27m 31s |  |  trunk passed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04  |
   | +1 :green_heart: |  compile  |  23m  7s |  |  trunk passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08  |
   | +1 :green_heart: |  checkstyle  |   4m  5s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 21s |  |  trunk passed  |
   | -1 :x: |  javadoc  |   1m  9s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5304/1/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04.txt) |  hadoop-common in trunk failed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04.  |
   | +1 :green_heart: |  javadoc  |   2m 24s |  |  trunk passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   6m 17s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  29m 23s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 24s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 38s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  24m 52s |  |  the patch passed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04  |
   | +1 :green_heart: |  javac  |  24m 52s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  22m  6s |  |  the patch passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08  |
   | +1 :green_heart: |  javac  |  22m  6s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 53s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 14s |  |  the patch passed  |
   | -1 :x: |  javadoc  |   1m  2s | [/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5304/1/artifact/out/patch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04.txt) |  hadoop-common in the patch failed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04.  |
   | +1 :green_heart: |  javadoc  |   2m 22s |  |  the patch passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08  |
   | +1 :green_heart: |  spotbugs  |   6m 27s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  29m 38s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m 12s |  |  hadoop-common in the patch passed.  |
   | -1 :x: |  unit  | 411m  1s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5304/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt) |  hadoop-hdfs in the patch failed.  |
   | +1 :green_heart: |  asflicense  |   1m  5s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 677m 39s |  |  |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | hadoop.hdfs.TestLeaseRecovery2 |
   |   | hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5304/1/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/5304 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 3fc5f9b24937 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 13e543671496dff9043ea4070a131433e1eab617 |
   | Default Java | Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
   | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5304/1/testReport/ |
   | Max. process+thread count | 3144 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5304/1/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




> ProxyUserAuthenticationFilter: add property 
> 'hadoop.security.impersonation.provider.class' to enable loading a custom 
> ImpersonationProvider class at namenode startup
> -------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-18594
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18594
>             Project: Hadoop Common
>          Issue Type: Wish
>            Reporter: Xie Yi
>            Priority: Minor
>              Labels: pull-request-available
>
> h3. The phenomenon
> I implemented a custom ImpersonationProvider class and configured it in core-site.xml:
> {code:none}
>     <property>
>       <name>hadoop.security.impersonation.provider.class</name>
>       <value>org.apache.hadoop.security.authorize.MyImpersonationProvider</value>
>     </property>
> {code}
>  
> {color:#ff0000}However, when the namenode starts, MyImpersonationProvider is not 
> loaded automatically; DefaultImpersonationProvider is loaded instead.{color}
> Only after running the following command is the custom ImpersonationProvider 
> loaded:
> {code:java}
> bin/hdfs dfsadmin -refreshSuperUserGroupsConfiguration{code}
> h3. What else I observed
> The custom ImpersonationProvider is loaded in 
> org.apache.hadoop.security.authorize.ProxyUsers#refreshSuperUserGroupsConfiguration
> via the property "hadoop.security.impersonation.provider.class":
> [https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authorize/ProxyUsers.java#L70]
> {code:java}
> public static void refreshSuperUserGroupsConfiguration(Configuration conf,
>     String proxyUserPrefix) {
>   Preconditions.checkArgument(proxyUserPrefix != null && 
>       !proxyUserPrefix.isEmpty(), "prefix cannot be NULL or empty");
>   // sip is volatile. Any assignment to it as well as the object's state
>   // will be visible to all the other threads. 
>   ImpersonationProvider ip = getInstance(conf);
>   ip.init(proxyUserPrefix);
>   sip = ip;
>   ProxyServers.refresh(conf);
> } 
> private static ImpersonationProvider getInstance(Configuration conf) {
>   Class<? extends ImpersonationProvider> clazz =
>       conf.getClass(
>           CommonConfigurationKeysPublic.HADOOP_SECURITY_IMPERSONATION_PROVIDER_CLASS,
>           DefaultImpersonationProvider.class, ImpersonationProvider.class);
>   return ReflectionUtils.newInstance(clazz, conf);
> }{code}
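> Since DefaultImpersonationProvider is what actually gets loaded at startup, the 
> key is evidently absent from the Configuration that reaches this code, so 
> conf.getClass() falls back to its default argument. A minimal illustration of 
> that fallback (a fragment, not a full program; it assumes a Hadoop Configuration 
> with no impersonation key set):
> {code:java}
> // With no "hadoop.security.impersonation.provider.class" entry present,
> // the lookup in getInstance() resolves to the default class.
> Configuration conf = new Configuration(false);  // empty, no core-site.xml loaded
> Class<? extends ImpersonationProvider> clazz = conf.getClass(
>     CommonConfigurationKeysPublic.HADOOP_SECURITY_IMPERSONATION_PROVIDER_CLASS,
>     DefaultImpersonationProvider.class, ImpersonationProvider.class);
> // clazz is DefaultImpersonationProvider.class
> {code}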
>  
> When the namenode starts, refreshSuperUserGroupsConfiguration is called from 
> ProxyUserAuthenticationFilter:
> [https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authentication/server/ProxyUserAuthenticationFilter.java#L56]
> {code:java}
>   public void init(FilterConfig filterConfig) throws ServletException {
>     Configuration conf = getProxyuserConfiguration(filterConfig);
>     ProxyUsers.refreshSuperUserGroupsConfiguration(conf, PROXYUSER_PREFIX);
>     super.init(filterConfig);
>   }
> {code}
> Here is the stack trace:
> {code:none}
> init:70, DefaultImpersonationProvider (org.apache.hadoop.security.authorize)
> refreshSuperUserGroupsConfiguration:77, ProxyUsers (org.apache.hadoop.security.authorize)
> init:56, ProxyUserAuthenticationFilter (org.apache.hadoop.security.authentication.server)
> initialize:140, FilterHolder (org.eclipse.jetty.servlet)
> lambda$initialize$0:731, ServletHandler (org.eclipse.jetty.servlet)
> accept:-1, 1541075662 (org.eclipse.jetty.servlet.ServletHandler$$Lambda$36)
> forEachRemaining:948, Spliterators$ArraySpliterator (java.util)
> forEachRemaining:742, Streams$ConcatSpliterator (java.util.stream)
> forEach:580, ReferencePipeline$Head (java.util.stream)
> initialize:755, ServletHandler (org.eclipse.jetty.servlet)
> startContext:379, ServletContextHandler (org.eclipse.jetty.servlet)
> doStart:910, ContextHandler (org.eclipse.jetty.server.handler)
> doStart:288, ServletContextHandler (org.eclipse.jetty.servlet)
> start:73, AbstractLifeCycle (org.eclipse.jetty.util.component)
> start:169, ContainerLifeCycle (org.eclipse.jetty.util.component)
> doStart:117, ContainerLifeCycle (org.eclipse.jetty.util.component)
> doStart:97, AbstractHandler (org.eclipse.jetty.server.handler)
> start:73, AbstractLifeCycle (org.eclipse.jetty.util.component)
> start:169, ContainerLifeCycle (org.eclipse.jetty.util.component)
> doStart:117, ContainerLifeCycle (org.eclipse.jetty.util.component)
> doStart:97, AbstractHandler (org.eclipse.jetty.server.handler)
> start:73, AbstractLifeCycle (org.eclipse.jetty.util.component)
> start:169, ContainerLifeCycle (org.eclipse.jetty.util.component)
> start:423, Server (org.eclipse.jetty.server)
> doStart:110, ContainerLifeCycle (org.eclipse.jetty.util.component)
> doStart:97, AbstractHandler (org.eclipse.jetty.server.handler)
> doStart:387, Server (org.eclipse.jetty.server)
> start:73, AbstractLifeCycle (org.eclipse.jetty.util.component)
> start:1276, HttpServer2 (org.apache.hadoop.http)
> start:170, NameNodeHttpServer (org.apache.hadoop.hdfs.server.namenode)
> startHttpServer:954, NameNode (org.apache.hadoop.hdfs.server.namenode)
> initialize:765, NameNode (org.apache.hadoop.hdfs.server.namenode)
> <init>:1020, NameNode (org.apache.hadoop.hdfs.server.namenode)
> <init>:995, NameNode (org.apache.hadoop.hdfs.server.namenode)
> createNameNode:1769, NameNode (org.apache.hadoop.hdfs.server.namenode)
> main:1834, NameNode (org.apache.hadoop.hdfs.server.namenode)
> {code}
>  
> {color:#ff0000}However, the filterConfig in ProxyUserAuthenticationFilter does 
> not contain the property "hadoop.security.impersonation.provider.class".{color}
> The filterConfig in ProxyUserAuthenticationFilter is built by 
> ProxyUserAuthenticationFilterInitializer or AuthFilterInitializer.
> These initializers only copy properties starting with "hadoop.proxyuser"; they 
> do not copy "hadoop.security.impersonation.provider.class":
> {code:java}
>   protected Map<String, String> createFilterConfig(Configuration conf) {
>     Map<String, String> filterConfig = AuthenticationFilterInitializer
>         .getFilterConfigMap(conf, configPrefix);
>     //Add proxy user configs
>     for (Map.Entry<String, String> entry : conf.getPropsWithPrefix(
>         ProxyUsers.CONF_HADOOP_PROXYUSER).entrySet()) {
>       filterConfig.put("proxyuser" + entry.getKey(), entry.getValue());
>     }
>     return filterConfig;
>   }
> {code}
> [https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/authentication/server/ProxyUserAuthenticationFilterInitializer.java#L46]
> [https://github.com/apache/hadoop/blob/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/web/AuthFilterInitializer.java#L46]
> As a result, the custom ImpersonationProvider cannot be loaded during namenode startup.
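> One possible direction for a fix (a sketch only, not a committed patch): have 
> the initializer forward the provider key into filterConfig alongside the 
> proxyuser entries. Note that ProxyUserAuthenticationFilter#getProxyuserConfiguration 
> appears to map only "proxyuser*" init parameters back into the Configuration it 
> builds, so it would likewise need to copy this key through. The initializer side 
> might look like:
> {code:java}
>   protected Map<String, String> createFilterConfig(Configuration conf) {
>     Map<String, String> filterConfig = AuthenticationFilterInitializer
>         .getFilterConfigMap(conf, configPrefix);
>     // Add proxy user configs
>     for (Map.Entry<String, String> entry : conf.getPropsWithPrefix(
>         ProxyUsers.CONF_HADOOP_PROXYUSER).entrySet()) {
>       filterConfig.put("proxyuser" + entry.getKey(), entry.getValue());
>     }
>     // Sketch: also pass the impersonation provider class through, so that
>     // ProxyUsers.refreshSuperUserGroupsConfiguration() can see it when the
>     // filter initializes during namenode startup.
>     String provider = conf.get(
>         CommonConfigurationKeysPublic.HADOOP_SECURITY_IMPERSONATION_PROVIDER_CLASS);
>     if (provider != null) {
>       filterConfig.put(
>           CommonConfigurationKeysPublic.HADOOP_SECURITY_IMPERSONATION_PROVIDER_CLASS,
>           provider);
>     }
>     return filterConfig;
>   }
> {code}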



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
