[
https://issues.apache.org/jira/browse/HADOOP-16958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17081679#comment-17081679
]
Ctest edited comment on HADOOP-16958 at 4/12/20, 7:27 AM:
----------------------------------------------------------
Sorry for not removing the unused Bool import, which caused the javac warning
and one of the checkstyle problems.
I submitted the 004.patch, which should fix it and the other checkstyle
problems, except for one:
the line containing
CommonConfigurationKeys.HADOOP_SECURITY_SERVICE_AUTHORIZATION_REFRESH_POLICY
is over 80 characters, and I am unable to bring it under 80 characters while
keeping proper indentation and the same code logic. I could work around it by
replacing the constant with its literal value
"security.refresh.policy.protocol.acl", but that would be harder to read and
would not stay in sync if the value of
CommonConfigurationKeys.HADOOP_SECURITY_SERVICE_AUTHORIZATION_REFRESH_POLICY
is ever changed.
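For illustration, a hypothetical line of the shape the remaining checkstyle
warning complains about (not the actual patch code): the qualified constant
name alone is 76 characters, so any continuation indentation plus a trailing
comma already pushes the line past 80 columns.
{code:java}
// Hypothetical example only (placeholder call and variable, not patch code):
// the qualified constant name is 76 characters, so with 4 spaces of
// continuation indentation the line below is already 81 characters long.
conf.set(
    CommonConfigurationKeys.HADOOP_SECURITY_SERVICE_AUTHORIZATION_REFRESH_POLICY,
    aclValue);
{code}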
> NullPointerException(NPE) when hadoop.security.authorization is enabled but
> the input PolicyProvider for ZKFCRpcServer is NULL
> ------------------------------------------------------------------------------------------------------------------------------
>
> Key: HADOOP-16958
> URL: https://issues.apache.org/jira/browse/HADOOP-16958
> Project: Hadoop Common
> Issue Type: Bug
> Components: common, ha
> Affects Versions: 3.2.1
> Reporter: Ctest
> Priority: Critical
> Attachments: HADOOP-16958.000.patch, HADOOP-16958.001.patch,
> HADOOP-16958.002.patch, HADOOP-16958.003.patch, HADOOP-16958.004.patch
>
>
> During initialization, ZKFCRpcServer refreshes the service authorization ACL
> for the service handled by this server if the config
> hadoop.security.authorization is enabled, by calling refreshServiceAcl with
> the input PolicyProvider and Configuration.
> {code:java}
> ZKFCRpcServer(Configuration conf,
>     InetSocketAddress bindAddr,
>     ZKFailoverController zkfc,
>     PolicyProvider policy) throws IOException {
>   this.server = ...
>
>   // set service-level authorization security policy
>   if (conf.getBoolean(
>       CommonConfigurationKeys.HADOOP_SECURITY_AUTHORIZATION, false)) {
>     server.refreshServiceAcl(conf, policy);
>   }
> }{code}
> refreshServiceAcl calls
> ServiceAuthorizationManager#refreshWithLoadedConfiguration, which gets the
> services directly from the provider with provider.getServices(). When the
> provider is NULL, this throws an NPE without an informative message. In
> addition, the default value of the config
> `hadoop.security.authorization.policyprovider` (which controls the
> PolicyProvider here) is NULL, and the only caller of the ZKFCRpcServer
> constructor supplies the provider through an abstract getPolicyProvider
> method, which does not enforce that the PolicyProvider is non-NULL.
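> For reference, a rough sketch (paraphrased, not the exact Hadoop source) of
> the spot inside ServiceAuthorizationManager#refreshWithLoadedConfiguration
> where the NPE originates:
> {code:java}
> public void refreshWithLoadedConfiguration(Configuration conf,
>     PolicyProvider provider) {
>   ...
>   // NPE is raised here when provider is null, with no message that points
>   // back to the missing PolicyProvider
>   Service[] services = provider.getServices();
>   ...
> }{code}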
> The suggestion here is to add either a guard check or exception handling
> with an informative log message in ZKFCRpcServer to handle the case where
> the input PolicyProvider is NULL.
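> A minimal sketch of one possible guard in the ZKFCRpcServer constructor (the
> exception type and message are illustrative only, not necessarily what the
> attached patches do):
> {code:java}
> // set service-level authorization security policy
> if (conf.getBoolean(
>     CommonConfigurationKeys.HADOOP_SECURITY_AUTHORIZATION, false)) {
>   if (policy == null) {
>     // fail fast with a descriptive message instead of a bare NPE later
>     throw new HadoopIllegalArgumentException(
>         "hadoop.security.authorization is enabled but the PolicyProvider "
>         + "passed to ZKFCRpcServer is null");
>   }
>   server.refreshServiceAcl(conf, policy);
> }{code}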
>
> I am very happy to provide a patch for it if the issue is confirmed :)