[ https://issues.apache.org/jira/browse/HDFS-14845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16934589#comment-16934589 ]

Eric Yang commented on HDFS-14845:
----------------------------------

[~Prabhu Joseph] Thank you for the patch.  I tested with the two sets of 
configuration below, and both work as long as 
hadoop.http.authentication.signature.secret.file is defined.

{code}
    <property>
      <name>hadoop.http.authentication.type</name>
      <value>kerberos</value>
    </property>

    <property>
      <name>hadoop.http.authentication.kerberos.principal</name>
      <value>HTTP/host1.example....@example.com</value>
    </property>

    <property>
      <name>hadoop.http.authentication.kerberos.keytab</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>

    <property>
      <name>hadoop.http.authentication.signature.secret.file</name>
      <value>${httpfs.config.dir}/httpfs-signature.secret</value>
    </property>

    <property>
      <name>hadoop.http.filter.initializers</name>
      <value>org.apache.hadoop.security.authentication.server.ProxyUserAuthenticationFilterInitializer,org.apache.hadoop.security.HttpCrossOriginFilterInitializer</value>
    </property>

    <property>
      <name>hadoop.authentication.type</name>
      <value>kerberos</value>
    </property>

    <property>
      <name>httpfs.hadoop.authentication.type</name>
      <value>kerberos</value>
    </property>

    <property>
      <name>httpfs.hadoop.authentication.kerberos.principal</name>
      <value>nn/host1.example....@example.com</value>
    </property>

    <property>
      <name>httpfs.hadoop.authentication.kerberos.keytab</name>
      <value>/etc/security/keytabs/hdfs.service.keytab</value>
    </property>
{code}
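As a side note, the signature secret file referenced above only needs unpredictable content; a minimal sketch of seeding one (SECRET_FILE is a hypothetical path standing in for ${httpfs.config.dir}/httpfs-signature.secret):

```shell
# Seed a random signing secret for hadoop.http.authentication.signature.secret.file.
# SECRET_FILE is a placeholder; in a real deployment it would live under
# ${httpfs.config.dir} and be owned by the HttpFS service user.
SECRET_FILE="${TMPDIR:-/tmp}/httpfs-signature.secret"
head -c 64 /dev/urandom | base64 | tr -d '\n' > "$SECRET_FILE"
chmod 600 "$SECRET_FILE"  # keep the secret readable only by its owner
```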

The backward-compatible config also works:
{code}
    <property>
      <name>hadoop.http.authentication.type</name>
      <value>kerberos</value>
    </property>

    <property>
      <name>httpfs.authentication.signature.secret.file</name>
      <value>${httpfs.config.dir}/httpfs-signature.secret</value>
    </property>

    <property>
      <name>hadoop.http.filter.initializers</name>
      <value>org.apache.hadoop.security.authentication.server.ProxyUserAuthenticationFilterInitializer,org.apache.hadoop.security.HttpCrossOriginFilterInitializer</value>
    </property>

    <property>
      <name>httpfs.authentication.type</name>
      <value>kerberos</value>
    </property>

    <property>
      <name>httpfs.hadoop.authentication.type</name>
      <value>kerberos</value>
    </property>

    <property>
      <name>httpfs.authentication.kerberos.principal</name>
      <value>HTTP/host-1.example....@example.com</value>
    </property>

    <property>
      <name>httpfs.authentication.kerberos.keytab</name>
      <value>/etc/security/keytabs/spnego.service.keytab</value>
    </property>

    <property>
      <name>httpfs.hadoop.authentication.kerberos.principal</name>
      <value>nn/host-1.example....@example.com</value>
    </property>

    <property>
      <name>httpfs.hadoop.authentication.kerberos.keytab</name>
      <value>/etc/security/keytabs/hdfs.service.keytab</value>
    </property>
{code}

When httpfs.authentication.signature.secret.file is undefined in 
httpfs-site.xml, the HttpFS server fails to start:

{code}
Exception in thread "main" java.io.IOException: Unable to initialize WebAppContext
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1198)
        at org.apache.hadoop.fs.http.server.HttpFSServerWebServer.start(HttpFSServerWebServer.java:154)
        at org.apache.hadoop.fs.http.server.HttpFSServerWebServer.main(HttpFSServerWebServer.java:187)
Caused by: java.lang.RuntimeException: Undefined property: signature.secret.file
        at org.apache.hadoop.fs.http.server.HttpFSAuthenticationFilter.getConfiguration(HttpFSAuthenticationFilter.java:95)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.init(AuthenticationFilter.java:160)
        at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticationFilter.init(DelegationTokenAuthenticationFilter.java:180)
        at org.eclipse.jetty.servlet.FilterHolder.initialize(FilterHolder.java:139)
        at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:881)
        at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:349)
        at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1406)
        at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1368)
        at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:778)
        at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:262)
        at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:522)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:113)
        at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
        at org.eclipse.jetty.server.Server.start(Server.java:427)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:105)
        at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
        at org.eclipse.jetty.server.Server.doStart(Server.java:394)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1171)
        ... 2 more
{code}
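Until a deprecation mapping handles the rename, a workaround is to keep the legacy property defined explicitly in httpfs-site.xml; a sketch, reusing the secret-file path assumed in the configs above:

```xml
<!-- httpfs-site.xml: define the legacy key explicitly so
     HttpFSAuthenticationFilter finds signature.secret.file at init time.
     The path is an example; adjust to your deployment. -->
<property>
  <name>httpfs.authentication.signature.secret.file</name>
  <value>${httpfs.config.dir}/httpfs-signature.secret</value>
</property>
```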

This is the only delta keeping the config from being fully backward 
compatible.  I don't have a preference on keeping or removing 
httpfs.authentication.signature.secret.file in httpfs-default.xml, so I will 
let others comment on whether this is ok.  In addition, we need to list the 
httpfs.authentication* properties as deprecated configuration in the 
documentation.  Thank you

> Request is a replay (34) error in httpfs
> ----------------------------------------
>
>                 Key: HDFS-14845
>                 URL: https://issues.apache.org/jira/browse/HDFS-14845
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: httpfs
>    Affects Versions: 3.3.0
>                 Environment: Kerberos and ZKDelegationTokenSecretManager 
> enabled in HttpFS
>            Reporter: Akira Ajisaka
>            Assignee: Prabhu Joseph
>            Priority: Critical
>         Attachments: HDFS-14845-001.patch, HDFS-14845-002.patch, 
> HDFS-14845-003.patch
>
>
> We are facing "Request is a replay (34)" error when accessing to HDFS via 
> httpfs on trunk.
> {noformat}
> % curl -i --negotiate -u : "https://<host>:4443/webhdfs/v1/?op=liststatus"
> HTTP/1.1 401 Authentication required
> Date: Mon, 09 Sep 2019 06:00:04 GMT
> Date: Mon, 09 Sep 2019 06:00:04 GMT
> Pragma: no-cache
> X-Content-Type-Options: nosniff
> X-XSS-Protection: 1; mode=block
> WWW-Authenticate: Negotiate
> Set-Cookie: hadoop.auth=; Path=/; Secure; HttpOnly
> Cache-Control: must-revalidate,no-cache,no-store
> Content-Type: text/html;charset=iso-8859-1
> Content-Length: 271
> HTTP/1.1 403 GSSException: Failure unspecified at GSS-API level (Mechanism 
> level: Request is a replay (34))
> Date: Mon, 09 Sep 2019 06:00:04 GMT
> Date: Mon, 09 Sep 2019 06:00:04 GMT
> Pragma: no-cache
> X-Content-Type-Options: nosniff
> X-XSS-Protection: 1; mode=block
> (snip)
> Set-Cookie: hadoop.auth=; Path=/; Secure; HttpOnly
> Cache-Control: must-revalidate,no-cache,no-store
> Content-Type: text/html;charset=iso-8859-1
> Content-Length: 413
> <html>
> <head>
> <meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
> <title>Error 403 GSSException: Failure unspecified at GSS-API level 
> (Mechanism level: Request is a replay (34))</title>
> </head>
> <body><h2>HTTP ERROR 403</h2>
> <p>Problem accessing /webhdfs/v1/. Reason:
> <pre>    GSSException: Failure unspecified at GSS-API level (Mechanism level: 
> Request is a replay (34))</pre></p>
> </body>
> </html>
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
