[jira] [Commented] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-09 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16119992#comment-16119992
 ] 

Lantao Jin commented on HADOOP-14708:
-

Thanks [~jojochuang]. Maybe [HDFS-3745|https://issues.apache.org/jira/browse/HDFS-3745] 
could fix my issue as well, with this change:
{code}
-  /** Same as getUGI(context, request, conf, KERBEROS_SSL, true). */
+  /** Same as getUGI(context, request, conf, KERBEROS, true). */
   public static UserGroupInformation getUGI(ServletContext context,
       HttpServletRequest request, Configuration conf) throws IOException {
-    return getUGI(context, request, conf, AuthenticationMethod.KERBEROS_SSL,
-        true);
+    return getUGI(context, request, conf, AuthenticationMethod.KERBEROS, true);
   }
{code}
So should we wait for HDFS-3745 to be resolved?

> FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL
> ---
>
> Key: HADOOP-14708
> URL: https://issues.apache.org/jira/browse/HADOOP-14708
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: security
>Affects Versions: 2.7.3, 2.8.1, 3.0.0-alpha3
>Reporter: Lantao Jin
>Assignee: Lantao Jin
> Attachments: FSCK-2.log, FSCK.log, HADOOP-14708.001.patch
>
>
> FSCK started by xx (auth:KERBEROS_SSL) fails with the exception message "fsck 
> encountered internal errors!"
> FSCK uses FsckServlet to submit RPCs to the NameNode, and the servlet uses 
> {{KERBEROS_SSL}} as its {{AuthenticationMethod}} in {{JspHelper.java}}:
> {code}
>   /** Same as getUGI(context, request, conf, KERBEROS_SSL, true). */
>   public static UserGroupInformation getUGI(ServletContext context,
>       HttpServletRequest request, Configuration conf) throws IOException {
>     return getUGI(context, request, conf,
>         AuthenticationMethod.KERBEROS_SSL, true);
>   }
> {code}
> But when setting up the SASL connection with the server, KERBEROS_SSL fails to 
> create a SaslClient instance. See {{SaslRpcClient.java}}:
> {code}
> private SaslClient createSaslClient(SaslAuth authType)
>     throws SaslException, IOException {
>   // ... inside the switch over the negotiated auth method ...
>   case KERBEROS: {
>     if (ugi.getRealAuthenticationMethod().getAuthMethod() !=
>         AuthMethod.KERBEROS) {
>       return null; // client isn't using kerberos
>     }
> {code}
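A minimal, self-contained sketch of why that guard rejects KERBEROS_SSL: the {{AuthenticationMethod}} enum declares {{KERBEROS_SSL(null)}}, i.e. with no underlying {{AuthMethod}}, so the comparison above can never match. The types below are hypothetical stand-ins for illustration, not the actual Hadoop classes.

```java
// Hypothetical stand-ins for the Hadoop types, for illustration only.
public class SaslSketch {
  public enum AuthMethod { SIMPLE, KERBEROS, TOKEN }

  public enum AuthenticationMethod {
    KERBEROS(AuthMethod.KERBEROS),
    KERBEROS_SSL(null); // declared with no underlying AuthMethod

    private final AuthMethod authMethod;
    AuthenticationMethod(AuthMethod authMethod) { this.authMethod = authMethod; }
    public AuthMethod getAuthMethod() { return authMethod; }
  }

  // Mirrors the guard in the KERBEROS case of createSaslClient(): a null
  // AuthMethod never equals AuthMethod.KERBEROS, so a KERBEROS_SSL client
  // is rejected even though it is Kerberos-based.
  public static boolean acceptedAsKerberos(AuthenticationMethod real) {
    return real.getAuthMethod() == AuthMethod.KERBEROS;
  }
}
```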



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-08 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16119410#comment-16119410
 ] 

Lantao Jin commented on HADOOP-14708:
-

Could the title be changed to "Allow client with KERBEROS_SSL auth method to 
negotiate with server in security mode"?




[jira] [Commented] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-08 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16118429#comment-16118429
 ] 

Lantao Jin commented on HADOOP-14708:
-

Sorry, I don't know why the UGI client in the NN (in the fsck servlet) uses 
KERBEROS_SSL; I guess it is inherited from JspHelper. But I question the logic 
that any KERBEROS_SSL client coming over RPC cannot pass the NEGOTIATE step.
Returning {{null}} means the client isn't using Kerberos, but KERBEROS_SSL is 
also Kerberos, right? Please correct me if I misunderstand.




[jira] [Commented] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-08 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16118414#comment-16118414
 ] 

Lantao Jin commented on HADOOP-14708:
-

Hi [~jojochuang], [^FSCK-2.log] is a new log captured after I added some debug 
code. I ran FSCK as user lajin from 192.168.1.22; the NameNode, started as user 
hadoop with Kerberos, handled the request on 192.168.1.1. From the debug log, 
the UGI from the DFSClient (in the NN) has no tokens in it and its 
{{AuthenticationMethod}} is KERBEROS_SSL. I don't know why, but the patch I 
submitted seems to work around the problem.




[jira] [Updated] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-08 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14708:

Attachment: FSCK-2.log




[jira] [Updated] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-03 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14708:

Status: Patch Available  (was: Open)

Submitted a simple patch for option #2.




[jira] [Updated] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-03 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14708:

Attachment: HADOOP-14708.001.patch




[jira] [Commented] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-03 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16112351#comment-16112351
 ] 

Lantao Jin commented on HADOOP-14708:
-

Hi [~daryn], since it relates to 
[HADOOP-9010|https://issues.apache.org/jira/browse/HADOOP-9010], what do you 
think about option #1?




[jira] [Comment Edited] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-03 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16108653#comment-16108653
 ] 

Lantao Jin edited comment on HADOOP-14708 at 8/3/17 7:17 AM:
-

Attached some logs. I think this can be fixed in one of two ways:
# Change the enum {{AuthenticationMethod}} value KERBEROS_SSL(null) to 
KERBEROS_SSL(AuthMethod.KERBEROS)
# Relax the condition in {{createSaslClient()}}, for example:
{code}
if (ugi.getRealAuthenticationMethod().getAuthMethod() !=
        AuthMethod.KERBEROS &&
    ugi.getRealAuthenticationMethod() !=
        AuthenticationMethod.KERBEROS_SSL) {
  return null; // client isn't using kerberos
}
{code}

If either option is acceptable, I can attach a patch.
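Option #1 amounts to rewiring the enum constant so it carries a real {{AuthMethod}}. A hypothetical sketch of the change (stand-in types, not the actual {{UserGroupInformation}} source):

```java
public class Option1Sketch {
  public enum AuthMethod { SIMPLE, KERBEROS, TOKEN }

  public enum AuthenticationMethod {
    KERBEROS(AuthMethod.KERBEROS),
    // Option #1: previously KERBEROS_SSL(null), which made getAuthMethod()
    // return null and fail the SaslRpcClient check; map it to KERBEROS.
    KERBEROS_SSL(AuthMethod.KERBEROS);

    private final AuthMethod authMethod;
    AuthenticationMethod(AuthMethod authMethod) { this.authMethod = authMethod; }
    public AuthMethod getAuthMethod() { return authMethod; }
  }
}
```

With this mapping, the existing guard in {{createSaslClient()}} would accept KERBEROS_SSL clients without any change to {{SaslRpcClient}} itself.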


was (Author: cltlfcjin):
Attach some logs. I think it can be fix by two options:
# Change the enum {{AuthenticationMethod}} value KERBEROS_SSL(null) to 
KERBEROS_SSL(AuthMethod.KERBEROS)
# relax the condition in createSaslClient() like
{code}
if (ugi.getRealAuthenticationMethod().getAuthMethod() !=
 AuthMethod.KERBEROS || ugi.getRealAuthenticationMethod() != 
 AuthenticationMethod.KERBEROS_SSL ) {
  return null; // client isn't using kerberos
}
{code}

If any option is ok, I can attach the patch file.
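A side note on the comment edit recorded above: the first draft of option #2 used {{||}}, but for two distinct constants the test {{x != A || x != B}} is true for every {{x}}, so it would have rejected all clients; {{&&}} is the correct negation of "x is A or x is B". A quick check of the boolean logic (illustrative enum only):

```java
public class ConditionSketch {
  public enum M { KERBEROS, KERBEROS_SSL, TOKEN }

  // First draft (||): true for every input, so every client would get null.
  public static boolean rejectOr(M x) {
    return x != M.KERBEROS || x != M.KERBEROS_SSL;
  }

  // Corrected (&&): rejects only methods that are neither Kerberos flavor.
  public static boolean rejectAnd(M x) {
    return x != M.KERBEROS && x != M.KERBEROS_SSL;
  }
}
```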




[jira] [Updated] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-01 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14708:

Attachment: FSCK.log




[jira] [Updated] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-01 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14708:

Attachment: (was: FSCK.log)




[jira] [Commented] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-01 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16108745#comment-16108745
 ] 

Lantao Jin commented on HADOOP-14708:
-

Could you help review this? [~benoyantony]  [~hitliuyi]




[jira] [Updated] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-01 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14708?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14708:

Attachment: FSCK.log

Attached some logs. I think this can be fixed in one of two ways:
# Change the enum {{AuthenticationMethod}} value KERBEROS_SSL(null) to 
KERBEROS_SSL(AuthMethod.KERBEROS)
# Relax the condition in {{createSaslClient()}}, for example:
{code}
if (ugi.getRealAuthenticationMethod().getAuthMethod() !=
 AuthMethod.KERBEROS || ugi.getRealAuthenticationMethod() != 
 AuthenticationMethod.KERBEROS_SSL ) {
  return null; // client isn't using kerberos
}
{code}

If either option is acceptable, I can attach a patch.




[jira] [Created] (HADOOP-14708) FsckServlet can not create SaslRpcClient with auth KERBEROS_SSL

2017-08-01 Thread Lantao Jin (JIRA)
Lantao Jin created HADOOP-14708:
---

 Summary: FsckServlet can not create SaslRpcClient with auth 
KERBEROS_SSL
 Key: HADOOP-14708
 URL: https://issues.apache.org/jira/browse/HADOOP-14708
 Project: Hadoop Common
  Issue Type: Bug
  Components: security
Affects Versions: 3.0.0-alpha3, 2.8.1, 2.7.3
Reporter: Lantao Jin





[jira] [Commented] (HADOOP-14492) RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction cause the Xavgtime confused

2017-06-07 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16040978#comment-16040978
 ] 

Lantao Jin commented on HADOOP-14492:
-

Thanks [~xkrogen] for the explanation. Yes, our DataNodes run with multiple 
storages. Assigning it to you is fine. I think it would be better to use a 
different metric name, such as {{WholeDNBlockReportAvgTime}}.

> RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction 
> cause the Xavgtime confused
> -
>
> Key: HADOOP-14492
> URL: https://issues.apache.org/jira/browse/HADOOP-14492
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: metrics
>Affects Versions: 2.8.0, 2.7.4
>Reporter: Lantao Jin
>Assignee: Erik Krogen
>Priority: Minor
>
> For performance reasons, 
> [HADOOP-13782|https://issues.apache.org/jira/browse/HADOOP-13782] changed the 
> metrics behaviour in {{RpcDetailedMetrics}}.
> In 2.7.4:
> {code}
> public class RpcDetailedMetrics {
>   @Metric MutableRatesWithAggregation rates;
> {code}
> In older versions:
> {code}
> public class RpcDetailedMetrics {
>   @Metric MutableRates rates;
> {code}
> But {{NameNodeMetrics}} still uses {{MutableRate}} in both the new and old 
> versions:
> {code}
> public class NameNodeMetrics {
>   @Metric("Block report") MutableRate blockReport;
> {code}
> This makes the corresponding JMX metrics very different from each other:
> {quote}
> name: "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8030",
> modelerType: "RpcDetailedActivityForPort8030",
> tag.port: "8030",
> tag.Context: "rpcdetailed",
> ...
> BlockReportNumOps: 237634,
> BlockReportAvgTime: 1382,
> ...
> name: "Hadoop:service=NameNode,name=NameNodeActivity",
> modelerType: "NameNodeActivity",
> tag.ProcessName: "NameNode",
> ...
> BlockReportNumOps: 2592932,
> BlockReportAvgTime: 19.258064516129032,
> ...
> {quote}
> In the old version they were consistent.
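One way to read the divergence (an illustration of measurement granularity, not Hadoop's actual accounting; the numbers below are made up and are not from the JMX output above): if the two metrics sample different units of work, e.g. one sample per storage versus one sample per whole multi-storage report, NumOps and AvgTime will disagree even for an identical workload.

```java
public class GranularitySketch {
  // One hypothetical report covering 4 storages, each taking 5 ms to process.
  static final long[] PER_STORAGE_MS = {5, 5, 5, 5};

  // Per-unit accounting: one sample per storage.
  public static int perStorageNumOps() { return PER_STORAGE_MS.length; }

  public static double perStorageAvg() {
    long sum = 0;
    for (long ms : PER_STORAGE_MS) sum += ms;
    return (double) sum / PER_STORAGE_MS.length;
  }

  // Whole-report accounting: one sample for the summed elapsed time.
  public static int wholeReportNumOps() { return 1; }

  public static double wholeReportAvg() {
    long sum = 0;
    for (long ms : PER_STORAGE_MS) sum += ms;
    return sum; // 4 storages x 5 ms = 20 ms for the single sample
  }
}
```

The same 20 ms of work yields NumOps 4 / AvgTime 5 under one scheme and NumOps 1 / AvgTime 20 under the other, which is the shape of mismatch seen between the two JMX beans.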






[jira] [Comment Edited] (HADOOP-14492) RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction cause the Xavgtime confused

2017-06-05 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16038202#comment-16038202
 ] 

Lantao Jin edited comment on HADOOP-14492 at 6/6/17 5:25 AM:
-

How about changing the {{NameNodeMetrics}} to use 
{{MutableRatesWithAggregation}} instead of {{MutableRates}}? 
Any idea? [~steve_l] [~zhz]


was (Author: cltlfcjin):
How about changing the {{NameNodeMetrics}} to use 
{{MutableRatesWithAggregation}} instead of {{MutableRates}}? 
Any idea? [~steve_l]

> RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction 
> cause the Xavgtime confused
> -
>
> Key: HADOOP-14492
> URL: https://issues.apache.org/jira/browse/HADOOP-14492
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: metrics
>Affects Versions: 2.8.0, 2.7.4
>Reporter: Lantao Jin
>Priority: Minor
>
> For performance reasons, 
> [HADOOP-13782|https://issues.apache.org/jira/browse/HADOOP-13782] changed the 
> metrics behaviour in {{RpcDetailedMetrics}}.
> In 2.7.4:
> {code}
> public class RpcDetailedMetrics {
>   @Metric MutableRatesWithAggregation rates;
> {code}
> In older versions:
> {code}
> public class RpcDetailedMetrics {
>   @Metric MutableRates rates;
> {code}
> But {{NameNodeMetrics}} still uses {{MutableRate}} in both the new and old 
> versions:
> {code}
> public class NameNodeMetrics {
>   @Metric("Block report") MutableRate blockReport;
> {code}
> It causes the metrics in JMX to be very different between the two:
> {quote}
> name: "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8030",
> modelerType: "RpcDetailedActivityForPort8030",
> tag.port: "8030",
> tag.Context: "rpcdetailed",
> ...
> BlockReportNumOps: 237634,
> BlockReportAvgTime: 1382,
> ...
> name: "Hadoop:service=NameNode,name=NameNodeActivity",
> modelerType: "NameNodeActivity",
> tag.ProcessName: "NameNode",
> ...
> BlockReportNumOps: 2592932,
> BlockReportAvgTime: 19.258064516129032,
> ...
> {quote}
> In the old version, they are correct.
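The gap between the two AvgTime readings above can be reproduced in miniature. The sketch below is plain Java, not Hadoop code, and all class and method names in it are made up for illustration: it contrasts an average taken over every sample since startup with an average taken only over the samples recorded since the last snapshot. Two rate abstractions with these differing snapshot semantics can report wildly different "average time" values for the same stream of events.

```java
import java.util.ArrayList;
import java.util.List;

// Tracks a running average over every sample ever recorded.
class CumulativeRate {
    private long numOps;
    private double totalTime;
    void add(double elapsedMs) { numOps++; totalTime += elapsedMs; }
    double avgTime() { return numOps == 0 ? 0 : totalTime / numOps; }
}

// Averages only the samples recorded since the last snapshot, then resets.
class IntervalRate {
    private final List<Double> window = new ArrayList<>();
    void add(double elapsedMs) { window.add(elapsedMs); }
    double snapshotAvgTime() {
        double sum = 0;
        for (double v : window) sum += v;
        double avg = window.isEmpty() ? 0 : sum / window.size();
        window.clear();
        return avg;
    }
}

public class RateDemo {
    public static void main(String[] args) {
        CumulativeRate cumulative = new CumulativeRate();
        IntervalRate interval = new IntervalRate();
        // Many fast operations early on...
        for (int i = 0; i < 1000; i++) { cumulative.add(10); interval.add(10); }
        interval.snapshotAvgTime(); // snapshot consumes the fast samples
        // ...then a few slow operations in the current interval.
        for (int i = 0; i < 5; i++) { cumulative.add(2000); interval.add(2000); }
        // Cumulative average stays near 20 ms; the interval average jumps
        // to the slow operations' cost, mirroring 19.26 vs 1382 above.
        System.out.println(cumulative.avgTime());
        System.out.println(interval.snapshotAvgTime());
    }
}
```

This is only a model of the divergence, not of how {{MutableRate}} or {{MutableRatesWithAggregation}} are actually implemented.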





[jira] [Commented] (HADOOP-14492) RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction cause the Xavgtime confused

2017-06-05 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16038202#comment-16038202
 ] 

Lantao Jin commented on HADOOP-14492:
-

How about changing {{NameNodeMetrics}} to use 
{{MutableRatesWithAggregation}} instead of {{MutableRate}}? 
Any idea? [~steve_l]






[jira] [Updated] (HADOOP-14492) RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction cause the Xavgtime confused

2017-06-05 Thread Lantao Jin (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-14492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lantao Jin updated HADOOP-14492:

Description: 
For performance reasons, 
[HADOOP-13782|https://issues.apache.org/jira/browse/HADOOP-13782] changed the 
metrics behaviour in {{RpcDetailedMetrics}}.
In 2.7.4:
{code}
public class RpcDetailedMetrics {

  @Metric MutableRatesWithAggregation rates;
{code}
In older versions:
{code}
public class RpcDetailedMetrics {

  @Metric MutableRates rates;
{code}

But {{NameNodeMetrics}} still uses {{MutableRate}} in both the new and old 
versions:
{code}
public class NameNodeMetrics {
  @Metric("Block report") MutableRate blockReport;
{code}

It causes the metrics in JMX to be very different between the two:
{quote}
name: "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8030",
modelerType: "RpcDetailedActivityForPort8030",
tag.port: "8030",
tag.Context: "rpcdetailed",
...
BlockReportNumOps: 237634,
BlockReportAvgTime: 1382,
...


name: "Hadoop:service=NameNode,name=NameNodeActivity",
modelerType: "NameNodeActivity",
tag.ProcessName: "NameNode",
...
BlockReportNumOps: 2592932,
BlockReportAvgTime: 19.258064516129032,
...

{quote}
In the old version, they are correct.

  was:
For performance reasons, 
[HADOOP-13782|https://issues.apache.org/jira/browse/HADOOP-13782] changed the 
metrics behaviour in {{RpcDetailedMetrics}}.
In 2.7.4:
{code}
public class RpcDetailedMetrics {

  @Metric MutableRatesWithAggregation rates;
{code}
In older versions:
{code}
public class RpcDetailedMetrics {

  @Metric MutableRates rates;
{code}

But {{NameNodeMetrics}} still uses {{MutableRate}} in both the new and old 
versions:
{code}
public class NameNodeMetrics {
  @Metric("Block report") MutableRate blockReport;
{code}

It causes the metrics in JMX to be very different between the two:
{quote}
{
name: "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8030",
modelerType: "RpcDetailedActivityForPort8030",
tag.port: "8030",
tag.Context: "rpcdetailed",
...
BlockReportNumOps: 237634,
BlockReportAvgTime: 1382,
...
}
{
name: "Hadoop:service=NameNode,name=NameNodeActivity",
modelerType: "NameNodeActivity",
tag.ProcessName: "NameNode",
...
BlockReportNumOps: 2592932,
BlockReportAvgTime: 19.258064516129032,
...
}
{quote}
In the old version, they are correct.







[jira] [Created] (HADOOP-14492) RpcDetailedMetrics and NameNodeMetrics use different rate metrics abstraction cause the Xavgtime confused

2017-06-05 Thread Lantao Jin (JIRA)
Lantao Jin created HADOOP-14492:
---

 Summary: RpcDetailedMetrics and NameNodeMetrics use different rate 
metrics abstraction cause the Xavgtime confused
 Key: HADOOP-14492
 URL: https://issues.apache.org/jira/browse/HADOOP-14492
 Project: Hadoop Common
  Issue Type: Bug
  Components: metrics
Affects Versions: 2.8.0, 2.7.4
Reporter: Lantao Jin
Priority: Minor


For performance reasons, 
[HADOOP-13782|https://issues.apache.org/jira/browse/HADOOP-13782] changed the 
metrics behaviour in {{RpcDetailedMetrics}}.
In 2.7.4:
{code}
public class RpcDetailedMetrics {

  @Metric MutableRatesWithAggregation rates;
{code}
In older versions:
{code}
public class RpcDetailedMetrics {

  @Metric MutableRates rates;
{code}

But {{NameNodeMetrics}} still uses {{MutableRate}} in both the new and old 
versions:
{code}
public class NameNodeMetrics {
  @Metric("Block report") MutableRate blockReport;
{code}

It causes the metrics in JMX to be very different between the two:
{quote}
{
name: "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8030",
modelerType: "RpcDetailedActivityForPort8030",
tag.port: "8030",
tag.Context: "rpcdetailed",
...
BlockReportNumOps: 237634,
BlockReportAvgTime: 1382,
...
}
{
name: "Hadoop:service=NameNode,name=NameNodeActivity",
modelerType: "NameNodeActivity",
tag.ProcessName: "NameNode",
...
BlockReportNumOps: 2592932,
BlockReportAvgTime: 19.258064516129032,
...
}
{quote}
In the old version, they are correct.





[jira] [Commented] (HADOOP-9383) mvn clean compile fails without install goal

2014-06-17 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9383?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14033541#comment-14033541
 ] 

Lantao Jin commented on HADOOP-9383:


I see this on Ubuntu as well; my protobuf version is 2.5.0.

 mvn clean compile fails without install goal
 

 Key: HADOOP-9383
 URL: https://issues.apache.org/jira/browse/HADOOP-9383
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 3.0.0
Reporter: Arpit Agarwal
 Fix For: 3.0.0


 'mvn -Pnative-win clean compile' fails with the following error:
 [ERROR] Could not find goal 'protoc' in plugin 
 org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT among available goals 
 - [Help 1]
 The build succeeds if the install goal is specified.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Commented] (HADOOP-9383) mvn clean compile fails without install goal

2014-06-17 Thread Lantao Jin (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9383?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14033562#comment-14033562
 ] 

Lantao Jin commented on HADOOP-9383:


I found a way to get the compile to pass:
{code}
cd hadoop-common-project
mvn clean compile
cd ..
mvn compile
{code}
It all passes!



