[jira] [Updated] (HADOOP-11591) hadoop message for help and version with hyphens confusing

2015-03-24 Thread Sanjeev T (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sanjeev T updated HADOOP-11591:
---
  Resolution: Fixed
Assignee: Sanjeev T
Hadoop Flags: Incompatible change
  Status: Resolved  (was: Patch Available)

The hadoop shell script has been changed; we don't need this patch.

 hadoop message for help and version with hyphens confusing
 --

 Key: HADOOP-11591
 URL: https://issues.apache.org/jira/browse/HADOOP-11591
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 2.6.0
Reporter: Sanjeev T
Assignee: Sanjeev T
Priority: Trivial
 Attachments: HADOOP-11591.patch


 The hadoop messages for help and version confuse the user
 * for hadoop help
 {noformat}
 hadoop --help
 Usage: hadoop [--config confdir] COMMAND
        where COMMAND is one of:
   fs                   run a generic filesystem user client
   version              print the version
   jar <jar>            run a jar file
   checknative [-a|-h]  check native hadoop and compression libraries availability
   distcp <srcurl> <desturl>   copy file or directories recursively
   archive -archiveName NAME -p <parent path> <src>* <dest>   create a hadoop archive
   classpath            prints the class path needed to get the
                        Hadoop jar and the required libraries
   daemonlog            get/set the log level for each daemon
  or
   CLASSNAME            run the class named CLASSNAME
 Most commands print help when invoked w/o parameters.
 {noformat}
 * for checking the hadoop version
 {noformat}
 $ hadoop --version
 Error: No command named `--version' was found. Perhaps you meant `hadoop -version'
 $ hadoop -version
 Error: No command named `-version' was found. Perhaps you meant `hadoop version'
 $ hadoop version
 Hadoop 2.0.0-cdh4.3.0
 {noformat}
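The confusion above was resolved by the trunk shell script rewrite. As a rough sketch (hypothetical, not the actual hadoop launcher), the hyphenated forms can simply be normalized to the existing subcommands:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: fold --help/-help and --version/-version into the
# real "help"/"version" subcommands instead of emitting near-miss errors.
normalize_subcommand() {
  case "$1" in
    --help|-help|-h) echo "help" ;;
    --version|-version) echo "version" ;;
    *) echo "$1" ;;
  esac
}

normalize_subcommand --version   # prints: version
```

With something like this in place, `hadoop --version` and `hadoop -version` would both dispatch to `hadoop version`.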



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-11609) Correct credential commands info in CommandsManual.html#credential

2015-03-24 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377873#comment-14377873
 ] 

Hudson commented on HADOOP-11609:
-

FAILURE: Integrated in Hadoop-Mapreduce-trunk #2092 (See 
[https://builds.apache.org/job/Hadoop-Mapreduce-trunk/2092/])
HADOOP-11609. Correct credential commands info in 
CommandsManual.html#credential. Contributed by Varun Saxena. (ozawa: rev 
6e891a921e00b122390a976dfd13838472a7fcc6)
* hadoop-common-project/hadoop-common/CHANGES.txt
* 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/alias/CredentialShell.java
* hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md


 Correct credential commands info in CommandsManual.html#credential
 --

 Key: HADOOP-11609
 URL: https://issues.apache.org/jira/browse/HADOOP-11609
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation, security
Reporter: Brahma Reddy Battula
Assignee: Varun Saxena
 Fix For: 2.7.0

 Attachments: HADOOP-11609.001.patch, HADOOP-11609.patch


  -i is not supported, so it should be removed.
 -v should be undocumented; the option is only used by tests.
 {noformat}
 create alias [-v value][-provider provider-path]
     Prompts the user for a credential to be stored as the given alias when a
     value is not provided via -v. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 delete alias [-i][-provider provider-path]
     Deletes the credential with the provided alias and optionally warns the
     user when --interactive is used. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 list [-provider provider-path]
     Lists all of the credential aliases. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 {noformat}





[jira] [Commented] (HADOOP-11602) Fix toUpperCase/toLowerCase to use Locale.ENGLISH

2015-03-24 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11602?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377877#comment-14377877
 ] 

Hudson commented on HADOOP-11602:
-

FAILURE: Integrated in Hadoop-Mapreduce-trunk #2092 (See 
[https://builds.apache.org/job/Hadoop-Mapreduce-trunk/2092/])
Fix CHANGES.txt for HADOOP-11602. (ozawa: rev 
3ca5bd163292e661473017e70b9ca77f5a5b78c0)
* hadoop-common-project/hadoop-common/CHANGES.txt


 Fix toUpperCase/toLowerCase to use Locale.ENGLISH
 -

 Key: HADOOP-11602
 URL: https://issues.apache.org/jira/browse/HADOOP-11602
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Tsuyoshi Ozawa
Assignee: Tsuyoshi Ozawa
Priority: Blocker
 Fix For: 2.7.0

 Attachments: HADOOP-11602-001.patch, HADOOP-11602-002.patch, 
 HADOOP-11602-003.patch, HADOOP-11602-004.patch, 
 HADOOP-11602-branch-2.001.patch, HADOOP-11602-branch-2.002.patch, 
 HADOOP-11602-branch-2.003.patch, HADOOP-11602-branch-2.004.patch, 
 HADOOP-11602-branch-2.005.patch


 String#toLowerCase()/toUpperCase() without a locale argument can cause 
 unexpected behavior depending on the default locale, as noted in the 
 [Javadoc|http://docs.oracle.com/javase/7/docs/api/java/lang/String.html#toLowerCase()]:
 {quote}
 For instance, TITLE.toLowerCase() in a Turkish locale returns t\u0131tle, 
 where '\u0131' is the LATIN SMALL LETTER DOTLESS I character
 {quote}
 This issue is derived from HADOOP-10101.





[jira] [Commented] (HADOOP-11591) hadoop message for help and version with hyphens confusing

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=1437#comment-1437
 ] 

Tsuyoshi Ozawa commented on HADOOP-11591:
-

[~sanjeevtripurari] OK, thank you for reporting!

 hadoop message for help and version with hyphens confusing
 --

 Key: HADOOP-11591
 URL: https://issues.apache.org/jira/browse/HADOOP-11591
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 2.6.0
Reporter: Sanjeev T
Assignee: Sanjeev T
Priority: Trivial
 Attachments: HADOOP-11591.patch


 The hadoop messages for help and version confuse the user
 * for hadoop help
 {noformat}
 hadoop --help
 Usage: hadoop [--config confdir] COMMAND
        where COMMAND is one of:
   fs                   run a generic filesystem user client
   version              print the version
   jar <jar>            run a jar file
   checknative [-a|-h]  check native hadoop and compression libraries availability
   distcp <srcurl> <desturl>   copy file or directories recursively
   archive -archiveName NAME -p <parent path> <src>* <dest>   create a hadoop archive
   classpath            prints the class path needed to get the
                        Hadoop jar and the required libraries
   daemonlog            get/set the log level for each daemon
  or
   CLASSNAME            run the class named CLASSNAME
 Most commands print help when invoked w/o parameters.
 {noformat}
 * for checking the hadoop version
 {noformat}
 $ hadoop --version
 Error: No command named `--version' was found. Perhaps you meant `hadoop -version'
 $ hadoop -version
 Error: No command named `-version' was found. Perhaps you meant `hadoop version'
 $ hadoop version
 Hadoop 2.0.0-cdh4.3.0
 {noformat}





[jira] [Updated] (HADOOP-11609) Correct credential commands info in CommandsManual.html#credential

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11609?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tsuyoshi Ozawa updated HADOOP-11609:

   Resolution: Fixed
Fix Version/s: 2.7.0
 Hadoop Flags: Reviewed
   Status: Resolved  (was: Patch Available)

Committed this to trunk, branch-2, and branch-2.7. Thanks [~varun_saxena] for 
the contribution, [~ajisakaa] for the review, and [~brahmareddy] for the report.

 Correct credential commands info in CommandsManual.html#credential
 --

 Key: HADOOP-11609
 URL: https://issues.apache.org/jira/browse/HADOOP-11609
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation, security
Reporter: Brahma Reddy Battula
Assignee: Varun Saxena
 Fix For: 2.7.0

 Attachments: HADOOP-11609.001.patch, HADOOP-11609.patch


  -i is not supported, so it should be removed.
 -v should be undocumented; the option is only used by tests.
 {noformat}
 create alias [-v value][-provider provider-path]
     Prompts the user for a credential to be stored as the given alias when a
     value is not provided via -v. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 delete alias [-i][-provider provider-path]
     Deletes the credential with the provided alias and optionally warns the
     user when --interactive is used. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 list [-provider provider-path]
     Lists all of the credential aliases. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 {noformat}





[jira] [Commented] (HADOOP-11609) Correct credential commands info in CommandsManual.html#credential

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377750#comment-14377750
 ] 

Tsuyoshi Ozawa commented on HADOOP-11609:
-

+1, committing this shortly.

 Correct credential commands info in CommandsManual.html#credential
 --

 Key: HADOOP-11609
 URL: https://issues.apache.org/jira/browse/HADOOP-11609
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation, security
Reporter: Brahma Reddy Battula
Assignee: Varun Saxena
 Attachments: HADOOP-11609.001.patch, HADOOP-11609.patch


  -i is not supported, so it should be removed.
 -v should be undocumented; the option is only used by tests.
 {noformat}
 create alias [-v value][-provider provider-path]
     Prompts the user for a credential to be stored as the given alias when a
     value is not provided via -v. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 delete alias [-i][-provider provider-path]
     Deletes the credential with the provided alias and optionally warns the
     user when --interactive is used. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 list [-provider provider-path]
     Lists all of the credential aliases. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 {noformat}





[jira] [Commented] (HADOOP-11602) Fix toUpperCase/toLowerCase to use Locale.ENGLISH

2015-03-24 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11602?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377766#comment-14377766
 ] 

Hudson commented on HADOOP-11602:
-

SUCCESS: Integrated in Hadoop-trunk-Commit #7416 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/7416/])
Fix CHANGES.txt for HADOOP-11602. (ozawa: rev 
3ca5bd163292e661473017e70b9ca77f5a5b78c0)
* hadoop-common-project/hadoop-common/CHANGES.txt


 Fix toUpperCase/toLowerCase to use Locale.ENGLISH
 -

 Key: HADOOP-11602
 URL: https://issues.apache.org/jira/browse/HADOOP-11602
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Tsuyoshi Ozawa
Assignee: Tsuyoshi Ozawa
Priority: Blocker
 Fix For: 2.7.0

 Attachments: HADOOP-11602-001.patch, HADOOP-11602-002.patch, 
 HADOOP-11602-003.patch, HADOOP-11602-004.patch, 
 HADOOP-11602-branch-2.001.patch, HADOOP-11602-branch-2.002.patch, 
 HADOOP-11602-branch-2.003.patch, HADOOP-11602-branch-2.004.patch, 
 HADOOP-11602-branch-2.005.patch


 String#toLowerCase()/toUpperCase() without a locale argument can cause 
 unexpected behavior depending on the default locale, as noted in the 
 [Javadoc|http://docs.oracle.com/javase/7/docs/api/java/lang/String.html#toLowerCase()]:
 {quote}
 For instance, TITLE.toLowerCase() in a Turkish locale returns t\u0131tle, 
 where '\u0131' is the LATIN SMALL LETTER DOTLESS I character
 {quote}
 This issue is derived from HADOOP-10101.





[jira] [Commented] (HADOOP-11591) hadoop message for help and version with hyphens confusing

2015-03-24 Thread Sanjeev T (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11591?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377705#comment-14377705
 ] 

Sanjeev T commented on HADOOP-11591:


[~ozawa], I checked trunk; the code has changed, so we can close this issue.

 hadoop message for help and version with hyphens confusing
 --

 Key: HADOOP-11591
 URL: https://issues.apache.org/jira/browse/HADOOP-11591
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 2.6.0
Reporter: Sanjeev T
Priority: Trivial
 Attachments: HADOOP-11591.patch


 The hadoop messages for help and version confuse the user
 * for hadoop help
 {noformat}
 hadoop --help
 Usage: hadoop [--config confdir] COMMAND
        where COMMAND is one of:
   fs                   run a generic filesystem user client
   version              print the version
   jar <jar>            run a jar file
   checknative [-a|-h]  check native hadoop and compression libraries availability
   distcp <srcurl> <desturl>   copy file or directories recursively
   archive -archiveName NAME -p <parent path> <src>* <dest>   create a hadoop archive
   classpath            prints the class path needed to get the
                        Hadoop jar and the required libraries
   daemonlog            get/set the log level for each daemon
  or
   CLASSNAME            run the class named CLASSNAME
 Most commands print help when invoked w/o parameters.
 {noformat}
 * for checking the hadoop version
 {noformat}
 $ hadoop --version
 Error: No command named `--version' was found. Perhaps you meant `hadoop -version'
 $ hadoop -version
 Error: No command named `-version' was found. Perhaps you meant `hadoop version'
 $ hadoop version
 Hadoop 2.0.0-cdh4.3.0
 {noformat}





[jira] [Commented] (HADOOP-11609) Correct credential commands info in CommandsManual.html#credential

2015-03-24 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377757#comment-14377757
 ] 

Hudson commented on HADOOP-11609:
-

FAILURE: Integrated in Hadoop-trunk-Commit #7415 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/7415/])
HADOOP-11609. Correct credential commands info in 
CommandsManual.html#credential. Contributed by Varun Saxena. (ozawa: rev 
6e891a921e00b122390a976dfd13838472a7fcc6)
* 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/alias/CredentialShell.java
* hadoop-common-project/hadoop-common/CHANGES.txt
* hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md


 Correct credential commands info in CommandsManual.html#credential
 --

 Key: HADOOP-11609
 URL: https://issues.apache.org/jira/browse/HADOOP-11609
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation, security
Reporter: Brahma Reddy Battula
Assignee: Varun Saxena
 Fix For: 2.7.0

 Attachments: HADOOP-11609.001.patch, HADOOP-11609.patch


  -i is not supported, so it should be removed.
 -v should be undocumented; the option is only used by tests.
 {noformat}
 create alias [-v value][-provider provider-path]
     Prompts the user for a credential to be stored as the given alias when a
     value is not provided via -v. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 delete alias [-i][-provider provider-path]
     Deletes the credential with the provided alias and optionally warns the
     user when --interactive is used. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 list [-provider provider-path]
     Lists all of the credential aliases. The hadoop.security.credential.provider.path
     within the core-site.xml file will be used unless a -provider is indicated.
 {noformat}





[jira] [Commented] (HADOOP-11406) xargs -P is not portable

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378699#comment-14378699
 ] 

Allen Wittenauer commented on HADOOP-11406:
---

So I've been thinking about this the past few days.  Moving from xargs to a for 
loop means a pretty significant drop in performance.  I'm sort of leaning 
towards keeping the portable version as the default and putting the parallel 
xargs one as an example in hadoop-user-functions.sh.  One thing that should 
probably be done in order to facilitate this is to move the actual loop into 
another function so that it is easy to replace.

Thoughts?
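The refactoring suggested above might look roughly like this (a sketch; the function names are illustrative, not the actual hadoop-functions.sh API): keep a portable serial loop as the default, and let hadoop-user-functions.sh redefine the function with a parallel `xargs -P` variant where it is supported.

```shell
#!/usr/bin/env bash
# Portable default: visit each host serially with a plain for loop.
hadoop_ssh_worker() {
  echo "ssh $1"   # placeholder for the real ssh invocation
}

hadoop_connect_to_hosts() {
  local host
  for host in "$@"; do
    hadoop_ssh_worker "${host}"
  done
}

# A site could override hadoop_connect_to_hosts in hadoop-user-functions.sh
# with something like: printf '%s\n' "$@" | xargs -n1 -P10 -I{} ssh {} ...
hadoop_connect_to_hosts node1 node2
```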

 xargs -P is not portable
 

 Key: HADOOP-11406
 URL: https://issues.apache.org/jira/browse/HADOOP-11406
 Project: Hadoop Common
  Issue Type: Bug
  Components: scripts
Affects Versions: 3.0.0
 Environment: Solaris
 Illumos
 AIX
 ... likely others
Reporter: Allen Wittenauer
Assignee: Kengo Seki
Priority: Critical
 Attachments: HADOOP-11406.001.patch


 hadoop-functions.sh uses xargs -P in the ssh handler.  -P is a GNU extension 
 and is not available on all operating systems.  We should add some detection 
 for support and perform an appropriate action.





[jira] [Updated] (HADOOP-11406) xargs -P is not portable

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11406:
--
Assignee: Kengo Seki  (was: Brahma Reddy Battula)

 xargs -P is not portable
 

 Key: HADOOP-11406
 URL: https://issues.apache.org/jira/browse/HADOOP-11406
 Project: Hadoop Common
  Issue Type: Bug
  Components: scripts
Affects Versions: 3.0.0
 Environment: Solaris
 Illumos
 AIX
 ... likely others
Reporter: Allen Wittenauer
Assignee: Kengo Seki
Priority: Critical
 Attachments: HADOOP-11406.001.patch


 hadoop-functions.sh uses xargs -P in the ssh handler.  -P is a GNU extension 
 and is not available on all operating systems.  We should add some detection 
 for support and perform an appropriate action.





[jira] [Commented] (HADOOP-11524) hadoop_do_classpath_subcommand throws a shellcheck warning

2015-03-24 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378835#comment-14378835
 ] 

Hadoop QA commented on HADOOP-11524:


{color:red}-1 overall{color}.  Here are the results of testing the latest 
attachment 
  
http://issues.apache.org/jira/secure/attachment/12706969/HADOOP-11524.001.patch
  against trunk revision a16bfff.

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:red}-1 tests included{color}.  The patch doesn't appear to include 
any new or modified tests.
Please justify why no new tests are needed for this 
patch.
Also please list what manual steps were performed to 
verify this patch.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  There were no new javadoc warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 2.0.3) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:red}-1 core tests{color}.  The patch failed these unit tests in 
hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs:

  org.apache.hadoop.hdfs.server.balancer.TestBalancer
  org.apache.hadoop.hdfs.server.namenode.TestFsck
  org.apache.hadoop.tracing.TestTracing

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5988//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5988//console

This message is automatically generated.

 hadoop_do_classpath_subcommand throws a shellcheck warning
 --

 Key: HADOOP-11524
 URL: https://issues.apache.org/jira/browse/HADOOP-11524
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Chris Nauroth
Priority: Minor
 Attachments: HADOOP-11524.001.patch


 {code}
 CLASS=org.apache.hadoop.util.Classpath
 ^-- SC2034: CLASS appears unused. Verify it or export it.
 {code}
 We should probably use a local var here and return it or something, even 
 though CLASS is technically a global.
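One way the warning could be silenced, sketched here as an assumption rather than the actual patch: have the function print the class name and let the caller capture it, so nothing is assigned to an apparently unused global.

```shell
#!/usr/bin/env bash
# Return the class name on stdout instead of setting a global that
# shellcheck cannot see being consumed elsewhere.
hadoop_classpath_subcommand_class() {
  echo "org.apache.hadoop.util.Classpath"
}

CLASS=$(hadoop_classpath_subcommand_class)
echo "${CLASS}"   # prints: org.apache.hadoop.util.Classpath
```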





[jira] [Commented] (HADOOP-9642) Configuration to resolve environment variables via ${env.VARIABLE} references

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-9642?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378721#comment-14378721
 ] 

Allen Wittenauer commented on HADOOP-9642:
--

Linking HADOOP-11553. If both get committed, the docs should reference this 
functionality.

 Configuration to resolve environment variables via ${env.VARIABLE} references
 -

 Key: HADOOP-9642
 URL: https://issues.apache.org/jira/browse/HADOOP-9642
 Project: Hadoop Common
  Issue Type: Improvement
  Components: conf, scripts
Affects Versions: 3.0.0, 2.1.0-beta
Reporter: Steve Loughran
Assignee: Kengo Seki
Priority: Minor
 Attachments: HADOOP-9642.001.patch, HADOOP-9642.002.patch


 We should be able to get env variables from Configuration files, rather than 
 just system properties. I propose using the traditional {{env}} prefix 
 ({{${env.PATH}}}) to make it immediately clear to people reading a conf file 
 that it's an env variable, and to avoid any confusion with system properties 
 and existing configuration properties.
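In a conf file the proposed syntax might look like this (a sketch of the proposal only; the property name is invented for illustration):

```xml
<property>
  <name>example.scratch.dir</name>
  <!-- ${env.TMPDIR} would be resolved from the process environment -->
  <value>${env.TMPDIR}/hadoop-scratch</value>
</property>
```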





[jira] [Commented] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378719#comment-14378719
 ] 

Allen Wittenauer commented on HADOOP-11553:
---

Linking HADOOP-9642. If both get committed, the docs should reference that 
functionality.

 Formalize the shell API
 ---

 Key: HADOOP-11553
 URL: https://issues.apache.org/jira/browse/HADOOP-11553
 Project: Hadoop Common
  Issue Type: Improvement
  Components: documentation, scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer
Priority: Blocker
 Attachments: HADOOP-11553-00.patch, HADOOP-11553-01.patch, 
 HADOOP-11553-02.patch, HADOOP-11553-03.patch, HADOOP-11553-04.patch, 
 HADOOP-11553-05.patch


 After HADOOP-11485, we need to formally document functions and environment 
 variables that 3rd parties can expect to be able to exist/use.





[jira] [Commented] (HADOOP-11745) Incorporate shellcheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Chris Nauroth (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378767#comment-14378767
 ] 

Chris Nauroth commented on HADOOP-11745:


Here is information on ShellCheck:

http://www.shellcheck.net/about.html

We'd need to arrange for this to be deployed to all of the Jenkins hosts and 
then modify test-patch.sh to call it.

cc [~aw]

 Incorporate shellcheck static analysis into Jenkins pre-commit builds.
 --

 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor

 During the shell script rewrite on trunk, we've been using ShellCheck as a 
 static analysis tool to catch common errors.  We can incorporate this 
 directly into Jenkins pre-commit builds.  Jenkins can reply with a -1 on 
 shell script patches that introduce new ShellCheck warnings.





[jira] [Created] (HADOOP-11745) Incorporate shellcheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Chris Nauroth (JIRA)
Chris Nauroth created HADOOP-11745:
--

 Summary: Incorporate shellcheck static analysis into Jenkins 
pre-commit builds.
 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor


During the shell script rewrite on trunk, we've been using ShellCheck as a 
static analysis tool to catch common errors.  We can incorporate this directly 
into Jenkins pre-commit builds.  Jenkins can reply with a -1 on shell script 
patches that introduce new ShellCheck warnings.





[jira] [Updated] (HADOOP-11745) Incorporate ShellCheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Chris Nauroth (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11745?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chris Nauroth updated HADOOP-11745:
---
Summary: Incorporate ShellCheck static analysis into Jenkins pre-commit 
builds.  (was: Incorporate shellcheck static analysis into Jenkins pre-commit 
builds.)

 Incorporate ShellCheck static analysis into Jenkins pre-commit builds.
 --

 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor

 During the shell script rewrite on trunk, we've been using ShellCheck as a 
 static analysis tool to catch common errors.  We can incorporate this 
 directly into Jenkins pre-commit builds.  Jenkins can reply with a -1 on 
 shell script patches that introduce new ShellCheck warnings.





[jira] [Commented] (HADOOP-10670) Allow AuthenticationFilter to respect signature secret file even without AuthenticationFilterInitializer

2015-03-24 Thread Robert Kanter (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378850#comment-14378850
 ] 

Robert Kanter commented on HADOOP-10670:


It's not really a side effect of HADOOP-10868; it's a side effect of the 
original implementation, which simply loaded the secret from a config property, 
or used a random one if not set.  HADOOP-10791 added support for pluggable 
providers (to allow HADOOP-10868 to work), and included 
{{StringSignerSecretProvider}} to be backwards compatible with that setting.  

While I agree that {{FileSignerSecretProvider}} is more secure, I'm not sure we 
can simply remove {{StringSignerSecretProvider}} without breaking 
compatibility.  What if we instead deprecate it, log a warning about it not 
being recommended, and add a note to the docs?  

 Allow AuthenticationFilter to respect signature secret file even without 
 AuthenticationFilterInitializer
 

 Key: HADOOP-10670
 URL: https://issues.apache.org/jira/browse/HADOOP-10670
 Project: Hadoop Common
  Issue Type: Improvement
  Components: security
Reporter: Kai Zheng
Assignee: Kai Zheng
Priority: Minor
 Attachments: HADOOP-10670-v4.patch, HADOOP-10670-v5.patch, 
 hadoop-10670-v2.patch, hadoop-10670-v3.patch, hadoop-10670.patch


 In the Hadoop web console, AuthenticationFilterInitializer makes it possible 
 to configure AuthenticationFilter with the required signature secret by 
 specifying the signature.secret.file property. This improvement would also 
 allow that when AuthenticationFilterInitializer isn't used, in situations 
 like webhdfs.





[jira] [Commented] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378889#comment-14378889
 ] 

Hadoop QA commented on HADOOP-11553:


{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12707011/HADOOP-11553-05.patch
  against trunk revision a16bfff.

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+0 tests included{color}.  The patch appears to be a 
documentation patch that doesn't require tests.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  There were no new javadoc warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 2.0.3) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-common-project/hadoop-common.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5990//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5990//console

This message is automatically generated.

 Formalize the shell API
 ---

 Key: HADOOP-11553
 URL: https://issues.apache.org/jira/browse/HADOOP-11553
 Project: Hadoop Common
  Issue Type: Improvement
  Components: documentation, scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer
Priority: Blocker
 Attachments: HADOOP-11553-00.patch, HADOOP-11553-01.patch, 
 HADOOP-11553-02.patch, HADOOP-11553-03.patch, HADOOP-11553-04.patch, 
 HADOOP-11553-05.patch


 After HADOOP-11485, we need to formally document functions and environment 
 variables that 3rd parties can expect to be able to exist/use.





[jira] [Commented] (HADOOP-11524) hadoop_do_classpath_subcommand throws a shellcheck warning

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378671#comment-14378671
 ] 

Allen Wittenauer commented on HADOOP-11524:
---

+1 lgtm. Probably a better API this way too.

 hadoop_do_classpath_subcommand throws a shellcheck warning
 --

 Key: HADOOP-11524
 URL: https://issues.apache.org/jira/browse/HADOOP-11524
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Chris Nauroth
Priority: Minor
 Attachments: HADOOP-11524.001.patch


 {code}
 CLASS=org.apache.hadoop.util.Classpath
 ^-- SC2034: CLASS appears unused. Verify it or export it.
 {code}
 We should probably use a local var here and return it or something, even 
 though CLASS is technically a global.





[jira] [Commented] (HADOOP-11745) Incorporate ShellCheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378933#comment-14378933
 ] 

Allen Wittenauer commented on HADOOP-11745:
---

Ha! I've been thinking more and more about HADOOP-10854 and what kinds of things 
should be done for it.  This particular issue (making shellcheck part of 
test-patch) was going to be an immediate priority on that list!

So I'm glad to see this here. :D

One of the things that I've been puzzling over was the best way to mark things 
that we know are unfixable (HADOOP_OPTS handling in hadoop-functions.sh, for 
example).  I've been meaning to look at how exceptions are handled for some of 
the Java bits, but I hadn't gotten that far yet.

One of the big gotchas with putting this into the pipeline is that we need to 
audit *ALL* of the shell scripts to make sure they pass.  At this point, I 
think/hope the only thing left to really look at is dev-support itself.  

 Incorporate ShellCheck static analysis into Jenkins pre-commit builds.
 --

 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor

 During the shell script rewrite on trunk, we've been using ShellCheck as a 
 static analysis tool to catch common errors.  We can incorporate this 
 directly into Jenkins pre-commit builds.  Jenkins can reply with a -1 on 
 shell script patches that introduce new ShellCheck warnings.





[jira] [Commented] (HADOOP-10670) Allow AuthenticationFilter to respect signature secret file even without AuthenticationFilterInitializer

2015-03-24 Thread Haohui Mai (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378803#comment-14378803
 ] 

Haohui Mai commented on HADOOP-10670:
-

The approach looks good.

{code}
+  // The precedence from high to low : file, inline string, random
+  if (signatureSecretFile != null) {
+providerClassName = FileSignerSecretProvider.class.getName();
{code}

I think the way the code works is a side effect of HADOOP-10868. We do not 
support inlining the secret in the configuration. Anyone who can read the 
configuration can forge the authentication cookie. This is a security 
vulnerability, since the Hadoop configuration is readable by both servers and 
clients. We have similar issues in NFS / LDAP, and there we store the secret / 
credentials in a separate file and guard them by setting the permissions 
properly.

We should remove {{StringSecretProvider}} once we have {{FileSecretProvider}}. 
[~rkanter], can you comment on this?

I think the patch also needs to remove the duplicated code in 
{{RMAuthenticationFilterInitializer}}.
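The precedence being discussed reduces to a simple selection. This is an illustrative sketch only: {{chooseProvider}} and the returned strings are hypothetical stand-ins, not the real AuthenticationFilter API.

```java
public class SecretProviderPrecedence {
    // Hypothetical helper mirroring the precedence in the quoted diff:
    // file > inline string > random.
    static String chooseProvider(String signatureSecretFile, String inlineSecret) {
        if (signatureSecretFile != null) {
            return "file";    // FileSignerSecretProvider wins in the real code
        } else if (inlineSecret != null) {
            return "string";  // the provider the discussion proposes to remove
        } else {
            return "random";
        }
    }

    public static void main(String[] args) {
        System.out.println(chooseProvider("/etc/secret", "inline")); // file wins
        System.out.println(chooseProvider(null, "inline"));          // string
        System.out.println(chooseProvider(null, null));              // random
    }
}
```

Once the inline-string branch is removed as proposed, only the file and random cases remain.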

 Allow AuthenticationFilter to respect signature secret file even without 
 AuthenticationFilterInitializer
 

 Key: HADOOP-10670
 URL: https://issues.apache.org/jira/browse/HADOOP-10670
 Project: Hadoop Common
  Issue Type: Improvement
  Components: security
Reporter: Kai Zheng
Assignee: Kai Zheng
Priority: Minor
 Attachments: HADOOP-10670-v4.patch, HADOOP-10670-v5.patch, 
 hadoop-10670-v2.patch, hadoop-10670-v3.patch, hadoop-10670.patch


 In Hadoop web console, by using AuthenticationFilterInitializer, it's allowed 
 to configure AuthenticationFilter for the required signature secret by 
 specifying signature.secret.file property. This improvement would also allow 
 this when AuthenticationFilterInitializer isn't used in situations like 
 webhdfs.





[jira] [Commented] (HADOOP-11602) Fix toUpperCase/toLowerCase to use Locale.ENGLISH

2015-03-24 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11602?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378008#comment-14378008
 ] 

Hudson commented on HADOOP-11602:
-

FAILURE: Integrated in Hadoop-Mapreduce-trunk-Java8 #142 (See 
[https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/142/])
Fix CHANGES.txt for HADOOP-11602. (ozawa: rev 
3ca5bd163292e661473017e70b9ca77f5a5b78c0)
* hadoop-common-project/hadoop-common/CHANGES.txt


 Fix toUpperCase/toLowerCase to use Locale.ENGLISH
 -

 Key: HADOOP-11602
 URL: https://issues.apache.org/jira/browse/HADOOP-11602
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Tsuyoshi Ozawa
Assignee: Tsuyoshi Ozawa
Priority: Blocker
 Fix For: 2.7.0

 Attachments: HADOOP-11602-001.patch, HADOOP-11602-002.patch, 
 HADOOP-11602-003.patch, HADOOP-11602-004.patch, 
 HADOOP-11602-branch-2.001.patch, HADOOP-11602-branch-2.002.patch, 
 HADOOP-11602-branch-2.003.patch, HADOOP-11602-branch-2.004.patch, 
 HADOOP-11602-branch-2.005.patch


 String#toLowerCase()/toUpperCase() without a locale argument can cause 
 unexpected behavior depending on the default locale. It's documented in 
 [Javadoc|http://docs.oracle.com/javase/7/docs/api/java/lang/String.html#toLowerCase()]:
 {quote}
 For instance, TITLE.toLowerCase() in a Turkish locale returns t\u0131tle, 
 where '\u0131' is the LATIN SMALL LETTER DOTLESS I character
 {quote}
 This issue is derived from HADOOP-10101.
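 The pitfall is easy to demonstrate in plain Java (this is an illustration of 
 the documented JDK behavior, not Hadoop code):

```java
import java.util.Locale;

public class LocaleCaseDemo {
    public static void main(String[] args) {
        // In the Turkish locale, uppercase 'I' lowercases to
        // LATIN SMALL LETTER DOTLESS I (U+0131), not 'i'.
        String turkish = "TITLE".toLowerCase(new Locale("tr", "TR"));

        // Passing Locale.ENGLISH pins the ASCII mapping regardless of
        // the platform's default locale.
        String english = "TITLE".toLowerCase(Locale.ENGLISH);

        System.out.println(turkish.equals("title")); // false: "t\u0131tle"
        System.out.println(english.equals("title")); // true
    }
}
```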





[jira] [Commented] (HADOOP-11609) Correct credential commands info in CommandsManual.html#credential

2015-03-24 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378004#comment-14378004
 ] 

Hudson commented on HADOOP-11609:
-

FAILURE: Integrated in Hadoop-Mapreduce-trunk-Java8 #142 (See 
[https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/142/])
HADOOP-11609. Correct credential commands info in 
CommandsManual.html#credential. Contributed by Varun Saxena. (ozawa: rev 
6e891a921e00b122390a976dfd13838472a7fcc6)
* hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md
* hadoop-common-project/hadoop-common/CHANGES.txt
* 
hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/alias/CredentialShell.java


 Correct credential commands info in CommandsManual.html#credential
 --

 Key: HADOOP-11609
 URL: https://issues.apache.org/jira/browse/HADOOP-11609
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation, security
Reporter: Brahma Reddy Battula
Assignee: Varun Saxena
 Fix For: 2.7.0

 Attachments: HADOOP-11609.001.patch, HADOOP-11609.patch


  -i is not supported, so it should be removed from the documentation.
 -v should be undocumented; the option is used only by tests.
 {noformat}
 create alias [-v value][-provider provider-path]  Prompts the user for a 
 credential to be stored as the given alias when a value is not provided via 
 -v. The hadoop.security.credential.provider.path within the core-site.xml 
 file will be used unless a -provider is indicated.
 delete alias [-i][-provider provider-path]Deletes the credential with the 
 provided alias and optionally warns the user when --interactive is used. The 
 hadoop.security.credential.provider.path within the core-site.xml file will 
 be used unless a -provider is indicated.
 list [-provider provider-path]Lists all of the credential aliases The 
 hadoop.security.credential.provider.path within the core-site.xml file will 
 be used unless a -provider is indicated.
 {noformat}





[jira] [Commented] (HADOOP-11745) Incorporate ShellCheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Chris Nauroth (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378985#comment-14378985
 ] 

Chris Nauroth commented on HADOOP-11745:


bq. I've been meaning to look at how exceptions are handled for some of the 
Java-bits but I hadn't gotten that far yet.

I think the closest analogy in the Java world is our Findbugs exclude files.  
These are XML files that describe points in the code that trigger Findbugs 
warnings that we have agreed are spurious.  I think the {{shellcheck disable}} 
comments are similar.  I think if we keep using that, then it would work out 
fine.

bq. At this point, I think/hope the only things left to really look at is 
dev-support itself.

Yes, that's a good point.  I do see a lot of warnings in the dev-support 
scripts.  If this gets intractable, then I suppose as a first step we could 
chicken out and filter dev-support from the ShellCheck results.  :-)

 Incorporate ShellCheck static analysis into Jenkins pre-commit builds.
 --

 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor

 During the shell script rewrite on trunk, we've been using ShellCheck as a 
 static analysis tool to catch common errors.  We can incorporate this 
 directly into Jenkins pre-commit builds.  Jenkins can reply with a -1 on 
 shell script patches that introduce new ShellCheck warnings.





[jira] [Commented] (HADOOP-11745) Incorporate ShellCheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379230#comment-14379230
 ] 

Allen Wittenauer commented on HADOOP-11745:
---

I've opened HADOOP-11746 to take on test-patch.sh.  For now, we should plan on 
excluding dev-support, but this code is *terrible*.

 Incorporate ShellCheck static analysis into Jenkins pre-commit builds.
 --

 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor

 During the shell script rewrite on trunk, we've been using ShellCheck as a 
 static analysis tool to catch common errors.  We can incorporate this 
 directly into Jenkins pre-commit builds.  Jenkins can reply with a -1 on 
 shell script patches that introduce new ShellCheck warnings.





[jira] [Assigned] (HADOOP-11746) rewrite test-patch.sh

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer reassigned HADOOP-11746:
-

Assignee: Allen Wittenauer

 rewrite test-patch.sh
 -

 Key: HADOOP-11746
 URL: https://issues.apache.org/jira/browse/HADOOP-11746
 Project: Hadoop Common
  Issue Type: Bug
  Components: build, test
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer

 This code is bad and you should feel bad.





[jira] [Updated] (HADOOP-11746) rewrite test-patch.sh

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11746:
--
Issue Type: Test  (was: Bug)

 rewrite test-patch.sh
 -

 Key: HADOOP-11746
 URL: https://issues.apache.org/jira/browse/HADOOP-11746
 Project: Hadoop Common
  Issue Type: Test
  Components: build, test
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer

 This code is bad and you should feel bad.





[jira] [Created] (HADOOP-11746) rewrite test-patch.sh

2015-03-24 Thread Allen Wittenauer (JIRA)
Allen Wittenauer created HADOOP-11746:
-

 Summary: rewrite test-patch.sh
 Key: HADOOP-11746
 URL: https://issues.apache.org/jira/browse/HADOOP-11746
 Project: Hadoop Common
  Issue Type: Bug
  Components: build, test
Affects Versions: 3.0.0
Reporter: Allen Wittenauer


This code is bad and you should feel bad.





[jira] [Commented] (HADOOP-11741) Add LOG.isDebugEnabled() guard for some LOG.debug(..)

2015-03-24 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379229#comment-14379229
 ] 

Hadoop QA commented on HADOOP-11741:


{color:red}-1 overall{color}.  Here are the results of testing the latest 
attachment 
  
http://issues.apache.org/jira/secure/attachment/12707112/HADOOP-11741.002.patch
  against trunk revision 53a28af.

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:red}-1 tests included{color}.  The patch doesn't appear to include 
any new or modified tests.
Please justify why no new tests are needed for this 
patch.
Also please list what manual steps were performed to 
verify this patch.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  There were no new javadoc warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 2.0.3) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:red}-1 core tests{color}.  The patch failed these unit tests in 
hadoop-common-project/hadoop-common:

  org.apache.hadoop.ipc.TestRPCWaitForProxy

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5991//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5991//console

This message is automatically generated.

 Add LOG.isDebugEnabled() guard for some LOG.debug(..)
 -

 Key: HADOOP-11741
 URL: https://issues.apache.org/jira/browse/HADOOP-11741
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Walter Su
Assignee: Walter Su
 Attachments: HADOOP-11741.001.patch, HADOOP-11741.002.patch


 {{isDebugEnabled()}} is optional, but when the arguments involve:
 1. lots of String concatenation, or
 2. complicated function calls,
 {{LOG.debug(..)}} should be guarded with 
 {{LOG.isDebugEnabled()}} to avoid unnecessary argument evaluation and improve 
 performance.
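 The cost the guard avoids can be sketched with a minimal stand-in for the 
 logging API ({{FakeLog}} and {{expensiveDetail()}} are illustrative, not 
 Hadoop classes):

```java
public class DebugGuardDemo {
    // Minimal stand-in for a commons-logging style Log.
    static class FakeLog {
        private final boolean debugEnabled;
        FakeLog(boolean enabled) { debugEnabled = enabled; }
        boolean isDebugEnabled() { return debugEnabled; }
        void debug(Object msg) {
            if (debugEnabled) System.out.println("DEBUG: " + msg);
        }
    }

    static int expensiveCalls = 0;

    // Stands in for costly String concatenation or a complicated call.
    static String expensiveDetail() {
        expensiveCalls++;
        return "detail";
    }

    public static void main(String[] args) {
        FakeLog log = new FakeLog(false);

        // Unguarded: the argument is built even though nothing is logged.
        log.debug("state = " + expensiveDetail());

        // Guarded: the argument is never evaluated when debug is off.
        if (log.isDebugEnabled()) {
            log.debug("state = " + expensiveDetail());
        }

        System.out.println(expensiveCalls); // 1: only the unguarded call paid the cost
    }
}
```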





[jira] [Commented] (HADOOP-11745) Incorporate ShellCheck static analysis into Jenkins pre-commit builds.

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11745?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379046#comment-14379046
 ] 

Allen Wittenauer commented on HADOOP-11745:
---

Looking at this now, shellcheck does support spitting out something that is 
checkstyle-compatible, but I'm not sure how to integrate that into our build 
setup.

 Someone asked me the other day if I was going to fix dev-support too since 
technically it is mostly shell code. I guess I'll probably take a look at the 
dev-support/ stuff and start whacking at it. 

 Incorporate ShellCheck static analysis into Jenkins pre-commit builds.
 --

 Key: HADOOP-11745
 URL: https://issues.apache.org/jira/browse/HADOOP-11745
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build, scripts
Reporter: Chris Nauroth
Priority: Minor

 During the shell script rewrite on trunk, we've been using ShellCheck as a 
 static analysis tool to catch common errors.  We can incorporate this 
 directly into Jenkins pre-commit builds.  Jenkins can reply with a -1 on 
 shell script patches that introduce new ShellCheck warnings.





[jira] [Commented] (HADOOP-10670) Allow AuthenticationFilter to respect signature secret file even without AuthenticationFilterInitializer

2015-03-24 Thread Kai Zheng (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379103#comment-14379103
 ] 

Kai Zheng commented on HADOOP-10670:


Thanks [~wheat9] and [~rkanter] for the review and discussion. I will update 
accordingly.

 Allow AuthenticationFilter to respect signature secret file even without 
 AuthenticationFilterInitializer
 

 Key: HADOOP-10670
 URL: https://issues.apache.org/jira/browse/HADOOP-10670
 Project: Hadoop Common
  Issue Type: Improvement
  Components: security
Reporter: Kai Zheng
Assignee: Kai Zheng
Priority: Minor
 Attachments: HADOOP-10670-v4.patch, HADOOP-10670-v5.patch, 
 hadoop-10670-v2.patch, hadoop-10670-v3.patch, hadoop-10670.patch


 In Hadoop web console, by using AuthenticationFilterInitializer, it's allowed 
 to configure AuthenticationFilter for the required signature secret by 
 specifying signature.secret.file property. This improvement would also allow 
 this when AuthenticationFilterInitializer isn't used in situations like 
 webhdfs.





[jira] [Updated] (HADOOP-11741) Add LOG.isDebugEnabled() guard for some LOG.debug(..)

2015-03-24 Thread Walter Su (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Walter Su updated HADOOP-11741:
---
Attachment: HADOOP-11741.002.patch

 Add LOG.isDebugEnabled() guard for some LOG.debug(..)
 -

 Key: HADOOP-11741
 URL: https://issues.apache.org/jira/browse/HADOOP-11741
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Walter Su
Assignee: Walter Su
 Attachments: HADOOP-11741.001.patch, HADOOP-11741.002.patch


 {{isDebugEnabled()}} is optional, but when the arguments involve:
 1. lots of String concatenation, or
 2. complicated function calls,
 {{LOG.debug(..)}} should be guarded with 
 {{LOG.isDebugEnabled()}} to avoid unnecessary argument evaluation and improve 
 performance.





[jira] [Commented] (HADOOP-11746) rewrite test-patch.sh

2015-03-24 Thread Giridharan Kesavan (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379244#comment-14379244
 ] 

Giridharan Kesavan commented on HADOOP-11746:
-

[~aw], are you planning to rewrite test-patch.sh in Python? 
I'm interested in helping in any way, whether writing the Python code or 
testing it out. Please let me know. 


 rewrite test-patch.sh
 -

 Key: HADOOP-11746
 URL: https://issues.apache.org/jira/browse/HADOOP-11746
 Project: Hadoop Common
  Issue Type: Test
  Components: build, test
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer

 This code is bad and you should feel bad.





[jira] [Commented] (HADOOP-11724) DistCp throws NPE when the target directory is root.

2015-03-24 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379325#comment-14379325
 ] 

Hadoop QA commented on HADOOP-11724:


{color:red}-1 overall{color}.  Here are the results of testing the latest 
attachment 
  
http://issues.apache.org/jira/secure/attachment/12707138/HADOOP-11724.001.patch
  against trunk revision 53a28af.

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:red}-1 tests included{color}.  The patch doesn't appear to include 
any new or modified tests.
Please justify why no new tests are needed for this 
patch.
Also please list what manual steps were performed to 
verify this patch.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  There were no new javadoc warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 2.0.3) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-tools/hadoop-distcp.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5992//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5992//console

This message is automatically generated.

 DistCp throws NPE when the target directory is root.
 

 Key: HADOOP-11724
 URL: https://issues.apache.org/jira/browse/HADOOP-11724
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Lei (Eddy) Xu
Assignee: Lei (Eddy) Xu
Priority: Minor
 Attachments: HADOOP-11724.000.patch, HADOOP-11724.001.patch


 Distcp throws an NPE when the target directory is root, because 
 {{CopyCommitter#cleanupTempFiles}} attempts to delete the parent directory of 
 root, which is {{null}}:
 {code}
 $ hadoop distcp pom.xml hdfs://localhost/
 15/03/17 11:17:44 WARN util.NativeCodeLoader: Unable to load native-hadoop 
 library for your platform... using builtin-java classes where applicable
 15/03/17 11:17:45 INFO tools.DistCp: Input Options: 
 DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, 
 ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', 
 copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[pom.xml], 
 targetPath=hdfs://localhost/, targetPathExists=true, preserveRawXattrs=false}
 15/03/17 11:17:45 INFO Configuration.deprecation: session.id is deprecated. 
 Instead, use dfs.metrics.session-id
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with 
 processName=JobTracker, sessionId=
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.mb is deprecated. 
 Instead, use mapreduce.task.io.sort.mb
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.factor is 
 deprecated. Instead, use mapreduce.task.io.sort.factor
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with 
 processName=JobTracker, sessionId= - already initialized
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: number of splits:1
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: 
 job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: The url to track the job: 
 http://localhost:8080/
 15/03/17 11:17:46 INFO tools.DistCp: DistCp job-id: job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: Running job: job_local992233322_0001
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter set in config 
 null
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter is 
 org.apache.hadoop.tools.mapred.CopyCommitter
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Waiting for map tasks
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Starting task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree 
 currently is supported only on Linux.
 15/03/17 11:17:46 INFO mapred.Task:  Using ResourceCalculatorProcessTree : 
 null
 15/03/17 11:17:46 INFO mapred.MapTask: Processing split: 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/fileList.seq:0+220
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.CopyMapper: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml 

[jira] [Commented] (HADOOP-11746) rewrite test-patch.sh

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379301#comment-14379301
 ] 

Allen Wittenauer commented on HADOOP-11746:
---

I'm going to go with bash for now, simply because doing it in Python would 
require so many shell-outs that half the code would be bash anyway.  I *might* 
write some helper code in Python to replace the wget, though; fetching from 
JIRA via REST would be significantly better.

As it is, just running shellcheck on the existing code has found quite a few 
subtle bugs. :(

 rewrite test-patch.sh
 -

 Key: HADOOP-11746
 URL: https://issues.apache.org/jira/browse/HADOOP-11746
 Project: Hadoop Common
  Issue Type: Test
  Components: build, test
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer

 This code is bad and you should feel bad.





[jira] [Updated] (HADOOP-11746) rewrite test-patch.sh

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11746?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11746:
--
Attachment: HADOOP-11746-00.patch

-00:
* initial pass that just fixes the vast majority of shellcheck errors

 rewrite test-patch.sh
 -

 Key: HADOOP-11746
 URL: https://issues.apache.org/jira/browse/HADOOP-11746
 Project: Hadoop Common
  Issue Type: Test
  Components: build, test
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer
 Attachments: HADOOP-11746-00.patch


 This code is bad and you should feel bad.





[jira] [Commented] (HADOOP-11733) mvn clean does not clean /tmp

2015-03-24 Thread Akira AJISAKA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11733?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379342#comment-14379342
 ] 

Akira AJISAKA commented on HADOOP-11733:


bq. +1 for tracking down the individual tests that write to /tmp and changing 
them to write to the directory defined by the {{test.build.dir}} system 
property.
Makes sense to me, but I think the cause of this issue is different.
* For hadoop-root, the property hadoop.tmp.dir is set to /tmp/hadoop-${user.name} 
by default, so some files are created in /tmp. We should override the 
configuration in tests.
* For hsperfdata_root, I think it's a Java issue. You can change 
{{java.io.tmpdir}} to a directory other than /tmp to avoid the issue.
* For Jetty and hdfs-nfs, we should change some configuration to avoid writing 
to /tmp, but I'm not yet sure which settings to change.

 mvn clean does not clean /tmp
 -

 Key: HADOOP-11733
 URL: https://issues.apache.org/jira/browse/HADOOP-11733
 Project: Hadoop Common
  Issue Type: Bug
  Components: build, test
Affects Versions: 2.6.0
 Environment: All
Reporter: Tony Reix
Assignee: Akira AJISAKA
Priority: Minor

 When Hadoop tests are run, files and directories are created in /tmp .
 Many (or all) of them are not cleaned when running mvn clean.
 This causes an issue when tests were previously run (by mistake) as root 
 and then run again as a non-root user, since directories (like .hdfs-nfs, 
 hadoop-root, hsperfdata_root) cannot be written by the second test run.
 List of files/directories that seem to be created in /tmp when testing Hadoop:
Jetty* *test *deferred .hdfs-nfs hadoop-root hsperfdata_root
 and that should be cleaned after tests.





[jira] [Commented] (HADOOP-11724) DistCp throws NPE when the target directory is root.

2015-03-24 Thread Yongjun Zhang (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379266#comment-14379266
 ] 

Yongjun Zhang commented on HADOOP-11724:


Hi [~eddyxu],

Thanks for reporting the issue and for the patch. I reviewed it; it seems that 
a more appropriate fix would be to handle a null parent correctly in 
{{CopyCommitter#deleteTempFiles}}, rather than skipping the call to it when the 
parent is null. Thanks.
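The corner case is reproducible with plain java.nio.file paths, whose 
getParent() also returns null at the root. {{safeDeleteParent}} and 
{{deleteDir}} below are hypothetical sketches of the suggested guard, not the 
actual DistCp code:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class RootParentDemo {
    // Hedged sketch of the suggested null-guard; deleteDir stands in
    // for the real directory-deletion call inside cleanupTempFiles.
    static boolean safeDeleteParent(Path target) {
        Path parent = target.getParent();
        if (parent == null) {
            return false; // target is the root; there is no parent to delete
        }
        return deleteDir(parent);
    }

    static boolean deleteDir(Path dir) {
        // Placeholder: a real implementation would delete the directory tree.
        return true;
    }

    public static void main(String[] args) {
        System.out.println(Paths.get("/").getParent() == null);    // true
        System.out.println(safeDeleteParent(Paths.get("/")));      // false
        System.out.println(safeDeleteParent(Paths.get("/tmp/x"))); // true
    }
}
```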





 DistCp throws NPE when the target directory is root.
 

 Key: HADOOP-11724
 URL: https://issues.apache.org/jira/browse/HADOOP-11724
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Lei (Eddy) Xu
Assignee: Lei (Eddy) Xu
Priority: Minor
 Attachments: HADOOP-11724.000.patch


 Distcp throws an NPE when the target directory is root, because 
 {{CopyCommitter#cleanupTempFiles}} attempts to delete the parent directory of 
 root, which is {{null}}:
 {code}
 $ hadoop distcp pom.xml hdfs://localhost/
 15/03/17 11:17:44 WARN util.NativeCodeLoader: Unable to load native-hadoop 
 library for your platform... using builtin-java classes where applicable
 15/03/17 11:17:45 INFO tools.DistCp: Input Options: 
 DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, 
 ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', 
 copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[pom.xml], 
 targetPath=hdfs://localhost/, targetPathExists=true, preserveRawXattrs=false}
 15/03/17 11:17:45 INFO Configuration.deprecation: session.id is deprecated. 
 Instead, use dfs.metrics.session-id
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with 
 processName=JobTracker, sessionId=
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.mb is deprecated. 
 Instead, use mapreduce.task.io.sort.mb
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.factor is 
 deprecated. Instead, use mapreduce.task.io.sort.factor
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with 
 processName=JobTracker, sessionId= - already initialized
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: number of splits:1
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: 
 job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: The url to track the job: 
 http://localhost:8080/
 15/03/17 11:17:46 INFO tools.DistCp: DistCp job-id: job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: Running job: job_local992233322_0001
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter set in config 
 null
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter is 
 org.apache.hadoop.tools.mapred.CopyCommitter
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Waiting for map tasks
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Starting task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree 
 currently is supported only on Linux.
 15/03/17 11:17:46 INFO mapred.Task:  Using ResourceCalculatorProcessTree : 
 null
 15/03/17 11:17:46 INFO mapred.MapTask: Processing split: 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/fileList.seq:0+220
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.CopyMapper: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.CopyMapper: Skipping copy of 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.LocalJobRunner:
 15/03/17 11:17:46 INFO mapred.Task: 
 Task:attempt_local992233322_0001_m_00_0 is done. And is in the process of 
 committing
 15/03/17 11:17:46 INFO mapred.LocalJobRunner:
 15/03/17 11:17:46 INFO mapred.Task: Task 
 attempt_local992233322_0001_m_00_0 is allowed to commit now
 15/03/17 11:17:46 INFO output.FileOutputCommitter: Saved output of task 
 'attempt_local992233322_0001_m_00_0' to 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/_logs/_temporary/0/task_local992233322_0001_m_00
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.Task: Task 
 'attempt_local992233322_0001_m_00_0' done.
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Finishing task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: map task executor complete.
 

[jira] [Commented] (HADOOP-11738) Protocol Buffers 2.5 no longer available for download

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379269#comment-14379269
 ] 

Tsuyoshi Ozawa commented on HADOOP-11738:
-

Great news: Protobuf community has created a tag for 2.5.0! 
https://github.com/google/protobuf/issues/251#issuecomment-85767877


 Protocol Buffers 2.5 no longer available for download
 -

 Key: HADOOP-11738
 URL: https://issues.apache.org/jira/browse/HADOOP-11738
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Tsuyoshi Ozawa

 From REEF-216:
 {quote}
 Google recently switched off Google Code. They transferred the
 Protocol Buffers project to
 [GitHub|https://github.com/google/protobuf], and binaries are
 available from [Google's developer
 page|https://developers.google.com/protocol-buffers/docs/downloads].
 However, only the most recent version is available. We use version 2.5
 to be compatible with Hadoop. That version isn't available for
 download. 
 {quote}
 Our BUILDING.txt has the same issue.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11738) Fix link of Protocol Buffers 2.5 for download in BUILDING.txt

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tsuyoshi Ozawa updated HADOOP-11738:

Summary: Fix link of Protocol Buffers 2.5 for download in BUILDING.txt   
(was: Protocol Buffers 2.5 no longer available for download)

 Fix link of Protocol Buffers 2.5 for download in BUILDING.txt 
 --

 Key: HADOOP-11738
 URL: https://issues.apache.org/jira/browse/HADOOP-11738
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Tsuyoshi Ozawa

 From REEF-216:
 {quote}
 Google recently switched off Google Code. They transferred the
 Protocol Buffers project to
 [GitHub|https://github.com/google/protobuf], and binaries are
 available from [Google's developer
 page|https://developers.google.com/protocol-buffers/docs/downloads].
 However, only the most recent version is available. We use version 2.5
 to be compatible with Hadoop. That version isn't available for
 download. 
 {quote}
 Our BUILDING.txt has the same issue.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-11406) xargs -P is not portable

2015-03-24 Thread Kengo Seki (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379279#comment-14379279
 ] 

Kengo Seki commented on HADOOP-11406:
-

Thanks [~aw], it sounds better. I'll do as follows:

- in hadoop-functions.sh, move the for loop into another function as the 
default.
- put the parallel xargs one into hadoop-user-functions.sh.example as an example 
to override the above.
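A rough sketch of that split might look like the following. The function and variable names here are illustrative stand-ins, not the actual patch: a portable serial loop as the default in hadoop-functions.sh, with a GNU-xargs parallel variant as the override example for hadoop-user-functions.sh.example.

```shell
#!/usr/bin/env bash
# Portable serial default, suitable for hadoop-functions.sh
# (names are hypothetical; the real functions may differ):
hadoop_connect_to_hosts ()
{
  local host
  # iterate over the worker list one host at a time; works on any POSIX xargs-free path
  for host in ${HADOOP_WORKER_NAMES}; do
    ssh "${host}" "$@"
  done
}

# Parallel override a user could place in hadoop-user-functions.sh.example,
# for platforms whose xargs supports the GNU -P extension:
hadoop_connect_to_hosts_parallel ()
{
  printf '%s\n' ${HADOOP_WORKER_NAMES} | xargs -n 1 -P 4 -I{} ssh {} "$@"
}
```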


 xargs -P is not portable
 

 Key: HADOOP-11406
 URL: https://issues.apache.org/jira/browse/HADOOP-11406
 Project: Hadoop Common
  Issue Type: Bug
  Components: scripts
Affects Versions: 3.0.0
 Environment: Solaris
 Illumos
 AIX
 ... likely others
Reporter: Allen Wittenauer
Assignee: Kengo Seki
Priority: Critical
 Attachments: HADOOP-11406.001.patch


 hadoop-functions.sh uses xargs -P in the ssh handler.  -P is a GNU extension 
 and is not available on all operating systems.  We should add some detection 
 for support and perform an appropriate action.
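 One possible shape for such detection (purely illustrative, not the committed 
 fix) is to probe the local xargs once and remember the result:

```shell
# Probe whether the local xargs accepts the -P (parallel) extension.
# Solaris and AIX xargs, among others, will fail this probe.
if echo probe | xargs -n 1 -P 2 true 2>/dev/null; then
  HADOOP_XARGS_PARALLEL=true
else
  HADOOP_XARGS_PARALLEL=false
fi
echo "parallel xargs supported: ${HADOOP_XARGS_PARALLEL}"
```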



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11738) Fix a link of Protocol Buffers 2.5 for download in BUILDING.txt

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tsuyoshi Ozawa updated HADOOP-11738:

Summary: Fix a link of Protocol Buffers 2.5 for download in BUILDING.txt   
(was: Fix link of Protocol Buffers 2.5 for download in BUILDING.txt )

 Fix a link of Protocol Buffers 2.5 for download in BUILDING.txt 
 

 Key: HADOOP-11738
 URL: https://issues.apache.org/jira/browse/HADOOP-11738
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Tsuyoshi Ozawa

 From REEF-216:
 {quote}
 Google recently switched off Google Code. They transferred the
 Protocol Buffers project to
 [GitHub|https://github.com/google/protobuf], and binaries are
 available from [Google's developer
 page|https://developers.google.com/protocol-buffers/docs/downloads].
 However, only the most recent version is available. We use version 2.5
 to be compatible with Hadoop. That version isn't available for
 download. 
 {quote}
 Our BUILDING.txt has the same issue.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11724) DistCp throws NPE when the target directory is root.

2015-03-24 Thread Lei (Eddy) Xu (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lei (Eddy) Xu updated HADOOP-11724:
---
Attachment: HADOOP-11724.001.patch

Thanks a lot [~yzhangal]. It is a great suggestion. I updated the patch per 
your comments. Would you mind taking another look?

 DistCp throws NPE when the target directory is root.
 

 Key: HADOOP-11724
 URL: https://issues.apache.org/jira/browse/HADOOP-11724
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Lei (Eddy) Xu
Assignee: Lei (Eddy) Xu
Priority: Minor
 Attachments: HADOOP-11724.000.patch, HADOOP-11724.001.patch


 Distcp throws NPE when the target directory is root. It is due to 
 {{CopyCommitter#cleanupTempFiles}} attempts to delete parent directory of 
 root, which is {{null}}:
 {code}
 $ hadoop distcp pom.xml hdfs://localhost/
 15/03/17 11:17:44 WARN util.NativeCodeLoader: Unable to load native-hadoop 
 library for your platform... using builtin-java classes where applicable
 15/03/17 11:17:45 INFO tools.DistCp: Input Options: 
 DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, 
 ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', 
 copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[pom.xml], 
 targetPath=hdfs://localhost/, targetPathExists=true, preserveRawXattrs=false}
 15/03/17 11:17:45 INFO Configuration.deprecation: session.id is deprecated. 
 Instead, use dfs.metrics.session-id
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with 
 processName=JobTracker, sessionId=
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.mb is deprecated. 
 Instead, use mapreduce.task.io.sort.mb
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.factor is 
 deprecated. Instead, use mapreduce.task.io.sort.factor
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with 
 processName=JobTracker, sessionId= - already initialized
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: number of splits:1
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: 
 job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: The url to track the job: 
 http://localhost:8080/
 15/03/17 11:17:46 INFO tools.DistCp: DistCp job-id: job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: Running job: job_local992233322_0001
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter set in config 
 null
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter is 
 org.apache.hadoop.tools.mapred.CopyCommitter
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Waiting for map tasks
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Starting task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree 
 currently is supported only on Linux.
 15/03/17 11:17:46 INFO mapred.Task:  Using ResourceCalculatorProcessTree : 
 null
 15/03/17 11:17:46 INFO mapred.MapTask: Processing split: 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/fileList.seq:0+220
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.CopyMapper: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.CopyMapper: Skipping copy of 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.LocalJobRunner:
 15/03/17 11:17:46 INFO mapred.Task: 
 Task:attempt_local992233322_0001_m_00_0 is done. And is in the process of 
 committing
 15/03/17 11:17:46 INFO mapred.LocalJobRunner:
 15/03/17 11:17:46 INFO mapred.Task: Task 
 attempt_local992233322_0001_m_00_0 is allowed to commit now
 15/03/17 11:17:46 INFO output.FileOutputCommitter: Saved output of task 
 'attempt_local992233322_0001_m_00_0' to 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/_logs/_temporary/0/task_local992233322_0001_m_00
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.Task: Task 
 'attempt_local992233322_0001_m_00_0' done.
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Finishing task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: map task executor complete.
 15/03/17 11:17:46 INFO mapred.CopyCommitter: Remove parent: null for 
 hdfs://localhost/
 15/03/17 11:17:46 WARN 

[jira] [Assigned] (HADOOP-11733) mvn clean does not clean /tmp

2015-03-24 Thread Akira AJISAKA (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11733?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Akira AJISAKA reassigned HADOOP-11733:
--

Assignee: Akira AJISAKA

 mvn clean does not clean /tmp
 -

 Key: HADOOP-11733
 URL: https://issues.apache.org/jira/browse/HADOOP-11733
 Project: Hadoop Common
  Issue Type: Bug
  Components: build, test
Affects Versions: 2.6.0
 Environment: All
Reporter: Tony Reix
Assignee: Akira AJISAKA
Priority: Minor

 When Hadoop tests are run, files and directories are created in /tmp .
 Many (or all) of them are not cleaned when running mvn clean.
 This generates an issue when tests were previously run (by mistake) as root 
 and then run again as a non-root user, since directories (like .hdfs-nfs, 
 hadoop-root, hsperfdata_root) cannot be written by the second test run.
 List of files/directories that seem to be created in /tmp when testing Hadoop:
Jetty* *test *deferred .hdfs-nfs hadoop-root hsperfdata_root
 and that should be cleaned after tests.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-11724) DistCp throws NPE when the target directory is root.

2015-03-24 Thread Lei (Eddy) Xu (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14379330#comment-14379330
 ] 

Lei (Eddy) Xu commented on HADOOP-11724:


No test is included, because the change is trivial.
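The essence of the fix, guarding against the null parent that Path#getParent() returns for the root directory, can be sketched as follows. The class and helper below are hypothetical stand-ins for illustration, not the actual CopyCommitter code:

```java
// Sketch: Path#getParent() returns null for the filesystem root,
// so temp-file cleanup must check before trying to delete the parent.
public class ParentGuardSketch {
    // crude stand-in for org.apache.hadoop.fs.Path#getParent()
    static String parentOf(String path) {
        if (path.equals("/")) {
            return null;                       // root has no parent
        }
        int idx = path.lastIndexOf('/');
        return idx == 0 ? "/" : path.substring(0, idx);
    }

    // returns true if cleanup ran, false if it was skipped for root
    static boolean cleanupTempFiles(String targetPath) {
        String parent = parentOf(targetPath);
        if (parent == null) {
            // target is the root; nothing to delete, and no NPE
            return false;
        }
        // a real implementation would delete(parent) here
        return true;
    }

    public static void main(String[] args) {
        System.out.println(cleanupTempFiles("/"));          // root is guarded
        System.out.println(cleanupTempFiles("/tmp/stage")); // normal path cleans up
    }
}
```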

 DistCp throws NPE when the target directory is root.
 

 Key: HADOOP-11724
 URL: https://issues.apache.org/jira/browse/HADOOP-11724
 Project: Hadoop Common
  Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Lei (Eddy) Xu
Assignee: Lei (Eddy) Xu
Priority: Minor
 Attachments: HADOOP-11724.000.patch, HADOOP-11724.001.patch


 Distcp throws NPE when the target directory is root. It is due to 
 {{CopyCommitter#cleanupTempFiles}} attempts to delete parent directory of 
 root, which is {{null}}:
 {code}
 $ hadoop distcp pom.xml hdfs://localhost/
 15/03/17 11:17:44 WARN util.NativeCodeLoader: Unable to load native-hadoop 
 library for your platform... using builtin-java classes where applicable
 15/03/17 11:17:45 INFO tools.DistCp: Input Options: 
 DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, 
 ignoreFailures=false, maxMaps=20, sslConfigurationFile='null', 
 copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[pom.xml], 
 targetPath=hdfs://localhost/, targetPathExists=true, preserveRawXattrs=false}
 15/03/17 11:17:45 INFO Configuration.deprecation: session.id is deprecated. 
 Instead, use dfs.metrics.session-id
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with 
 processName=JobTracker, sessionId=
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.mb is deprecated. 
 Instead, use mapreduce.task.io.sort.mb
 15/03/17 11:17:45 INFO Configuration.deprecation: io.sort.factor is 
 deprecated. Instead, use mapreduce.task.io.sort.factor
 15/03/17 11:17:45 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with 
 processName=JobTracker, sessionId= - already initialized
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: number of splits:1
 15/03/17 11:17:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: 
 job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: The url to track the job: 
 http://localhost:8080/
 15/03/17 11:17:46 INFO tools.DistCp: DistCp job-id: job_local992233322_0001
 15/03/17 11:17:46 INFO mapreduce.Job: Running job: job_local992233322_0001
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter set in config 
 null
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: OutputCommitter is 
 org.apache.hadoop.tools.mapred.CopyCommitter
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Waiting for map tasks
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Starting task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree 
 currently is supported only on Linux.
 15/03/17 11:17:46 INFO mapred.Task:  Using ResourceCalculatorProcessTree : 
 null
 15/03/17 11:17:46 INFO mapred.MapTask: Processing split: 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/fileList.seq:0+220
 15/03/17 11:17:46 INFO output.FileOutputCommitter: File Output Committer 
 Algorithm version is 1
 15/03/17 11:17:46 INFO mapred.CopyMapper: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.CopyMapper: Skipping copy of 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.LocalJobRunner:
 15/03/17 11:17:46 INFO mapred.Task: 
 Task:attempt_local992233322_0001_m_00_0 is done. And is in the process of 
 committing
 15/03/17 11:17:46 INFO mapred.LocalJobRunner:
 15/03/17 11:17:46 INFO mapred.Task: Task 
 attempt_local992233322_0001_m_00_0 is allowed to commit now
 15/03/17 11:17:46 INFO output.FileOutputCommitter: Saved output of task 
 'attempt_local992233322_0001_m_00_0' to 
 file:/tmp/hadoop/mapred/staging/lei2046334351/.staging/_distcp-1889397390/_logs/_temporary/0/task_local992233322_0001_m_00
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Copying 
 file:/Users/lei/work/cloudera/s3a_cp_target/pom.xml to 
 hdfs://localhost/pom.xml
 15/03/17 11:17:46 INFO mapred.Task: Task 
 'attempt_local992233322_0001_m_00_0' done.
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: Finishing task: 
 attempt_local992233322_0001_m_00_0
 15/03/17 11:17:46 INFO mapred.LocalJobRunner: map task executor complete.
 15/03/17 11:17:46 INFO mapred.CopyCommitter: Remove parent: null for 
 hdfs://localhost/
 15/03/17 11:17:46 WARN mapred.CopyCommitter: Unable to cleanup temp files
 

[jira] [Commented] (HADOOP-11741) Add LOG.isDebugEnabled() guard for some LOG.debug(..)

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378082#comment-14378082
 ] 

Tsuyoshi Ozawa commented on HADOOP-11741:
-

[~walter.k.su] thank you for taking this issue. LGTM overall. One minor nit: 
there is a line here that exceeds 80 characters. Could you fix it?
{code}
   LOG.debug("RpcKind = " + rpcKind + " Protocol Name = " + protocolName + 
 " version=" + version +
{code}



 Add LOG.isDebugEnabled() guard for some LOG.debug(..)
 -

 Key: HADOOP-11741
 URL: https://issues.apache.org/jira/browse/HADOOP-11741
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Walter Su
Assignee: Walter Su
 Attachments: HADOOP-11741.001.patch


 {{isDebugEnabled()}} is optional. But when there are :
 1. lots of concatenating Strings
 2. complicated function calls
 in the arguments, {{LOG.debug(..)}} should be guarded with 
 {{LOG.isDebugEnabled()}} to avoid unnecessary argument evaluation and improve 
 performance.
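 The effect of the guard can be illustrated with a small stand-alone sketch (a 
 stand-in logger is used here; Hadoop itself logs through commons-logging/slf4j):

```java
// Minimal illustration of the isDebugEnabled() guard pattern.
public class DebugGuardSketch {
    static boolean debugEnabled = false;
    static int evaluations = 0;

    // stands in for an expensive argument construction
    static String expensive() {
        evaluations++;
        return "state=" + System.nanoTime();
    }

    // stands in for LOG.debug(..)
    static void debug(String msg) {
        if (debugEnabled) System.out.println(msg);
    }

    public static void main(String[] args) {
        // Unguarded: expensive() runs even though debug logging is off.
        debug("value: " + expensive());
        // Guarded: the argument is never built when debug logging is off.
        if (debugEnabled) {
            debug("value: " + expensive());
        }
        System.out.println(evaluations); // only the unguarded call evaluated
    }
}
```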



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11524) hadoop_do_classpath_subcommand throws a shellcheck warning

2015-03-24 Thread Chris Nauroth (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chris Nauroth updated HADOOP-11524:
---
Status: Patch Available  (was: Open)

 hadoop_do_classpath_subcommand throws a shellcheck warning
 --

 Key: HADOOP-11524
 URL: https://issues.apache.org/jira/browse/HADOOP-11524
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Chris Nauroth
Priority: Minor
 Attachments: HADOOP-11524.001.patch


 {code}
 CLASS=org.apache.hadoop.util.Classpath
 ^-- SC2034: CLASS appears unused. Verify it or export it.
 {code}
 We should probably use a local var here and return it or something, even 
 though CLASS is technically a global.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11731:
--
Attachment: HADOOP-11731-04.patch

Correct -04 patch this time.

 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11731-00.patch, HADOOP-11731-01.patch, 
 HADOOP-11731-03.patch, HADOOP-11731-04.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of release notes, very hard to pick out what is 
 important.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HADOOP-11743) maven doesn't clean all the site files

2015-03-24 Thread Allen Wittenauer (JIRA)
Allen Wittenauer created HADOOP-11743:
-

 Summary: maven doesn't clean all the site files
 Key: HADOOP-11743
 URL: https://issues.apache.org/jira/browse/HADOOP-11743
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer


After building the site files, performing {{mvn clean}} doesn't actually clean 
everything up; git complains about untracked files.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11731:
--
Status: Patch Available  (was: Open)

 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11553-04.patch, HADOOP-11731-00.patch, 
 HADOOP-11731-01.patch, HADOOP-11731-03.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of release notes, very hard to pick out what is 
 important.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11731:
--
Attachment: HADOOP-11553-04.patch

-04:
* self generating indices
* site index hook into that release index
* -Preleasedocs (default is off) added to lower JIRA impact, offline still 
works (minus this), etc
* building updated
* release date detection for already released versions
* some code cleanup as I learn more python

 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11553-04.patch, HADOOP-11731-00.patch, 
 HADOOP-11731-01.patch, HADOOP-11731-03.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of release notes, very hard to pick out what is 
 important.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11731:
--
Status: Open  (was: Patch Available)

 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11553-04.patch, HADOOP-11731-00.patch, 
 HADOOP-11731-01.patch, HADOOP-11731-03.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of release notes, very hard to pick out what is 
 important.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11731:
--
Attachment: (was: HADOOP-11553-04.patch)

 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11731-00.patch, HADOOP-11731-01.patch, 
 HADOOP-11731-03.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of release notes, very hard to pick out what is 
 important.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11731:
--
Release Note: 
* The release notes now only contains JIRA issues with incompatible changes and 
actual release notes.  The generated format has been changed from HTML to 
markdown.

* The changelog is now automatically generated from data stored in JIRA rather 
than manually maintained. The format has been changed from pure text to 
markdown as well as containing more of the information that was previously 
stored in the release notes.

* In order to generate the changes file, python must be installed.

* New -Preleasedocs profile added to maven in order to trigger this 
functionality.

  was:
* The release notes now only contains JIRA issues with incompatible changes and 
actual release notes.  The generated format has been changed from HTML to 
markdown.

* The changelog is now automatically generated from data stored in JIRA rather 
than manually maintained. The format has been changed from pure text to 
markdown as well as containing more of the information that was previously 
stored in the release notes.

* Site documentation generation now requires Python.


 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11731-00.patch, HADOOP-11731-01.patch, 
 HADOOP-11731-03.patch, HADOOP-11731-04.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of release notes, very hard to pick out what is 
 important.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (HADOOP-11524) hadoop_do_classpath_subcommand throws a shellcheck warning

2015-03-24 Thread Chris Nauroth (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chris Nauroth updated HADOOP-11524:
---
Attachment: HADOOP-11524.001.patch

This patch avoids the warning by asking the caller to pass in the output 
variable name to receive the class and then eval'ing it.  I tested it through 
the {{hadoop}}, {{hdfs}}, {{yarn}} and {{mapred}} entry points.  [~aw], how 
does this look?
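The out-variable pattern being described can be sketched roughly like this (a simplified stand-in, not the actual patch):

```shell
#!/usr/bin/env bash
# The caller passes the NAME of the variable that should receive the class;
# the function eval's an assignment into it instead of setting a global
# CLASS, which avoids shellcheck's SC2034 "appears unused" warning.
hadoop_do_classpath_subcommand ()
{
  declare outvar=$1
  eval "${outvar}=org.apache.hadoop.util.Classpath"
}

# Caller side: the entry-point script asks for the class by variable name.
hadoop_do_classpath_subcommand HADOOP_CLASSNAME
echo "${HADOOP_CLASSNAME}"
```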

 hadoop_do_classpath_subcommand throws a shellcheck warning
 --

 Key: HADOOP-11524
 URL: https://issues.apache.org/jira/browse/HADOOP-11524
 Project: Hadoop Common
  Issue Type: Improvement
  Components: scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Chris Nauroth
Priority: Minor
 Attachments: HADOOP-11524.001.patch


 {code}
 CLASS=org.apache.hadoop.util.Classpath
 ^-- SC2034: CLASS appears unused. Verify it or export it.
 {code}
 We should probably use a local var here and return it or something, even 
 though CLASS is technically a global.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-11664) Loading predefined EC schemas from configuration

2015-03-24 Thread Kai Zheng (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377361#comment-14377361
 ] 

Kai Zheng commented on HADOOP-11664:


Let's have it. Will update the patch including the mentioned XML file.

 Loading predefined EC schemas from configuration
 

 Key: HADOOP-11664
 URL: https://issues.apache.org/jira/browse/HADOOP-11664
 Project: Hadoop Common
  Issue Type: Sub-task
Reporter: Kai Zheng
Assignee: Kai Zheng
 Attachments: HADOOP-11664-v2.patch, HDFS-7371_v1.patch


 System administrators can configure multiple EC codecs in the hdfs-site.xml file, 
 and codec instances or schemas in a new configuration file named 
 ec-schema.xml in the conf folder. A codec can be referenced by its instance 
 or schema using the codec name, and a schema can be utilized and specified by 
 the schema name for a folder or EC ZONE to enforce EC. Once a schema is used 
 to define an EC ZONE, then its associated parameter values will be stored as 
 xattributes and respected thereafter.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Chris Nauroth (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378501#comment-14378501
 ] 

Chris Nauroth commented on HADOOP-11553:


Hi Allen.  Thank you for putting together this documentation.

For shelldocmd.py, what Python version did you use in testing?  I don't know 
for sure that it's still the case, but I recall that fairly recent CentOS 
versions are pinned to Python 2.6 as part of the implementation of yum.  I'd 
like to suggest that we stick to base 2.6 with no additional modules required.  
(I don't see any red flags so far.)

UnixShellGuide.md doesn't appear to be hyperlinked from anywhere else in the 
documentation.  Shall we add it to the left nav?

Shall we omit all private non-replaceable functions from the documentation?  On 
the Java side, we filter out the private things.  It's still helpful for 
maintainers to have the full docs on these functions in the source code, but I 
don't think we need to publish it to end users.

{{hadoop_add_colonpath}} probably needs more context in the description.  We 
could mention that the default implementations of {{hadoop_add_javalibpath}} 
and {{hadoop_add_ldlibpath}} use this.  It doesn't currently mention that the 
first argument is the name of the out variable to receive the modification.  
Alternatively, I think we could declare this one as a private non-replaceable 
implementation detail and filter it out of the docs as per above comment.

Here are a few typos I spotted:
* {{is controlled via \[the shell\](CommandsManula.html)}}  (The hyperlink 
target should be CommandsManual.html.)
* {{HADOOP_CLIENT_OPTS=-Xmx1g -Dhadoop.socks.server=localhost:4000 hadoop fs 
-ls /tmp}}  (I think you meant to close the double-quote before the start of 
the hadoop command.)
* {{to run hadoop commands access the server}}  (I think this was supposed to 
be accessing.)
* In the .hadooprc example, the if statement has an extra closing curly brace 
around HADOOP_SERVER.
* {{There are many enironment variables}}  (It should be environment.)
* {{the series of `_OPT` variables}}  (I think this was supposed to be OPTS.)
* {{Advanced administrators may which to supplement}}  (This should be may 
wish.)
* {{provides the capabilities to do funcion overrides}}  (This should be 
function.)
* Different places in the doc say either run time or runtime.  Let's use 
one consistently, probably runtime.
* {{Hadoop's shell code has a \[function 
library\](./HadoopShellFunctionAPI.html)}}  (The hyperlink target doesn't match 
the actual file name, which is UnixShellAPI.html.)
* {{Print a message to stderr if –debug is turuned on}}  (This should use 
turned on.)

 Formalize the shell API
 ---

 Key: HADOOP-11553
 URL: https://issues.apache.org/jira/browse/HADOOP-11553
 Project: Hadoop Common
  Issue Type: Improvement
  Components: documentation, scripts
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Assignee: Allen Wittenauer
Priority: Blocker
 Attachments: HADOOP-11553-00.patch, HADOOP-11553-01.patch, 
 HADOOP-11553-02.patch, HADOOP-11553-03.patch, HADOOP-11553-04.patch


 After HADOOP-11485, we need to formally document functions and environment 
 variables that 3rd parties can expect to be able to exist/use.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378514#comment-14378514
 ] 

Allen Wittenauer commented on HADOOP-11553:
---

bq. For shelldocmd.py, what Python version did you use in testing?

I've been using what is on my Mac, which happens to be 2.7.5. I've been trying 
to avoid using anything too fancy, with 2.5 being the version I'm shooting for.

bq. My last comment was based on the 03 patch, and I fell just behind you 
posting the 04 patch.

:D

I'll go through your comments to see what I missed in -04. 



[jira] [Updated] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11553:
--
Attachment: HADOOP-11553-04.patch

-04:
* toc on the API guide
* integration into site index
* minor typo fixed in do_classpath
* minor cleanup in the shell guide



[jira] [Updated] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11553:
--
Status: Patch Available  (was: Open)



[jira] [Comment Edited] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378481#comment-14378481
 ] 

Allen Wittenauer edited comment on HADOOP-11553 at 3/24/15 8:06 PM:


-04:
* toc on the API guide
* integration into site index
* minor typo fixed in do_classpath
* minor cleanup in the shell guide
* release warning on the shell guide fixed


was (Author: aw):
-04:
* toc on the API guide
* integration into site index
* minor typo fixed in do_classpath
* minor cleanup in the shell guide



[jira] [Updated] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11553:
--
Status: Open  (was: Patch Available)



[jira] [Commented] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Chris Nauroth (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378505#comment-14378505
 ] 

Chris Nauroth commented on HADOOP-11553:


My last comment was based on the 03 patch, and I fell just behind you 
posting the 04 patch.  :-)



[jira] [Commented] (HADOOP-11731) Rework the changelog and releasenotes

2015-03-24 Thread Hadoop QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378528#comment-14378528
 ] 

Hadoop QA commented on HADOOP-11731:


{color:green}+1 overall{color}.  Here are the results of testing the latest 
attachment 
  http://issues.apache.org/jira/secure/attachment/12706981/HADOOP-11731-04.patch
  against trunk revision a16bfff.

{color:green}+1 @author{color}.  The patch does not contain any @author 
tags.

{color:green}+0 tests included{color}.  The patch appears to be a 
documentation patch that doesn't require tests.

{color:green}+1 javac{color}.  The applied patch does not increase the 
total number of javac compiler warnings.

{color:green}+1 javadoc{color}.  There were no new javadoc warning messages.

{color:green}+1 eclipse:eclipse{color}.  The patch built with 
eclipse:eclipse.

{color:green}+1 findbugs{color}.  The patch does not introduce any new 
Findbugs (version 2.0.3) warnings.

{color:green}+1 release audit{color}.  The applied patch does not increase 
the total number of release audit warnings.

{color:green}+1 core tests{color}.  The patch passed unit tests in 
hadoop-common-project/hadoop-common.

Test results: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5989//testReport/
Console output: 
https://builds.apache.org/job/PreCommit-HADOOP-Build/5989//console

This message is automatically generated.

 Rework the changelog and releasenotes
 -

 Key: HADOOP-11731
 URL: https://issues.apache.org/jira/browse/HADOOP-11731
 Project: Hadoop Common
  Issue Type: New Feature
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
 Attachments: HADOOP-11731-00.patch, HADOOP-11731-01.patch, 
 HADOOP-11731-03.patch, HADOOP-11731-04.patch


 The current way we generate these build artifacts is awful.  Plus they are 
 ugly and, in the case of the release notes, it is very hard to pick out what 
 is important.





[jira] [Commented] (HADOOP-11738) Protocol Buffers 2.5 no longer available for download

2015-03-24 Thread Tsuyoshi Ozawa (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14377468#comment-14377468
 ] 

Tsuyoshi Ozawa commented on HADOOP-11738:
-

Based on the commit log, I created a v2.5.0 tag and created a tar.gz file for 
the community: https://github.com/oza/protobuf/releases/tag/v2.5.0

One idea is to change BUILDING.txt to point to that link as a temporary 
solution. Do you have any other ideas?

 Protocol Buffers 2.5 no longer available for download
 -

 Key: HADOOP-11738
 URL: https://issues.apache.org/jira/browse/HADOOP-11738
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Tsuyoshi Ozawa

 From REEF-216:
 {quote}
 Google recently switched off Google Code. They transferred the
 Protocol Buffers project to
 [GitHub|https://github.com/google/protobuf], and binaries are
 available from [Google's developer
 page|https://developers.google.com/protocol-buffers/docs/downloads].
 However, only the most recent version is available. We use version 2.5
 to be compatible with Hadoop. That version isn't available for
 download. 
 {quote}
 Our BUILDING.txt has same issue.





[jira] [Updated] (HADOOP-11743) maven doesn't clean all the site files

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11743?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11743:
--
Priority: Minor  (was: Major)

 maven doesn't clean all the site files
 --

 Key: HADOOP-11743
 URL: https://issues.apache.org/jira/browse/HADOOP-11743
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation
Affects Versions: 3.0.0
Reporter: Allen Wittenauer
Priority: Minor

 After building the site files, performing a mvn clean doesn't actually clean 
 everything up as git complains about untracked files.





[jira] [Created] (HADOOP-11744) Support OAuth2 in Hadoop

2015-03-24 Thread Haohui Mai (JIRA)
Haohui Mai created HADOOP-11744:
---

 Summary: Support OAuth2 in Hadoop
 Key: HADOOP-11744
 URL: https://issues.apache.org/jira/browse/HADOOP-11744
 Project: Hadoop Common
  Issue Type: New Feature
Reporter: Haohui Mai


OAuth2 is a standardized mechanism for authentication and authorization. A 
notable use case of OAuth2 is SSO -- it would be nice to integrate OAuth2 with 
Hadoop.





[jira] [Updated] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11553:
--
Attachment: HADOOP-11553-05.patch

-05:
* Address [~cnauroth]'s feedback (I think)
* shelldocs.py: added the extremely obvious --skipprnorep option to prevent 
private + not-replaceable functions from being output
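
The intent of that filter can be pictured with a small sketch. The function and field names here are illustrative only; shelldocs.py's actual internals may differ:

```python
def should_document(func, skip_private_norep=True):
    """Hypothetical sketch of a --skipprnorep-style filter: drop functions
    whose shelldoc annotations mark them both private-audience and
    not replaceable; document everything else."""
    private = func.get("audience") == "private"
    replaceable = func.get("replaceable", False)
    if skip_private_norep and private and not replaceable:
        return False
    return True

# Toy annotation records standing in for parsed shell-function metadata.
funcs = [
    {"name": "hadoop_add_classpath", "audience": "public", "replaceable": True},
    {"name": "hadoop_bootstrap", "audience": "private", "replaceable": False},
]
kept = [f["name"] for f in funcs if should_document(f)]
```

With these toy records, only the public function survives the filter.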




[jira] [Updated] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11553:
--
Status: Patch Available  (was: Open)



[jira] [Updated] (HADOOP-11553) Formalize the shell API

2015-03-24 Thread Allen Wittenauer (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-11553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Allen Wittenauer updated HADOOP-11553:
--
Status: Open  (was: Patch Available)



[jira] [Commented] (HADOOP-11701) RPC authentication fallback option should support enabling fallback only for specific connections.

2015-03-24 Thread Chris Nauroth (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-11701?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=14378643#comment-14378643
 ] 

Chris Nauroth commented on HADOOP-11701:


Hi Yongjun.  I think we'd want a client-specified policy that describes the 
clusters for which the client is willing to use fallback.  One possible implementation 
choice is a configuration property that contains a list of network addresses 
(host + port) for which fallback is acceptable.  This is nice for usability 
too.  A cluster administrator could put it into core-site.xml for all jobs to 
use, and then users wouldn't need to specify 
{{-Dipc.client.fallback-to-simple-auth-allowed=true}} manually on individual 
jobs.
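
A minimal sketch of that idea, assuming a comma-separated host:port whitelist property (the property name and format below are illustrative, not an actual Hadoop configuration key):

```python
def fallback_allowed(conf, host, port):
    """Return True if simple-auth fallback is permitted for host:port,
    based on a hypothetical per-address whitelist property, instead of
    the global ipc.client.fallback-to-simple-auth-allowed boolean."""
    raw = conf.get("ipc.client.fallback-to-simple-auth-allowed.addresses", "")
    # Parse "host1:port1, host2:port2" into a set of normalized entries.
    allowed = {entry.strip() for entry in raw.split(",") if entry.strip()}
    return "%s:%d" % (host, port) in allowed

# A cluster administrator could ship this list in core-site.xml so that
# individual jobs no longer need the -D override.
conf = {"ipc.client.fallback-to-simple-auth-allowed.addresses":
        "insecure-nn.example.com:8020, legacy.example.com:8020"}
ok = fallback_allowed(conf, "insecure-nn.example.com", 8020)
no = fallback_allowed(conf, "secure-nn.example.com", 8020)
```

Here fallback would be granted only for the two listed addresses; every other connection keeps strict authentication.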

 RPC authentication fallback option should support enabling fallback only for 
 specific connections.
 --

 Key: HADOOP-11701
 URL: https://issues.apache.org/jira/browse/HADOOP-11701
 Project: Hadoop Common
  Issue Type: Improvement
  Components: ipc, security
Reporter: Chris Nauroth

 We currently support the {{ipc.client.fallback-to-simple-auth-allowed}} 
 configuration property so that a client configured with security can fallback 
 to simple authentication when communicating with an unsecured server.  This 
 is a global property that enables the fallback behavior for all RPC 
 connections, even though fallback is only desirable for clusters that are 
 known to be unsecured.  This issue proposes to support configurability of 
 fallback on specific connections, not all connections globally.


