[jira] [Resolved] (HADOOP-7297) Error in the documentation regarding Checkpoint/Backup Node

2011-05-19 Thread Suresh Srinivas (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-7297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Suresh Srinivas resolved HADOOP-7297.
-------------------------------------

Resolution: Not A Problem

I do not think this is a problem for 0.20.203.

 Error in the documentation regarding Checkpoint/Backup Node
 -----------------------------------------------------------

 Key: HADOOP-7297
 URL: https://issues.apache.org/jira/browse/HADOOP-7297
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation
Affects Versions: 0.20.203.0
Reporter: arnaud p
Priority: Trivial

 On 
 http://hadoop.apache.org/common/docs/r0.20.203.0/hdfs_user_guide.html#Checkpoint+Node:
  the command bin/hdfs namenode -checkpoint, required to launch the 
 backup/checkpoint node, does not exist.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Created] (HADOOP-7304) BackUpNameNode is using 100% CPU and not accepting any requests.

2011-05-19 Thread ramkrishna.s.vasudevan (JIRA)
BackUpNameNode is using 100% CPU and not accepting any requests. 
-----------------------------------------------------------------

 Key: HADOOP-7304
 URL: https://issues.apache.org/jira/browse/HADOOP-7304
 Project: Hadoop Common
  Issue Type: Bug
  Components: ipc
Affects Versions: 0.21.0
Reporter: ramkrishna.s.vasudevan
 Fix For: 0.20-append


In our environment, the Backup NameNode is using 100% CPU and has stopped 
accepting any calls after a 3-day-long run.
Thread dump:

"IPC Server Responder" daemon prio=10 tid=0x7f86c41c6800 nid=0x3b2a runnable [0x7f86ce579000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
	at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:215)
	at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:65)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:69)
	- locked 0x7f86d67e2a20 (a sun.nio.ch.Util$1)
	- locked 0x7f86d67e2a08 (a java.util.Collections$UnmodifiableSet)
	- locked 0x7f86d67e26a8 (a sun.nio.ch.EPollSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:80)
	at org.apache.hadoop.ipc.Server$Responder.run(Server.java:501)
It looks like the same issue also occurred in Jetty: 
http://jira.codehaus.org/browse/JETTY-937
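For context, the JETTY-937 link refers to the well-known JDK NIO bug in which Selector.select() returns immediately with zero ready keys, spinning a thread at 100% CPU exactly as the dump above shows. The usual mitigation is to count consecutive empty wakeups and rebuild the selector. The sketch below is illustrative only; the class name, threshold, and structure are assumptions, not the actual Hadoop Server$Responder code.

```java
import java.io.IOException;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;

public class SelectorSpinGuard {
    // Consecutive empty wakeups tolerated before rebuilding
    // (threshold chosen for illustration).
    static final int SPIN_THRESHOLD = 10;

    private Selector selector;
    private int spinCount = 0;

    public SelectorSpinGuard() throws IOException {
        this.selector = Selector.open();
    }

    public Selector selector() {
        return selector;
    }

    // Wraps Selector.select(): if it keeps returning 0 ready keys well
    // before the timeout elapses, assume the epoll spin bug and rebuild.
    public int guardedSelect(long timeoutMs) throws IOException {
        long start = System.currentTimeMillis();
        int ready = selector.select(timeoutMs);
        long elapsed = System.currentTimeMillis() - start;
        if (ready == 0 && elapsed < timeoutMs) {
            if (++spinCount >= SPIN_THRESHOLD) {
                rebuild();
                spinCount = 0;
            }
        } else {
            spinCount = 0;
        }
        return ready;
    }

    // Open a fresh selector, migrate every valid registration to it,
    // and close the old (possibly broken) one.
    private void rebuild() throws IOException {
        Selector fresh = Selector.open();
        for (SelectionKey key : selector.keys()) {
            if (key.isValid()) {
                key.channel().register(fresh, key.interestOps(), key.attachment());
            }
        }
        selector.close();
        selector = fresh;
    }
}
```

Jetty's fix for JETTY-937 followed this same rebuild-on-spin pattern.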


Re: Eclipse target

2011-05-19 Thread Niels Basjes
Hi Todd,

2011/5/19 Todd Lipcon t...@cloudera.com:
 Yes, I have to do this same thing manually every time I re-run ant eclipse.

 So, it seems like we should add it to the eclipse target, like you
 said. Feel free to file a JIRA and patch!

https://issues.apache.org/jira/browse/HADOOP-7305
(Hadoop QA should start in a few minutes to validate this).

-- 
Met vriendelijke groeten,

Niels Basjes


[jira] [Reopened] (HADOOP-7297) Error in the documentation regarding Checkpoint/Backup Node

2011-05-19 Thread Harsh J Chouraria (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-7297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Harsh J Chouraria reopened HADOOP-7297:
---------------------------------------


Reopening, since the docs issue is valid. There are Checkpoint Node and Backup 
Node docs at the tagged svn revision: 
http://svn.apache.org/repos/asf/hadoop/common/tags/release-0.20.203.0/src/docs/src/documentation/content/xdocs/hdfs_user_guide.xml

 Error in the documentation regarding Checkpoint/Backup Node
 -----------------------------------------------------------

 Key: HADOOP-7297
 URL: https://issues.apache.org/jira/browse/HADOOP-7297
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation
Affects Versions: 0.20.203.0
Reporter: arnaud p
Priority: Trivial

 On 
 http://hadoop.apache.org/common/docs/r0.20.203.0/hdfs_user_guide.html#Checkpoint+Node:
  the command bin/hdfs namenode -checkpoint, required to launch the 
 backup/checkpoint node, does not exist.



[jira] [Created] (HADOOP-7306) Start metrics system even if config files are missing

2011-05-19 Thread Luke Lu (JIRA)
Start metrics system even if config files are missing
-----------------------------------------------------

 Key: HADOOP-7306
 URL: https://issues.apache.org/jira/browse/HADOOP-7306
 Project: Hadoop Common
  Issue Type: Improvement
  Components: metrics
Affects Versions: 0.23.0
Reporter: Luke Lu
Assignee: Luke Lu
 Fix For: 0.23.0


Per experience and the discussion on HDFS-1922, it seems preferable to treat a 
missing metrics config file as an empty/default config, which is more compatible 
with the metrics v1 behavior (the MBeans are always registered).
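The policy described above can be sketched as follows. MetricsConfigLoader and the "period" default are illustrative assumptions, not the actual metrics2 implementation; the point is only that a missing file falls back to defaults instead of aborting startup.

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class MetricsConfigLoader {
    public static Properties load(String filename) {
        Properties defaults = new Properties();
        // Hypothetical default: seconds between metric samples.
        defaults.setProperty("period", "10");

        File f = new File(filename);
        if (!f.exists()) {
            // Missing file is not an error: fall back to defaults so the
            // metrics system (and its MBeans) still starts, matching the
            // v1 behavior described above.
            return defaults;
        }

        Properties props = new Properties(defaults);
        try (InputStream in = new FileInputStream(f)) {
            props.load(in);
        } catch (IOException e) {
            // A present-but-unreadable file is a real error, unlike absence.
            throw new RuntimeException("Bad metrics config: " + filename, e);
        }
        return props;
    }
}
```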



Re: Eclipse target

2011-05-19 Thread Todd Lipcon
Committed. Thanks!

On Thu, May 19, 2011 at 5:14 AM, Niels Basjes ni...@basjes.nl wrote:
 Hi Todd,

 2011/5/19 Todd Lipcon t...@cloudera.com:
 Yes, I have to do this same thing manually every time I re-run ant eclipse.

 So, it seems like we should add it to the eclipse target, like you
 said. Feel free to file a JIRA and patch!

 https://issues.apache.org/jira/browse/HADOOP-7305
 (Hadoop QA should start in a few minutes to validate this).

 --
 Met vriendelijke groeten,

 Niels Basjes




-- 
Todd Lipcon
Software Engineer, Cloudera


[jira] [Created] (HADOOP-7309) improve trademark symbol usage and add trademark footer

2011-05-19 Thread Owen O'Malley (JIRA)
improve trademark symbol usage and add trademark footer
-------------------------------------------------------

 Key: HADOOP-7309
 URL: https://issues.apache.org/jira/browse/HADOOP-7309
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Owen O'Malley


We need to mention trademarks only on the first usage, add a footer specifying 
the trademarks, and update the PDF files.



[jira] [Created] (HADOOP-7311) Port remaining metrics v1 from trunk to branch-0.20-security

2011-05-19 Thread Konstantin Shvachko (JIRA)
Port remaining metrics v1 from trunk to branch-0.20-security
------------------------------------------------------------

 Key: HADOOP-7311
 URL: https://issues.apache.org/jira/browse/HADOOP-7311
 Project: Hadoop Common
  Issue Type: Bug
  Components: metrics
Affects Versions: 0.20.203.0
Reporter: Konstantin Shvachko
Assignee: Konstantin Shvachko
 Fix For: 0.20.204.0


HADOOP-7190 added metrics packages/classes. This is a port from trunk to make 
them actually work for the security branch.



Re: Hudson pre-commit job broken

2011-05-19 Thread Todd Lipcon
Must have been some svn bug. I rm -Rf'ed the http/lib directory in
question, ran svn cleanup, then svn up, and it seems OK now.

-Todd

On Thu, May 19, 2011 at 4:14 PM, Aaron T. Myers a...@cloudera.com wrote:
 See this page:
 https://hudson.apache.org/hudson/job/PreCommit-Hadoop-Build/480/console

 And note the following error:


     [exec] svn: This client is too old to work with working copy
 'src/test/core/org/apache/hadoop/http/lib'.  You need
     [exec] to get a newer Subversion client, or to downgrade this working 
 copy.
     [exec] See 
 http://subversion.tigris.org/faq.html#working-copy-format-change
     [exec] for details.


 --
 Aaron T. Myers
 Software Engineer, Cloudera




-- 
Todd Lipcon
Software Engineer, Cloudera


Re: Hudson pre-commit job broken

2011-05-19 Thread Todd Lipcon
Strange... looks like the same issue is happening on the other build
boxes too - I'd fixed h9 but h6 also has the issue.

On Thu, May 19, 2011 at 4:40 PM, Todd Lipcon t...@cloudera.com wrote:
 Must have been some svn bug. I rm -Rfed the http/lib directory in
 question, ran svn cleanup, svn up, and it seems OK now.

 -Todd

 On Thu, May 19, 2011 at 4:14 PM, Aaron T. Myers a...@cloudera.com wrote:
 See this page:
 https://hudson.apache.org/hudson/job/PreCommit-Hadoop-Build/480/console

 And note the following error:


     [exec] svn: This client is too old to work with working copy
 'src/test/core/org/apache/hadoop/http/lib'.  You need
     [exec] to get a newer Subversion client, or to downgrade this working 
 copy.
     [exec] See 
 http://subversion.tigris.org/faq.html#working-copy-format-change
     [exec] for details.


 --
 Aaron T. Myers
 Software Engineer, Cloudera




 --
 Todd Lipcon
 Software Engineer, Cloudera




-- 
Todd Lipcon
Software Engineer, Cloudera


Re: Hudson pre-commit job broken

2011-05-19 Thread Nigel Daley
Maybe it's time to update the Jenkins slave.jar. I'll do that.

Nige

On May 19, 2011, at 4:43 PM, Todd Lipcon wrote:

 Strange... looks like the same issue is happening on the other build
 boxes too - I'd fixed h9 but h6 also has the issue.
 
 On Thu, May 19, 2011 at 4:40 PM, Todd Lipcon t...@cloudera.com wrote:
 Must have been some svn bug. I rm -Rfed the http/lib directory in
 question, ran svn cleanup, svn up, and it seems OK now.
 
 -Todd
 
 On Thu, May 19, 2011 at 4:14 PM, Aaron T. Myers a...@cloudera.com wrote:
 See this page:
 https://hudson.apache.org/hudson/job/PreCommit-Hadoop-Build/480/console

 And note the following error:
 
 
 [exec] svn: This client is too old to work with working copy
 'src/test/core/org/apache/hadoop/http/lib'.  You need
 [exec] to get a newer Subversion client, or to downgrade this working 
 copy.
 [exec] See 
 http://subversion.tigris.org/faq.html#working-copy-format-change
 [exec] for details.
 
 
 --
 Aaron T. Myers
 Software Engineer, Cloudera
 
 
 
 
 --
 Todd Lipcon
 Software Engineer, Cloudera
 
 
 
 
 -- 
 Todd Lipcon
 Software Engineer, Cloudera



[jira] [Created] (HADOOP-7312) core-default.xml lists configuration version as 0.21

2011-05-19 Thread Todd Lipcon (JIRA)
core-default.xml lists configuration version as 0.21
----------------------------------------------------

 Key: HADOOP-7312
 URL: https://issues.apache.org/jira/browse/HADOOP-7312
 Project: Hadoop Common
  Issue Type: Bug
  Components: conf
Affects Versions: 0.22.0
Reporter: Todd Lipcon
Priority: Minor
 Fix For: 0.22.0


This key was added in HADOOP-6233, though it appears unused. I suppose it's 
somewhat useful for diagnosing whether someone has an old version of 
core-default.xml on the classpath.

Either way, it should probably be updated to say 0.22 in the branch and 0.23 in 
trunk.
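As a sketch of the kind of diagnostic this version key enables (the class, method, and expected-version constant below are hypothetical, not Hadoop code):

```java
public class ConfigVersionCheck {
    // The release this build of the code ships with (assumed value
    // for the branch discussed above).
    static final String EXPECTED_VERSION = "0.22";

    // Returns a warning message if the core-default.xml found on the
    // classpath declares a different version than this build expects,
    // or null when everything matches.
    public static String check(String loadedVersion) {
        if (loadedVersion == null) {
            return "core-default.xml has no version key; "
                 + "very old copy on the classpath?";
        }
        if (!EXPECTED_VERSION.equals(loadedVersion)) {
            return "core-default.xml version " + loadedVersion
                 + " does not match expected " + EXPECTED_VERSION;
        }
        return null;
    }
}
```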



[jira] [Resolved] (HADOOP-7283) Include 32-bit and 64-bit native libraries in Jenkins tarball builds

2011-05-19 Thread Tom White (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-7283?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tom White resolved HADOOP-7283.
-------------------------------

Resolution: Fixed
  Assignee: Tom White

The build at 
https://builds.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-22-Build/ is 
now producing tarballs with the correct native libraries.

 Include 32-bit and 64-bit native libraries in Jenkins tarball builds
 --------------------------------------------------------------------

 Key: HADOOP-7283
 URL: https://issues.apache.org/jira/browse/HADOOP-7283
 Project: Hadoop Common
  Issue Type: Task
  Components: build
Reporter: Tom White
Assignee: Tom White
Priority: Blocker
 Fix For: 0.22.0


 The job at 
 https://builds.apache.org/hudson/view/G-L/view/Hadoop/job/Hadoop-22-Build/ is 
 building tarballs, but they do not currently include both 32-bit and 64-bit 
 native libraries. We should update/duplicate 
 hadoop-nightly/hudsonBuildHadoopRelease.sh to support post-split builds.



Build failed in Jenkins: Hadoop-0.20.203-Build #11

2011-05-19 Thread Apache Jenkins Server
See https://builds.apache.org/hudson/job/Hadoop-0.20.203-Build/11/changes

Changes:

[tomwhite] Include 32-bit and 64-bit native libraries in Jenkins tarball builds

--
[...truncated 11716 lines...]
 [exec] checking for xlf... no
 [exec] checking for f77... no
 [exec] checking for frt... no
 [exec] checking for pgf77... no
 [exec] checking for cf77... no
 [exec] checking for fort77... no
 [exec] checking for fl32... no
 [exec] checking for af77... no
 [exec] checking for xlf90... no
 [exec] checking for f90... no
 [exec] checking for pgf90... no
 [exec] checking for pghpf... no
 [exec] checking for epcf90... no
 [exec] checking for gfortran... no
 [exec] checking for g95... no
 [exec] checking for xlf95... no
 [exec] checking for f95... no
 [exec] checking for fort... no
 [exec] checking for ifort... no
 [exec] checking for ifc... no
 [exec] checking for efc... no
 [exec] checking for pgf95... no
 [exec] checking for lf95... no
 [exec] checking for ftn... no
 [exec] checking whether we are using the GNU Fortran 77 compiler... no
 [exec] checking whether  accepts -g... no
 [exec] checking the maximum length of command line arguments... 32768
 [exec] checking command to parse /usr/bin/nm -B output from gcc object... 
ok
 [exec] checking for objdir... .libs
 [exec] checking for ar... ar
 [exec] checking for ranlib... ranlib
 [exec] checking for strip... strip
 [exec] checking if gcc static flag  works... yes
 [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
 [exec] checking for gcc option to produce PIC... -fPIC
 [exec] checking if gcc PIC flag -fPIC works... yes
 [exec] checking if gcc supports -c -o file.o... yes
 [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) 
supports shared libraries... yes
 [exec] checking whether -lc should be explicitly linked in... no
 [exec] checking dynamic linker characteristics... GNU/Linux ld.so
 [exec] checking how to hardcode library paths into programs... immediate
 [exec] checking whether stripping libraries is possible... yes
 [exec] checking if libtool supports shared libraries... yes
 [exec] checking whether to build shared libraries... yes
 [exec] checking whether to build static libraries... yes
 [exec] configure: creating libtool
 [exec] appending configuration tag CXX to libtool
 [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
 [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
 [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) 
supports shared libraries... yes
 [exec] checking for g++ option to produce PIC... -fPIC
 [exec] checking if g++ PIC flag -fPIC works... yes
 [exec] checking if g++ supports -c -o file.o... yes
 [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) 
supports shared libraries... yes
 [exec] checking dynamic linker characteristics... GNU/Linux ld.so
 [exec] checking how to hardcode library paths into programs... immediate
 [exec] checking whether stripping libraries is possible... yes
 [exec] appending configuration tag F77 to libtool
 [exec] checking for unistd.h... (cached) yes
 [exec] checking for stdbool.h that conforms to C99... yes
 [exec] checking for _Bool... no
 [exec] checking for an ANSI C-conforming const... yes
 [exec] checking for off_t... yes
 [exec] checking for size_t... yes
 [exec] checking whether strerror_r is declared... yes
 [exec] checking for strerror_r... yes
 [exec] checking whether strerror_r returns char *... yes
 [exec] checking for mkdir... yes
 [exec] checking for uname... yes
 [exec] checking for shutdown in -lsocket... no
 [exec] checking for xdr_float in -lnsl... yes
 [exec] configure: creating ./config.status
 [exec] config.status: creating Makefile
 [exec] config.status: creating impl/config.h
 [exec] config.status: impl/config.h is unchanged
 [exec] config.status: executing depfiles commands

compile-c++-examples-pipes:
 [exec] depbase=`echo impl/wordcount-simple.o | sed 
's|[^/]*$|.deps/|;s|\.o$||'`; \
 [exec] if g++ -DHAVE_CONFIG_H -I. 
-Ihttps://builds.apache.org/hudson/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes
 -I./impl-Wall 
-Ihttps://builds.apache.org/hudson/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include
 
-Ihttps://builds.apache.org/hudson/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include
 -g -O2 -MT impl/wordcount-simple.o -MD -MP -MF $depbase.Tpo -c -o 
impl/wordcount-simple.o 
https://builds.apache.org/hudson/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes/impl/wordcount-simple.cc;
 \
 [exec] then mv -f $depbase.Tpo $depbase.Po; else rm -f 
$depbase.Tpo; exit 1; fi
 [exec]