See <https://builds.apache.org/job/Hadoop-0.20.204-Build/1/>

------------------------------------------
[...truncated 8604 lines...]
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... 
ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-utils:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>"
     [exec]  /usr/bin/install -c -m 644 'libhadooputils.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a>'
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop>"
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/StringUtils.hh>' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/StringUtils.hh>'
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/SerialUtils.hh>' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/SerialUtils.hh>'
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils>'

compile-c++-pipes:
     [exec] depbase=`echo impl/HadoopPipes.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec]     if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes> -I./impl -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api> -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include> -g -O2 -MT impl/HadoopPipes.o -MD -MP -MF "$depbase.Tpo" -c -o impl/HadoopPipes.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>; \
     [exec]     then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>: In member function 'void HadoopPipes::TextUpwardProtocol::writeBuffer(const std::string&)':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>:129: warning: format not a string literal and no format arguments
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>: In member function 'std::string HadoopPipes::BinaryProtocol::createDigest(std::string&, std::string&)':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>:439: warning: value computed is not used
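
Both g++ warnings above come from common C patterns. A minimal, hypothetical reduction of each (illustrative only, not the actual HadoopPipes.cc code; the function names are made up):

    /* warn_demo.c -- hypothetical reductions of the two warnings above */
    #include <stdio.h>

    void write_buffer(const char *msg) {
      /* printf(msg) would warn "format not a string literal and no format
       * arguments": msg could contain stray % directives. Treat the buffer
       * as data instead of a format string: */
      printf("%s", msg);
    }

    const char *skip_one(const char *p) {
      /* "*p++;" would warn "value computed is not used": the dereferenced
       * value is computed and thrown away. Increment without the dead load: */
      p++;
      return p;
    }

    int main(void) {
      write_buffer("hello\n");
      return 0;
    }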
     [exec] rm -f libhadooppipes.a
     [exec] ar cru libhadooppipes.a impl/HadoopPipes.o 
     [exec] ranlib libhadooppipes.a
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>"
     [exec]  /usr/bin/install -c -m 644 'libhadooppipes.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a>'
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop>"
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/Pipes.hh>' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/Pipes.hh>'
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/TemplateFactory.hh>' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/TemplateFactory.hh>'
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes>'

compile-c++:

compile-core:

test-c++-libhdfs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/hdfs/name>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omal...@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs> -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_test.o -MD -MP -MF ".deps/hdfs_test.Tpo" -c -o hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>; \
     [exec]     then mv -f ".deps/hdfs_test.Tpo" ".deps/hdfs_test.Po"; else rm -f ".deps/hdfs_test.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>: In function `main':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:87: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:90: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:130: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:133: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:188: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:189: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:190: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:198: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:199: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:220: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:221: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:272: warning: implicit declaration of function `sleep'
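
The repeated "long int format, different type arg (arg 3)" warnings and the implicit declaration of `sleep' are small C-level issues. A minimal, hypothetical reduction, assuming (as libhdfs's hdfs.h does) that file offsets are a 64-bit tOffset type, which is wider than long on this -m32 build:

    /* format_demo.c -- hypothetical reduction of the hdfs_test.c warnings */
    #include <stdio.h>
    #include <stdint.h>
    #include <inttypes.h>
    #include <unistd.h>   /* declares sleep(); omitting this header is what
                             triggers "implicit declaration of function `sleep'" */

    int main(void) {
      int64_t currentPos = 0;   /* stands in for libhdfs's tOffset */
      /* fprintf(stderr, "pos: %ld\n", currentPos) would warn: %ld expects a
       * 32-bit long on this build, but arg 3 is 64 bits wide. Two fixes: */
      fprintf(stderr, "pos: %" PRId64 "\n", currentPos);  /* exact-width macro */
      fprintf(stderr, "pos: %ld\n", (long)currentPos);    /* or an explicit cast */
      sleep(1);
      return 0;
    }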
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_test  hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_test hdfs_test.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omal...@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs> -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_read.o -MD -MP -MF ".deps/hdfs_read.Tpo" -c -o hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>; \
     [exec]     then mv -f ".deps/hdfs_read.Tpo" ".deps/hdfs_read.Po"; else rm -f ".deps/hdfs_read.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>: In function `main':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>:35: warning: unused variable `fileTotalSize'
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_read  hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_read hdfs_read.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omal...@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs> -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_write.o -MD -MP -MF ".deps/hdfs_write.Tpo" -c -o hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_write.c>; \
     [exec]     then mv -f ".deps/hdfs_write.Tpo" ".deps/hdfs_write.Po"; else rm -f ".deps/hdfs_write.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_write  hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_write hdfs_write.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/test-libhdfs.sh>
     [exec] 
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] LIB_JVM_DIR = /homes/hudson/tools/java/latest1.6/jre/lib/i386/server
     [exec] 
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop>: line 53: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] 11/07/08 18:08:07 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/08 18:08:07 INFO namenode.NameNode: STARTUP_MSG: 
     [exec] /************************************************************
     [exec] STARTUP_MSG: Starting NameNode
     [exec] STARTUP_MSG:   host = h4.grid.sp2.yahoo.net/127.0.1.1
     [exec] STARTUP_MSG:   args = [-format]
     [exec] STARTUP_MSG:   version = 0.20.204
     [exec] STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 -r 1137306; compiled by 'hudson' on Fri Jul  8 18:07:17 UTC 2011
     [exec] ************************************************************/
     [exec] 11/07/08 18:08:07 INFO util.GSet: VM type       = 32-bit
     [exec] 11/07/08 18:08:07 INFO util.GSet: 2% max memory = 17.77875 MB
     [exec] 11/07/08 18:08:07 INFO util.GSet: capacity      = 2^22 = 4194304 entries
     [exec] 11/07/08 18:08:07 INFO util.GSet: recommended=4194304, actual=4194304
     [exec] 11/07/08 18:08:07 INFO namenode.FSNamesystem: fsOwner=hudson
     [exec] 11/07/08 18:08:07 INFO namenode.FSNamesystem: supergroup=supergroup
     [exec] 11/07/08 18:08:07 INFO namenode.FSNamesystem: isPermissionEnabled=true
     [exec] 11/07/08 18:08:07 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
     [exec] 11/07/08 18:08:07 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
     [exec] 11/07/08 18:08:07 INFO namenode.NameNode: Caching file names occuring more than 10 times
     [exec] 11/07/08 18:08:08 INFO common.Storage: Image file of size 112 saved in 0 seconds.
     [exec] 11/07/08 18:08:08 INFO common.Storage: Storage directory build/test/libhdfs/dfs/name has been successfully formatted.
     [exec] 11/07/08 18:08:08 INFO namenode.NameNode: SHUTDOWN_MSG: 
     [exec] /************************************************************
     [exec] SHUTDOWN_MSG: Shutting down NameNode at h4.grid.sp2.yahoo.net/127.0.1.1
     [exec] ************************************************************/
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting namenode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-namenode-h4.grid.sp2.yahoo.net.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting datanode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-datanode-h4.grid.sp2.yahoo.net.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] CLASSPATH=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:/homes/hudson/tools/java/latest1.6/lib/tools.jar:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/jsp-2.0/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjrt-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjtools-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-1.7.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-core-1.8.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-cli-1.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-codec-1.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-collections-3.2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-configuration-1.6.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-daemon-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-digester-1.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-el-1.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-httpclient-3.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-lang-2.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-1.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-api-1.0.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-math-2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-net-1.4.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/core-3.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-core-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-mapper-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-compiler-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-runtime-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jdeb-0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jets3t-0.6.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-util-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jsch-0.1.42.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/junit-4.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/log4j-1.2.15.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/mockito-all-1.8.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/oro-2.0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/servlet-api-2.5-20081211.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-api-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-log4j12-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/xmlenc-0.52.jar> LD_PRELOAD=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so>:/homes/hudson/tools/java/latest1.6/jre/lib/i386/server/libjvm.so <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/libhdfs/hdfs_test>
     [exec] 11/07/08 18:08:34 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/08 18:08:34 WARN fs.FileSystem: "localhost:23000" is a deprecated filesystem name. Use "hdfs://localhost:23000/" instead.
     [exec] 11/07/08 18:08:36 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 0 time(s).
     [exec] 11/07/08 18:08:37 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 1 time(s).
     [exec] 11/07/08 18:08:38 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 2 time(s).
     [exec] 11/07/08 18:08:39 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 3 time(s).
     [exec] 11/07/08 18:08:40 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 4 time(s).
     [exec] 11/07/08 18:08:41 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 5 time(s).
     [exec] 11/07/08 18:08:42 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 6 time(s).
     [exec] 11/07/08 18:08:43 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 7 time(s).
     [exec] 11/07/08 18:08:44 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 8 time(s).
     [exec] 11/07/08 18:08:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 9 time(s).
     [exec] Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:23000 failed on connection exception: java.net.ConnectException: Connection refused
     [exec]     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1057)
     [exec]     at org.apache.hadoop.ipc.Client.call(Client.java:1033)
     [exec]     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:224)
     [exec]     at $Proxy1.getProtocolVersion(Unknown Source)
     [exec]     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
     [exec]     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
     [exec]     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
     [exec]     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:175)
     [exec]     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
     [exec]     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1310)
     [exec]     at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:65)
     [exec]     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1328)
     [exec]     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
     [exec]     at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:103)
     [exec]     at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:101)
     [exec]     at java.security.AccessController.doPrivileged(Native Method)
     [exec]     at javax.security.auth.Subject.doAs(Subject.java:396)
     [exec]     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
     [exec]     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:101)
     [exec] Caused by: java.net.ConnectException: Connection refused
     [exec]     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
     [exec]     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
     [exec]     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
     [exec]     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:406)
     [exec]     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:414)
     [exec]     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:527)
     [exec]     at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:187)
     [exec]     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1164)
     [exec]     at org.apache.hadoop.ipc.Client.call(Client.java:1010)
     [exec]     ... 17 more
     [exec] Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
     [exec] Oops! Failed to connect to hdfs!
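
The connection failure follows from the earlier hadoop-config.sh errors: bin/hadoop and bin/hadoop-daemon.sh cannot source ../libexec/hadoop-config.sh, so no NameNode or DataNode ever starts, and the test's connect attempt is refused after ten IPC retries. For reference, a minimal sketch of the connect step that fails here, assuming the 0.20-era libhdfs C API (hdfs.h under src/c++/libhdfs):

    /* connect_demo.c -- a minimal sketch of the connect step that fails above */
    #include <stdio.h>
    #include "hdfs.h"   /* libhdfs C API; link against libhdfs and a JVM */

    int main(void) {
      /* hdfsConnect() reaches FileSystem.get() through JNI; with no NameNode
       * listening on localhost:23000 it returns NULL after the client-side
       * retries seen in the log. */
      hdfsFS fs = hdfsConnect("localhost", 23000);
      if (fs == NULL) {
        fprintf(stderr, "Oops! Failed to connect to hdfs!\n");
        return 1;
      }
      hdfsDisconnect(fs);
      return 0;
    }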
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no datanode to stop
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no namenode to stop
     [exec] exiting with 255
     [exec] make: *** [test] Error 255

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1857: exec returned: 2

Total time: 6 minutes 11 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
