See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1064/changes>
Changes:

[tucu] HADOOP-8368. Use CMake rather than autotools to build native code (ccccabe via tucu)
[atm] HDFS-3442. Incorrect count for Missing Replicas in FSCK report. Contributed by Andrew Wang.
[bobby] MAPREDUCE-4302. NM goes down if error encountered during log aggregation (Daryn Sharp via bobby)
[tucu] HADOOP-8466. hadoop-client POM incorrectly excludes avro. (bmahe via tucu)
[todd] Merge back changelog entries for HDFS-3042 into main CHANGES.txt files
[bobby] HADOOP-8460. Document proper setting of HADOOP_PID_DIR and HADOOP_SECURE_DN_PID_DIR (bobby)
------------------------------------------
[...truncated 10118 lines...]
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ hadoop-client ---
[INFO] Skipped writing classpath file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-client/target/classes/mrapp-generated-classpath>'. No changes found.
[INFO]
[INFO] <<< maven-source-plugin:2.1.2:jar (default) @ hadoop-client <<<
[INFO]
[INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-client ---
[INFO] No sources in project. Archive not created.
[INFO]
[INFO] >>> maven-source-plugin:2.1.2:test-jar (default) @ hadoop-client >>>
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-client ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ hadoop-client ---
[INFO] Skipped writing classpath file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-client/target/classes/mrapp-generated-classpath>'. No changes found.
[INFO]
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-client <<<
[INFO]
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-client ---
[INFO] No sources in project. Archive not created.
[INFO]
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-client ---
[INFO]
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-client ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-client ---
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-client/target/hadoop-client-3.0.0-SNAPSHOT.jar> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client/3.0.0-SNAPSHOT/hadoop-client-3.0.0-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-client/pom.xml> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client/3.0.0-SNAPSHOT/hadoop-client-3.0.0-SNAPSHOT.pom
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-client/target/hadoop-client-3.0.0-SNAPSHOT-tests.jar> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client/3.0.0-SNAPSHOT/hadoop-client-3.0.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Mini-Cluster 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-minicluster ---
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-minicluster ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-minicluster/target/test-dir>
[INFO] Executed tasks
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ hadoop-minicluster ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-minicluster/target/classes/mrapp-generated-classpath>'.
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-minicluster ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-minicluster ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-minicluster ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ hadoop-minicluster ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.12:test (default-test) @ hadoop-minicluster ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-minicluster ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-minicluster/target/hadoop-minicluster-3.0.0-SNAPSHOT.jar>
[INFO]
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-minicluster ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-minicluster ---
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-minicluster/target/hadoop-minicluster-3.0.0-SNAPSHOT.jar> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-minicluster/3.0.0-SNAPSHOT/hadoop-minicluster-3.0.0-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-minicluster/pom.xml> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-minicluster/3.0.0-SNAPSHOT/hadoop-minicluster-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.093s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.668s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.239s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.376s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.160s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [2.169s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.958s]
[INFO] Apache Hadoop Common .............................. SUCCESS [21.586s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.027s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [28.141s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [5.072s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1.655s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.036s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.100s]
[INFO] hadoop-yarn-api ................................... SUCCESS [7.001s]
[INFO] hadoop-yarn-common ................................ SUCCESS [10.538s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.071s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [2.995s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [6.844s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.293s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [7.500s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.906s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.054s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [6.961s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.057s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [1.611s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.117s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [6.156s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [1.414s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [7.163s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [3.902s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [6.889s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [2.686s]
[INFO] hadoop-mapreduce .................................. SUCCESS [0.064s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [4.865s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [2.901s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [1.987s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [3.012s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [2.978s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [1.537s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [1.983s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.258s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.023s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.093s]
[INFO] Apache Hadoop Client .............................. SUCCESS [0.226s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.100s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:41.846s
[INFO] Finished at: Sat Jun 02 11:34:03 UTC 2012
[INFO] Final Memory: 130M/1146M
[INFO] ------------------------------------------------------------------------
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -DskipTests -Pdist -Dtar -Pnative -Pdocs
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HttpFS
[INFO] Apache Hadoop HDFS BookKeeper Journal
[INFO] Apache Hadoop HDFS Project
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target>
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test/data>
[INFO] Executed tasks
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] Compiling 8 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
log4j:WARN No appenders could be found for logger (org.apache.jasper.JspC).
log4j:WARN Please initialize the log4j system properly.
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.265
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.018
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 3 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.026
[INFO]
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @ hadoop-hdfs ---
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java> added.
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp> added.
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ hadoop-hdfs ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes/mrapp-generated-classpath>'.
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-hdfs ---
[INFO] Compiling 435 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[32,48] com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[33,48] com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[55,4] com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[55,33] com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[59,4] com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[59,35] com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/hdfs/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/secondary/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/datanode/WEB-INF>
     [copy] Copying 7 files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps>
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (make) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [20.239s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20.999s
[INFO] Finished at: Sat Jun 02 11:34:26 UTC 2012
[INFO] Final Memory: 33M/340M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-hdfs: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>"): java.io.IOException: error=2, No such file or directory -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HDFS-3042
Updating HADOOP-8368
Updating HADOOP-8466
Updating MAPREDUCE-4302
Updating HDFS-3442
Updating HADOOP-8460
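The failure above is the (make) antrun step exec'ing "cmake" on a slave that does not have it on PATH: HADOOP-8368 (in this build's change list) switched the native build from autotools to CMake, so the -Pnative profile now requires the cmake binary. A minimal pre-flight check, sketched in shell; the have_tool helper and the install command are illustrative assumptions, not taken from this log:

```shell
# have_tool NAME: succeed iff NAME resolves to an executable on PATH.
have_tool() {
  command -v "$1" >/dev/null 2>&1
}

if have_tool cmake; then
  echo "cmake present: $(command -v cmake)"
else
  # This is the state that produced the IOException above: the antrun (make)
  # target exec'd "cmake" and got error=2 (ENOENT, No such file or directory).
  echo "cmake not on PATH; install it on the build slave before running -Pnative" >&2
  echo "(e.g. 'apt-get install cmake' on Debian-like hosts; package name is an assumption)" >&2
fi
```

Running such a check at the top of the Jenkins "Execute shell" step would fail the build with an explicit message instead of a buried IOException deep in the Maven output.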