Build failed in Jenkins: Hadoop-Hdfs-trunk #2135

2015-05-24, Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/2135/

--
[...truncated 7886 lines...]
 [exec] 2015-05-24 14:17:47,502 INFO  server.AuthenticationFilter 
(AuthenticationFilter.java:constructSecretProvider(284)) - Unable to initialize 
FileSignerSecretProvider, falling back to use random secrets.
 [exec] 2015-05-24 14:17:47,502 INFO  http.HttpRequestLog 
(HttpRequestLog.java:getRequestLog(80)) - Http request log for 
http.requests.datanode is not defined
 [exec] 2015-05-24 14:17:47,503 INFO  http.HttpServer2 
(HttpServer2.java:addGlobalFilter(678)) - Added global filter 'safety' 
(class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
 [exec] 2015-05-24 14:17:47,504 INFO  http.HttpServer2 
(HttpServer2.java:addFilter(653)) - Added filter static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context datanode
 [exec] 2015-05-24 14:17:47,504 INFO  http.HttpServer2 
(HttpServer2.java:addFilter(661)) - Added filter static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context static
 [exec] 2015-05-24 14:17:47,506 INFO  http.HttpServer2 
(HttpServer2.java:openListeners(883)) - Jetty bound to port 57485
 [exec] 2015-05-24 14:17:47,506 INFO  mortbay.log (Slf4jLog.java:info(67)) 
- jetty-6.1.26
 [exec] 2015-05-24 14:17:47,557 INFO  mortbay.log (Slf4jLog.java:info(67)) 
- Started SelectChannelConnector@localhost:57485
 [exec] 2015-05-24 14:17:47,678 INFO  web.DatanodeHttpServer 
(DatanodeHttpServer.java:start(162)) - Listening HTTP traffic on 
/127.0.0.1:60590
 [exec] 2015-05-24 14:17:47,680 INFO  datanode.DataNode 
(DataNode.java:startDataNode(1144)) - dnUserName = jenkins
 [exec] 2015-05-24 14:17:47,680 INFO  datanode.DataNode 
(DataNode.java:startDataNode(1145)) - supergroup = supergroup
 [exec] 2015-05-24 14:17:47,693 INFO  ipc.CallQueueManager 
(CallQueueManager.java:init(56)) - Using callQueue class 
java.util.concurrent.LinkedBlockingQueue
 [exec] 2015-05-24 14:17:47,694 INFO  ipc.Server (Server.java:run(622)) - 
Starting Socket Reader #1 for port 48866
 [exec] 2015-05-24 14:17:47,701 INFO  datanode.DataNode 
(DataNode.java:initIpcServer(844)) - Opened IPC server at /127.0.0.1:48866
 [exec] 2015-05-24 14:17:47,713 INFO  datanode.DataNode 
(BlockPoolManager.java:refreshNamenodes(149)) - Refresh request received for 
nameservices: null
 [exec] 2015-05-24 14:17:47,715 INFO  datanode.DataNode 
(BlockPoolManager.java:doRefreshNamenodes(194)) - Starting BPOfferServices for 
nameservices: default
 [exec] 2015-05-24 14:17:47,725 INFO  datanode.DataNode 
(BPServiceActor.java:run(791)) - Block pool registering (Datanode Uuid 
unassigned) service to localhost/127.0.0.1:48928 starting to offer service
 [exec] 2015-05-24 14:17:47,732 INFO  ipc.Server (Server.java:run(852)) - 
IPC Server Responder: starting
 [exec] 2015-05-24 14:17:47,732 INFO  ipc.Server (Server.java:run(692)) - 
IPC Server listener on 48866: starting
 [exec] 2015-05-24 14:17:48,174 INFO  common.Storage 
(Storage.java:tryLock(715)) - Lock on 
https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/in_use.lock
 acquired by nodename 25...@asf904.gq1.ygridcore.net
 [exec] 2015-05-24 14:17:48,174 INFO  common.Storage 
(DataStorage.java:loadStorageDirectory(272)) - Storage directory 
https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1
 is not formatted for BP-149846206-67.195.81.148-1432477065862
 [exec] 2015-05-24 14:17:48,174 INFO  common.Storage 
(DataStorage.java:loadStorageDirectory(274)) - Formatting ...
 [exec] 2015-05-24 14:17:48,223 INFO  common.Storage 
(BlockPoolSliceStorage.java:recoverTransitionRead(241)) - Analyzing storage 
directories for bpid BP-149846206-67.195.81.148-1432477065862
 [exec] 2015-05-24 14:17:48,224 INFO  common.Storage 
(Storage.java:lock(675)) - Locking is disabled for 
https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/current/BP-149846206-67.195.81.148-1432477065862
 [exec] 2015-05-24 14:17:48,224 INFO  common.Storage 
(BlockPoolSliceStorage.java:loadStorageDirectory(158)) - Block pool storage 
directory 
https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/current/BP-149846206-67.195.81.148-1432477065862
 is not formatted for BP-149846206-67.195.81.148-1432477065862
 [exec] 2015-05-24 14:17:48,224 INFO  common.Storage 
(BlockPoolSliceStorage.java:loadStorageDirectory(160)) - Formatting ...
 [exec] 2015-05-24 14:17:48,224 INFO  common.Storage 
(BlockPoolSliceStorage.java:format(267)) - Formatting block pool 
BP-149846206-67.195.81.148-1432477065862 directory 

Hadoop-Hdfs-trunk - Build # 2135 - Still Failing

2015-05-24, Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/2135/

###############################
## LAST 60 LINES OF THE CONSOLE ##
###############################
[...truncated 8079 lines...]
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ 
hadoop-hdfs-project ---
[INFO] Skipping javadoc generation
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project 
---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [ 47.530 s]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  02:44 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.072 s]
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 02:45 h
[INFO] Finished at: 2015-05-24T14:20:12+00:00
[INFO] Final Memory: 54M/685M
[INFO] 
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project 
hadoop-hdfs: An Ant BuildException has occured: 
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/docs
 does not exist.
[ERROR] around Ant part ...<copy 
todir="/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src">...
 @ 5:121 in 
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/target/antrun/build-main.xml
[ERROR] - [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn goals -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk #2116
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 362716 bytes
Compression is 0.0%
Took 6.9 sec
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###############################
## FAILED TESTS (if any) ##
###############################
All tests passed
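The build did not fail on tests: the antrun `site` execution aborted because the `src/main/docs` directory its `<copy>` task reads from is absent in the hadoop-hdfs module. A minimal pre-build workaround sketch, assuming the Jenkins `WORKSPACE` layout shown in the console log above (the path and approach are illustrative, not a confirmed fix for this job):

```shell
#!/bin/sh
# Recreate the source directory the antrun copy task expects, so the
# copy no longer aborts on a missing source. The path assumes the
# workspace layout from the console log above.
WORKSPACE="${WORKSPACE:-$PWD}"
DOCS_DIR="$WORKSPACE/hadoop-hdfs-project/hadoop-hdfs/src/main/docs"
mkdir -p "$DOCS_DIR"     # idempotent: no-op if the directory already exists
ls -ld "$DOCS_DIR"       # confirm the directory is now present
```

After a real fix (restoring or removing the docs copy step), the log's own suggestion applies: resume from the failed module with `mvn <goals> -rf :hadoop-hdfs`.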

Build failed in Jenkins: Hadoop-Hdfs-trunk-Java8 #195

2015-05-24, Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/195/

--
[...truncated 8391 lines...]
 [exec] 2015-05-24 14:24:37,055 INFO  http.HttpRequestLog 
(HttpRequestLog.java:getRequestLog(80)) - Http request log for 
http.requests.datanode is not defined
 [exec] 2015-05-24 14:24:37,055 INFO  http.HttpServer2 
(HttpServer2.java:addGlobalFilter(678)) - Added global filter 'safety' 
(class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
 [exec] 2015-05-24 14:24:37,055 INFO  http.HttpServer2 
(HttpServer2.java:addFilter(653)) - Added filter static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context datanode
 [exec] 2015-05-24 14:24:37,056 INFO  http.HttpServer2 
(HttpServer2.java:addFilter(661)) - Added filter static_user_filter 
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to 
context static
 [exec] 2015-05-24 14:24:37,057 INFO  http.HttpServer2 
(HttpServer2.java:openListeners(883)) - Jetty bound to port 48471
 [exec] 2015-05-24 14:24:37,057 INFO  mortbay.log (Slf4jLog.java:info(67)) 
- jetty-6.1.26
 [exec] 2015-05-24 14:24:37,409 INFO  mortbay.log (Slf4jLog.java:info(67)) 
- Started SelectChannelConnector@localhost:48471
 [exec] 2015-05-24 14:24:37,529 INFO  web.DatanodeHttpServer 
(DatanodeHttpServer.java:start(162)) - Listening HTTP traffic on 
/127.0.0.1:45323
 [exec] 2015-05-24 14:24:37,530 INFO  datanode.DataNode 
(DataNode.java:startDataNode(1144)) - dnUserName = jenkins
 [exec] 2015-05-24 14:24:37,530 INFO  datanode.DataNode 
(DataNode.java:startDataNode(1145)) - supergroup = supergroup
 [exec] 2015-05-24 14:24:37,545 INFO  ipc.CallQueueManager 
(CallQueueManager.java:init(56)) - Using callQueue class 
java.util.concurrent.LinkedBlockingQueue
 [exec] 2015-05-24 14:24:37,545 INFO  ipc.Server (Server.java:run(622)) - 
Starting Socket Reader #1 for port 42270
 [exec] 2015-05-24 14:24:37,552 INFO  datanode.DataNode 
(DataNode.java:initIpcServer(844)) - Opened IPC server at /127.0.0.1:42270
 [exec] 2015-05-24 14:24:37,561 INFO  datanode.DataNode 
(BlockPoolManager.java:refreshNamenodes(149)) - Refresh request received for 
nameservices: null
 [exec] 2015-05-24 14:24:37,564 INFO  datanode.DataNode 
(BlockPoolManager.java:doRefreshNamenodes(194)) - Starting BPOfferServices for 
nameservices: default
 [exec] 2015-05-24 14:24:37,574 INFO  datanode.DataNode 
(BPServiceActor.java:run(791)) - Block pool registering (Datanode Uuid 
unassigned) service to localhost/127.0.0.1:34418 starting to offer service
 [exec] 2015-05-24 14:24:37,579 INFO  ipc.Server (Server.java:run(852)) - 
IPC Server Responder: starting
 [exec] 2015-05-24 14:24:37,579 INFO  ipc.Server (Server.java:run(692)) - 
IPC Server listener on 42270: starting
 [exec] 2015-05-24 14:24:37,809 INFO  common.Storage 
(Storage.java:tryLock(715)) - Lock on 
https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/in_use.lock
 acquired by nodename 2...@asf905.gq1.ygridcore.net
 [exec] 2015-05-24 14:24:37,809 INFO  common.Storage 
(DataStorage.java:loadStorageDirectory(272)) - Storage directory 
https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1
 is not formatted for BP-2084748531-67.195.81.149-1432477475599
 [exec] 2015-05-24 14:24:37,809 INFO  common.Storage 
(DataStorage.java:loadStorageDirectory(274)) - Formatting ...
 [exec] 2015-05-24 14:24:37,835 INFO  common.Storage 
(BlockPoolSliceStorage.java:recoverTransitionRead(241)) - Analyzing storage 
directories for bpid BP-2084748531-67.195.81.149-1432477475599
 [exec] 2015-05-24 14:24:37,835 INFO  common.Storage 
(Storage.java:lock(675)) - Locking is disabled for 
https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/current/BP-2084748531-67.195.81.149-1432477475599
 [exec] 2015-05-24 14:24:37,836 INFO  common.Storage 
(BlockPoolSliceStorage.java:loadStorageDirectory(158)) - Block pool storage 
directory 
https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/current/BP-2084748531-67.195.81.149-1432477475599
 is not formatted for BP-2084748531-67.195.81.149-1432477475599
 [exec] 2015-05-24 14:24:37,836 INFO  common.Storage 
(BlockPoolSliceStorage.java:loadStorageDirectory(160)) - Formatting ...
 [exec] 2015-05-24 14:24:37,836 INFO  common.Storage 
(BlockPoolSliceStorage.java:format(267)) - Formatting block pool 
BP-2084748531-67.195.81.149-1432477475599 directory 
https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/native/build/test/data/dfs/data/data1/current/BP-2084748531-67.195.81.149-1432477475599/current
 [exec] 

Hadoop-Hdfs-trunk-Java8 - Build # 195 - Still Failing

2015-05-24, Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/195/

###############################
## LAST 60 LINES OF THE CONSOLE ##
###############################
[...truncated 8584 lines...]
[mkdir] Created dir: 
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ 
hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable 
package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project 
---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [ 48.092 s]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  02:51 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.071 s]
[INFO] 
[INFO] BUILD FAILURE
[INFO] 
[INFO] Total time: 02:52 h
[INFO] Finished at: 2015-05-24T14:26:46+00:00
[INFO] Final Memory: 54M/164M
[INFO] 
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project 
hadoop-hdfs: An Ant BuildException has occured: 
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/src/main/docs
 does not exist.
[ERROR] around Ant part ...<copy 
todir="/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/docs-src">...
 @ 5:127 in 
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/antrun/build-main.xml
[ERROR] - [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn goals -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk-Java8 #175
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 797647 bytes
Compression is 0.0%
Took 11 sec
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###############################
## FAILED TESTS (if any) ##
###############################
All tests passed

[jira] [Created] (HDFS-8472) Implement a new block reader that reads data over HTTP/2

2015-05-24, Duo Zhang (JIRA)
Duo Zhang created HDFS-8472:
---

 Summary: Implement a new block reader that reads data over HTTP/2
 Key: HDFS-8472
 URL: https://issues.apache.org/jira/browse/HDFS-8472
 Project: Hadoop HDFS
  Issue Type: Sub-task
Reporter: Duo Zhang






--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HDFS-8471) Implement read block over HTTP/2

2015-05-24, Duo Zhang (JIRA)
Duo Zhang created HDFS-8471:
---

 Summary: Implement read block over HTTP/2
 Key: HDFS-8471
 URL: https://issues.apache.org/jira/browse/HDFS-8471
 Project: Hadoop HDFS
  Issue Type: Sub-task
  Components: datanode
Reporter: Duo Zhang
Assignee: Duo Zhang






--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (HDFS-8470) fsimage loading progress always show 0

2015-05-24, tongshiquan (JIRA)
tongshiquan created HDFS-8470:
-

 Summary: fsimage loading progress always show 0
 Key: HDFS-8470
 URL: https://issues.apache.org/jira/browse/HDFS-8470
 Project: Hadoop HDFS
  Issue Type: Bug
  Components: HDFS
Affects Versions: 2.7.0
Reporter: tongshiquan
Priority: Minor






--
This message was sent by Atlassian JIRA
(v6.3.4#6332)