[ https://issues.apache.org/jira/browse/HBASE-21077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16587968#comment-16587968 ]

Hadoop QA commented on HBASE-21077:
-----------------------------------

| (/) *{color:green}+1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue}  0m 20s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} hbaseanti {color} | {color:green}  0m  0s{color} | {color:green} Patch does not have any anti-patterns. {color} |
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:green}+1{color} | {color:green} test4tests {color} | {color:green}  0m  0s{color} | {color:green} The patch appears to include 2 new or modified test files. {color} |
|| || || || {color:brown} master Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  6m 41s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  0m 30s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 16s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} shadedjars {color} | {color:green}  3m 50s{color} | {color:green} branch has no errors when building our shaded downstream artifacts. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  0m 33s{color} | {color:green} master passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 21s{color} | {color:green} master passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  3m 31s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green}  0m 24s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green}  0m 24s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 13s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m  0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} shadedjars {color} | {color:green}  4m  0s{color} | {color:green} patch has no errors when building our shaded downstream artifacts. {color} |
| {color:green}+1{color} | {color:green} hadoopcheck {color} | {color:green}  7m 29s{color} | {color:green} Patch does not cause any errors with Hadoop 2.7.4 or 3.0.0. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  0m 44s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 14s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green} 19m  9s{color} | {color:green} hbase-backup in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 13s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 48m 50s{color} | {color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=17.05.0-ce Server=17.05.0-ce Image:yetus/hbase:b002b0b |
| JIRA Issue | HBASE-21077 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12936505/HBASE-21077-v3.patch |
| Optional Tests |  asflicense  javac  javadoc  unit  findbugs  shadedjars  hadoopcheck  hbaseanti  checkstyle  compile  |
| uname | Linux 666c0a7d4add 4.4.0-130-generic #156-Ubuntu SMP Thu Jun 14 08:53:28 UTC 2018 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-HBASE-Build/component/dev-support/hbase-personality.sh |
| git revision | master / 588ff921c1 |
| maven | version: Apache Maven 3.5.4 (1edded0938998edf8bf061f1ceb3cfdeccf443fe; 2018-06-17T18:33:14Z) |
| Default Java | 1.8.0_181 |
| findbugs | v3.1.0-RC3 |
| Test Results | https://builds.apache.org/job/PreCommit-HBASE-Build/14120/testReport/ |
| Max. process+thread count | 4391 (vs. ulimit of 10000) |
| modules | C: hbase-backup U: hbase-backup |
| Console output | https://builds.apache.org/job/PreCommit-HBASE-Build/14120/console |
| Powered by | Apache Yetus 0.7.0   http://yetus.apache.org |


This message was automatically generated.



> MR job launched by hbase incremental backup command failed with 
> FileNotFoundException
> -------------------------------------------------------------------------------------
>
>                 Key: HBASE-21077
>                 URL: https://issues.apache.org/jira/browse/HBASE-21077
>             Project: HBase
>          Issue Type: Bug
>            Reporter: Vladimir Rodionov
>            Assignee: Vladimir Rodionov
>            Priority: Major
>         Attachments: HBASE-21077-v1.patch, HBASE-21077-v2.patch, 
> HBASE-21077-v3.patch
>
>
> Discovered during internal testing by Romil Choksi.
> The MR job launched by the hbase incremental backup command failed with a
> FileNotFoundException.
> From the test console log:
> {code}
> 2018-06-12 04:27:31,160|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:31,160 INFO 
>  [main] mapreduce.JobSubmitter: Submitting tokens for job: 
> job_1528766389356_0044
> 2018-06-12 04:27:31,186|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:31,184 INFO 
>  [main] mapreduce.JobSubmitter: Executing with tokens: [Kind: 
> HDFS_DELEGATION_TOKEN, Service: ha-hdfs:ns1, Ident: (token for hbase: 
> HDFS_DELEGATION_TOKEN [email protected], renewer=yarn, realUser=, 
> issueDate=1528777648605, maxDate=1529382448605, sequenceNumber=175, 
> masterKeyId=2), Kind: kms-dt, Service: 172.27.68.203:9393, Ident: (kms-dt 
> owner=hbase, renewer=yarn, realUser=, issueDate=1528777649149, 
> maxDate=1529382449149, sequenceNumber=49, masterKeyId=2), Kind: 
> HBASE_AUTH_TOKEN, Service: bc71e347-78ff-4f95-af44-006f9b549a84, Ident: 
> (org.apache.hadoop.hbase.security.token.AuthenticationTokenIdentifier@5), 
> Kind: kms-dt, Service: 172.27.52.14:9393, Ident: (kms-dt owner=hbase, 
> renewer=yarn, realUser=, issueDate=1528777648918, maxDate=1529382448918, 
> sequenceNumber=50, masterKeyId=2)]
> 2018-06-12 04:27:31,477|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:31,477 INFO 
>  [main] conf.Configuration: found resource resource-types.xml at 
> file:/etc/hadoop/3.0.0.0-1476/0/resource-types.xml
> 2018-06-12 04:27:31,527|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:31,527 INFO 
>  [main] impl.TimelineClientImpl: Timeline service address: 
> ctr-e138-1518143905142-359429-01-000004.hwx.site:8190
> 2018-06-12 04:27:32,563|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:32,562 INFO 
>  [main] impl.YarnClientImpl: Submitted application 
> application_1528766389356_0044
> 2018-06-12 04:27:32,634|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:32,634 INFO 
>  [main] mapreduce.Job: The url to track the job: 
> https://ctr-e138-1518143905142-359429-01-000003.hwx.site:8090/proxy/application_1528766389356_0044/
> 2018-06-12 04:27:32,635|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:32,635 INFO 
>  [main] mapreduce.Job: Running job: job_1528766389356_0044
> 2018-06-12 04:27:44,807|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:44,806 INFO 
>  [main] mapreduce.Job: Job job_1528766389356_0044 running in uber mode : false
> 2018-06-12 04:27:44,809|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:44,809 INFO 
>  [main] mapreduce.Job:  map 0% reduce 0%
> 2018-06-12 04:27:54,926|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:54,925 INFO 
>  [main] mapreduce.Job:  map 5% reduce 0%
> 2018-06-12 04:27:56,950|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:56,950 INFO 
>  [main] mapreduce.Job: Task Id : attempt_1528766389356_0044_m_000002_0, 
> Status : FAILED
> 2018-06-12 04:27:56,979|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|Error: 
> java.io.FileNotFoundException: File does not exist: 
> hdfs://ns1/apps/hbase/data/oldWALs/ctr-e138-1518143905142-359429-01-000004.hwx.site%2C16020%2C1528776085205.1528776160915
> 2018-06-12 04:27:56,979|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1583)
> 2018-06-12 04:27:56,980|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1576)
> 2018-06-12 04:27:56,980|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> 2018-06-12 04:27:56,980|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1591)
> 2018-06-12 04:27:56,980|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.regionserver.wal.ReaderBase.init(ReaderBase.java:64)
> 2018-06-12 04:27:56,981|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader.init(ProtobufLogReader.java:165)
> 2018-06-12 04:27:56,981|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:289)
> 2018-06-12 04:27:56,981|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:271)
> 2018-06-12 04:27:56,981|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:259)
> 2018-06-12 04:27:56,981|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:395)
> 2018-06-12 04:27:56,982|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.wal.AbstractFSWALProvider.openReader(AbstractFSWALProvider.java:449)
> 2018-06-12 04:27:56,982|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.mapreduce.WALInputFormat$WALRecordReader.openReader(WALInputFormat.java:166)
> 2018-06-12 04:27:56,982|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.hbase.mapreduce.WALInputFormat$WALRecordReader.initialize(WALInputFormat.java:158)
> 2018-06-12 04:27:56,983|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
> 2018-06-12 04:27:56,983|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
> 2018-06-12 04:27:56,983|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
> 2018-06-12 04:27:56,983|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
> 2018-06-12 04:27:56,983|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> java.security.AccessController.doPrivileged(Native Method)
> 2018-06-12 04:27:56,984|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> javax.security.auth.Subject.doAs(Subject.java:422)
> 2018-06-12 04:27:56,984|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1686)
> 2018-06-12 04:27:56,984|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|at 
> org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
> 2018-06-12 04:27:56,984|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|
> {code}
> But it looks like the job did find the file on HDFS earlier, when input
> splits were computed; the test script runs the incremental backup command as
> the HBase user.
> {code}
> 2018-06-12 04:27:30,756|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:30,755 
> DEBUG [main] mapreduce.WALInputFormat: Scanning 
> hdfs://ns1/apps/hbase/data/oldWALs/ctr-e138-1518143905142-359429-01-000004.hwx.site%2C16020%2C1528776085205.1528776160915
>  for WAL files
> 2018-06-12 04:27:30,758|INFO|MainThread|machine.py:167 - 
> run()||GUID=cb1d85c9-023c-4bc5-bf87-9c231c917eab|2018-06-12 04:27:30,758 INFO 
>  [main] mapreduce.WALInputFormat: Found: 
> HdfsLocatedFileStatus{path=hdfs://ns1/apps/hbase/data/oldWALs/ctr-e138-1518143905142-359429-01-000004.hwx.site%2C16020%2C1528776085205.1528776160915;
>  isDirectory=false; length=18031; replication=3; blocksize=268435456; 
> modification_time=1528776689363; access_time=1528776160921; owner=hbase; 
> group=hdfs; permission=rwx--x--x; isSymlink=false; hasAcl=false; 
> isEncrypted=false; isErasureCoded=false}
> {code}
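The two excerpts together show the suspicious pattern: the WAL file under oldWALs is listed by WALInputFormat when splits are computed, but is gone by the time the mapper's WALRecordReader (via AbstractFSWALProvider.openReader) tries to open it, which points at a race with WAL archiving/cleanup rather than a permissions problem. As a hedged illustration only, and not the actual HBASE-21077 patch, the sketch below shows the general fallback idea of re-resolving a missing WAL under the archive directory on open; it simulates the race with the local filesystem, and the helper name and directory layout are hypothetical:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WalFallbackSketch {

    // Hypothetical helper (not from the patch): if the file at 'original'
    // has disappeared, look for a file with the same name under
    // 'archiveDir', mirroring how a WAL reader can recover when the log it
    // was handed has been moved between split computation and open.
    static Path resolveWithArchiveFallback(Path original, Path archiveDir)
            throws IOException {
        if (Files.exists(original)) {
            return original;
        }
        Path archived = archiveDir.resolve(original.getFileName());
        if (Files.exists(archived)) {
            return archived;
        }
        throw new IOException("File does not exist: " + original);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("wal-sketch");
        Path wals = Files.createDirectories(tmp.resolve("WALs"));
        Path oldWals = Files.createDirectories(tmp.resolve("oldWALs"));
        Path wal = Files.createFile(
            wals.resolve("host%2C16020%2C1528776085205.1528776160915"));

        // Simulate the race: the WAL is archived after the job listed it
        // but before the record reader opens it.
        Files.move(wal, oldWals.resolve(wal.getFileName()));

        Path resolved = resolveWithArchiveFallback(wal, oldWals);
        System.out.println(resolved.getParent().getFileName()); // prints "oldWALs"
    }
}
```

In HBase itself this kind of fallback lives on the read path; a fix for the reported failure could also need to handle the case where the file has been removed entirely, not just moved.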



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
