[ https://issues.apache.org/jira/browse/HDFS-872?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12799580#action_12799580 ]
Todd Lipcon commented on HDFS-872:
----------------------------------
I have a patch that fixes HDFS-101 and is also 0.20.1-compatible. I'm still in
the midst of running it through all the unit tests, but manual tests for
pipeline recovery look good. I'll upload it here once it's been vetted more
thoroughly.
> DFSClient 0.20.1 is incompatible with HDFS 0.20.2
> -------------------------------------------------
>
> Key: HDFS-872
> URL: https://issues.apache.org/jira/browse/HDFS-872
> Project: Hadoop HDFS
> Issue Type: Bug
> Affects Versions: 0.20.1, 0.20.2
> Reporter: Bassam Tabbara
> Fix For: 0.20.2
>
>
> After upgrading to the latest HDFS 0.20.2 (r896310 from
> /branches/branch-0.20), old DFS clients (0.20.1) no longer seem to work.
> HBase uses the 0.20.1 hadoop core jars, and the HBase master will no longer
> start up. Here is the exception from the HBase master log:
> {code}
> 2010-01-06 09:59:46,762 WARN org.apache.hadoop.hdfs.DFSClient: DFS Read:
> java.io.IOException: Could not obtain block: blk_3380512596555557728_1002 file=/hbase/hbase.version
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1788)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1616)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1743)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1673)
>         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:320)
>         at java.io.DataInputStream.readUTF(DataInputStream.java:572)
>         at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:189)
>         at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:208)
>         at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:208)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1241)
>         at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1282)
> 2010-01-06 09:59:46,763 FATAL org.apache.hadoop.hbase.master.HMaster: Not starting HMaster because:
> java.io.IOException: Could not obtain block: blk_3380512596555557728_1002 file=/hbase/hbase.version
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1788)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1616)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1743)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1673)
>         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:320)
>         at java.io.DataInputStream.readUTF(DataInputStream.java:572)
>         at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:189)
>         at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:208)
>         at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:208)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1241)
>         at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1282)
> {code}
> If I replace the hadoop jars in the hbase/lib directory with the 0.20.2
> versions, it works fine, which is what led me to open this bug here rather
> than in the HBASE project.
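> For reference, here is a minimal sketch of the same read path outside of HBase: open /hbase/hbase.version with the 0.20.1 client against the upgraded 0.20.2 cluster and read the version string, roughly what FSUtils.getVersion does in the trace above. The class name and namenode URI are placeholders, not from this report.
> {code}
> // Minimal reproduction sketch (assumptions: "ReadHBaseVersion" and the
> // namenode address are placeholders; run with the 0.20.1 hadoop-core jar
> // on the classpath against the upgraded 0.20.2 cluster).
> import java.io.DataInputStream;
> import java.io.IOException;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class ReadHBaseVersion {
>   public static void main(String[] args) throws IOException {
>     Configuration conf = new Configuration();
>     // Placeholder namenode URI; point it at the upgraded cluster.
>     conf.set("fs.default.name", "hdfs://namenode:8020");
>     FileSystem fs = FileSystem.get(conf);
>     // fs.open() returns an FSDataInputStream, which is a DataInputStream.
>     DataInputStream in = fs.open(new Path("/hbase/hbase.version"));
>     try {
>       // readUTF() is where the "Could not obtain block" IOException
>       // surfaces in the stack trace above.
>       System.out.println("hbase.version = " + in.readUTF());
>     } finally {
>       in.close();
>     }
>   }
> }
> {code}
> With the 0.20.2 jars on the classpath instead, the same read succeeds, which matches the jar-swap workaround described above.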