Thanks guys, I just figured it out. Looking closer at the master logs I saw that the classpath was including every jar in my Maven repository from $HOME/.m2/*. I have no idea why this is happening though, since I haven't touched any properties or environment variables that would cause it. Very strange. I moved that directory out of the way and the master started fine.
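For anyone debugging something similar: a toy sketch (throwaway dirs, not the real bin/hbase) of how a launcher loop that recursively globs jars under a directory can quietly sweep an entire Maven repo onto the classpath:

```shell
#!/bin/sh
# Toy illustration only: a classpath-building loop that recurses into a
# directory appends every jar it finds, which is how a stray $HOME/.m2
# entry can pull in a whole Maven repository.
REPO=$(mktemp -d)                      # stands in for $HOME/.m2
mkdir -p "$REPO/org/foo" "$REPO/org/bar"
touch "$REPO/org/foo/foo-1.0.jar" "$REPO/org/bar/bar-2.0.jar"

CLASSPATH=""
for jar in $(find "$REPO" -name '*.jar'); do
  CLASSPATH="$CLASSPATH:$jar"          # every jar found is appended
done
echo "$CLASSPATH"
```

If your bin/hbase version supports the `classpath` subcommand, something like `bin/hbase classpath | tr ':' '\n' | grep .m2` is a quick way to check whether the launcher script itself is the culprit.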
I can debug how the classpath gets set, but any ideas?

On Mon, Jan 10, 2011 at 10:29 PM, Stack <[email protected]> wrote:
> For sure you removed the old hadoop from hbase/lib?
>
> On Mon, Jan 10, 2011 at 10:12 PM, Bill Graham <[email protected]> wrote:
>> Thanks for the quick reply Todd. I did that before I first tried
>> starting HBase, but I'm still seeing the issues. Any other
>> suggestions?
>>
>> On Mon, Jan 10, 2011 at 10:00 PM, Todd Lipcon <[email protected]> wrote:
>>> Hi Bill,
>>> You simply need to replace the hadoop "core" jar in your HBase lib/
>>> directory with the same version from your HDFS install.
>>> -Todd
>>>
>>> On Mon, Jan 10, 2011 at 9:45 PM, Bill Graham <[email protected]> wrote:
>>>> Hi,
>>>>
>>>> Today I upgraded from Hadoop 0.20.1 to CDH3b2 0.20.2 to get the append
>>>> functionality that HBase requires and now I can't start HBase. Hadoop
>>>> and HDFS seem to be working just fine, but when I start up the HBase
>>>> master, I get this error in the DNs:
>>>>
>>>> 2011-01-10 21:20:36,134 ERROR
>>>> org.apache.hadoop.hdfs.server.datanode.DataNode:
>>>> DatanodeRegistration(*.*.*.*:50010,
>>>> storageID=DS-662249796-*.*.*.*-50010-1279065963700, infoPort=50075,
>>>> ipcPort=50020):DataXceiver
>>>> java.io.IOException: Version Mismatch
>>>>         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:95)
>>>>
>>>> And this on the HBase master:
>>>>
>>>> 2011-01-10 21:20:39,139 FATAL org.apache.hadoop.hbase.master.HMaster:
>>>> Unhandled exception. Starting shutdown.
>>>> java.io.IOException: Could not obtain block:
>>>> blk_4116902588460384179_696504 file=/hbase-app/hbase/hbase.version
>>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1864)
>>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1690)
>>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1819)
>>>>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1747)
>>>>         at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:320)
>>>>         at java.io.DataInputStream.readUTF(DataInputStream.java:572)
>>>>         at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:152)
>>>>         at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:173)
>>>>         at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:226)
>>>>         at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:104)
>>>>         at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:89)
>>>>         at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:338)
>>>>         at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:274)
>>>>
>>>> Any ideas where to look? I'm kinda at a loss here, since I'm fairly
>>>> certain the versions are all in sync. I'm able to browse HDFS in the
>>>> UI and copyToLocal the /hbase-app/hbase/hbase.version file without
>>>> problems.
>>>>
>>>> thanks,
>>>> Bill
>>>
>>> --
>>> Todd Lipcon
>>> Software Engineer, Cloudera
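For anyone who hits the same Version Mismatch later: the jar swap Todd describes can be sketched roughly as below. The demo uses throwaway dirs, and HBASE_HOME, HADOOP_HOME, and the version strings are illustrative assumptions, not paths from this thread.

```shell
#!/bin/sh
# Sketch of replacing HBase's bundled hadoop core jar with the one the
# cluster actually runs, so the HDFS client and DataNodes agree on the
# data-transfer protocol version. All paths/versions here are made up.
HBASE_HOME=$(mktemp -d)
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HBASE_HOME/lib"
touch "$HBASE_HOME/lib/hadoop-0.20.1-core.jar"   # stale jar HBase shipped with
touch "$HADOOP_HOME/hadoop-core-0.20.2.jar"      # jar HDFS is actually running

rm -f "$HBASE_HOME"/lib/hadoop-*core*.jar        # remove the bundled jar
cp "$HADOOP_HOME"/hadoop-*core*.jar "$HBASE_HOME/lib/"  # install the matching one
ls "$HBASE_HOME/lib"
```

The "Version Mismatch" thrown in DataXceiver.run is the DataNode rejecting a client whose data-transfer protocol version differs from its own, which is why a stale hadoop jar anywhere on the master's classpath reproduces the error even when the rest of the cluster is in sync.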
