Could anyone recommend a compatible Hadoop version for HBase 0.94? Should I also upgrade ZooKeeper (currently 3.3.2)?
Thanks.

On Tue, May 29, 2012 at 5:10 PM, Amit Sela <[email protected]> wrote:

> I'm not sure Hadoop 0.20.3 is compatible with HBase 0.94, but I can't find
> any documentation about it.
>
> On Tue, May 29, 2012 at 4:40 PM, Marcos Ortiz <[email protected]> wrote:
>
>> Are you sure that 0.94 is compatible with Hadoop 0.20.3?
>>
>> On 05/29/2012 09:13 AM, Amit Sela wrote:
>>
>>> Hi all,
>>>
>>> I just upgraded from HBase 0.90.2 to 0.94 (running on Hadoop 0.20.3).
>>>
>>> The cluster seems to be up and running.
>>>
>>> I tried running an old MR job that writes into HBase. After the map phase
>>> completed (map 100%) and before the reduce phase began (reduce 0%), I got
>>> the following exception:
>>>
>>> 12/05/29 12:32:02 INFO mapred.JobClient: Task Id :
>>> attempt_201205291226_0001_r_000000_0, Status : FAILED
>>> Error: java.lang.ClassNotFoundException: com.google.protobuf.Message
>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>   at java.security.AccessController.doPrivileged(Native Method)
>>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>   at org.apache.hadoop.hbase.io.HbaseObjectWritable.<clinit>(HbaseObjectWritable.java:263)
>>>   at org.apache.hadoop.hbase.ipc.Invocation.write(Invocation.java:138)
>>>   at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.sendParam(HBaseClient.java:537)
>>>   at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:898)
>>>   at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
>>>   at $Proxy2.getProtocolVersion(Unknown Source)
>>>   at org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:183)
>>>   at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:303)
>>>   at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:280)
>>>   at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:332)
>>>   at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:236)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1284)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1240)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1227)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:936)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:832)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:801)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:933)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:836)
>>>   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:801)
>>>   at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
>>>   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
>>>   at org.apache.hadoop.hbase.client.HTableFactory.createHTableInterface(HTableFactory.java:36)
>>>   at org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:268)
>>>   at org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:198)
>>>   at org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:173)
>>>   at com.infolinks.hadoop.commons.hbase.HBaseOperations.getTable(HBaseOperations.java:118)
>>>   at com.infolinks.hadoop.framework.HBaseReducer.setup(HBaseReducer.java:30)
>>>   at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:174)
>>>   at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:566)
>>>   at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
>>>   at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>>
>>> I saw something about the dependencies in HBASE-5497 and HBASE-5460, but
>>> I understand that those were fixed in 0.94...
>>>
>>> Any ideas?
>>>
>>> Thanks,
>>> Amit.
>>
>> --
>> Marcos Luis Ortíz Valmaseda
>> Data Engineer && Sr. System Administrator at UCI
>> http://marcosluis2186.posterous.com
>> http://www.linkedin.com/in/marcosluis2186
>> Twitter: @marcosluis2186
>>
>> 10th ANNIVERSARY OF THE FOUNDING OF THE UNIVERSITY OF INFORMATICS
>> SCIENCES... CONNECTED TO THE FUTURE, CONNECTED TO THE REVOLUTION
>>
>> http://www.uci.cu
>> http://www.facebook.com/universidad.uci
>> http://www.flickr.com/photos/universidad_uci
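For reference, a common cause of this ClassNotFoundException is that the protobuf jar HBase 0.94 depends on never reaches the MapReduce task classpath. A minimal sketch of a workaround, assuming the jar ships under $HBASE_HOME/lib (the exact jar filename, myjob.jar, and com.example.MyJob are placeholders, not details from this thread):

```shell
# Locate the protobuf jar bundled with HBase (the version suffix varies by release).
PROTOBUF_JAR="$(ls "$HBASE_HOME"/lib/protobuf-java-*.jar | head -n 1)"

# Make it visible to the client JVM that submits the job...
export HADOOP_CLASSPATH="$PROTOBUF_JAR:$HADOOP_CLASSPATH"

# ...and ship it to the map/reduce tasks via the distributed cache.
# (-libjars only works if the job parses generic options, e.g. via ToolRunner.)
hadoop jar myjob.jar com.example.MyJob \
  -libjars "$PROTOBUF_JAR" \
  input output
```

Alternatively, calling TableMapReduceUtil.addDependencyJars(job) during job setup ships HBase's own dependency jars (protobuf included) with the job automatically.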
