Thank you for your quick reply. We are on CDH 5, and we store the files directly in HBase, not just their paths. I have read some HBase schema-design material which recommends that, for large files, you keep only the path in HBase and put the real content in an HDFS SequenceFile, but I thought 20 MB was not too big. I am downloading those logs now and will send them to you later. Where can I find HBASE-11339 (HBase MOB)?
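The schema-design recommendation mentioned above (store the bytes outside HBase, index only the path) can be sketched roughly as below. This is a minimal, hedged illustration: a local temp directory stands in for HDFS and an in-memory map stands in for the HBase table, since the real code would use org.apache.hadoop.fs.FileSystem and the HBase client's Table.put() against a running cluster. The class and method names here are made up for illustration only.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Sketch of the "path in HBase, content in HDFS" pattern.
// Stand-ins: a local directory plays the role of HDFS, and a
// HashMap plays the role of an HBase table with a 'cf:path' column.
public class PathInHBaseSketch {
    // rowKey -> stored file path (stand-in for the 'cf:path' cell)
    private final Map<String, String> table = new HashMap<>();
    private final Path storageRoot;

    public PathInHBaseSketch(Path storageRoot) {
        this.storageRoot = storageRoot;
    }

    // Write the large payload to the file store; index only its path,
    // so the HBase row stays small regardless of payload size.
    public void put(String rowKey, byte[] payload) throws IOException {
        Path file = storageRoot.resolve(rowKey + ".bin");
        Files.write(file, payload);
        table.put(rowKey, file.toString());
    }

    // Look up the path in the index, then read the bytes from the store.
    public byte[] get(String rowKey) throws IOException {
        String path = table.get(rowKey);
        return Files.readAllBytes(Path.of(path));
    }
}
```

The point of the pattern is that HBase only ever sees a short string per row, so 20 MB blobs never travel through the region server write path that is failing in the stack trace below.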
Date: Tue, 16 Sep 2014 19:34:20 -0700
Subject: Re: hbase use error
From: [email protected]
To: [email protected]
CC: [email protected]

Which Hadoop release are you using? Can you pastebin more of the server logs?

bq. load file larger than 20M

Do you store such file(s) directly on HDFS and put the path in HBase? See HBASE-11339 (HBase MOB).

On Tue, Sep 16, 2014 at 7:29 PM, QiXiangming <[email protected]> wrote:

hello, everyone

I use HBase to store small pictures or files, and I hit an exception raised from HDFS, as follows:

slave2:50010:DataXceiver error processing WRITE_BLOCK operation src: /192.168.20.246:33162 dest: /192.168.20.247:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:194)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:446)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:702)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:711)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:124)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:229)
    at java.lang.Thread.run(Thread.java:745)

When HBase stores pictures or files under 200 KB it works well, but if you load a file larger than 20 MB, HBase definitely goes down! What is wrong with it? Can anyone help us? URGENT!!!

Qi Xiangming
