Hi all, when I query larger datasets, the job fails with the issue below:
2014-11-03 13:40:15,279 INFO datanode.DataNode - Exception for BP-1442477155-10.28.12.10-1391025835784:blk_1096748808_1099611172702
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:194)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:446)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:702)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:711)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:124)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:229)
    at java.lang.Thread.run(Thread.java:724)
2014-11-03 13:40:15,279 INFO datanode.DataNode - PacketResponder: BP-1442477155-10.28.12.10-1391025835784:blk_1096748808_1099611172702, type=HAS_DOWNSTREAM_IN_PIPELINE: Thread is interrupted.
2014-11-03 13:40:15,279 INFO datanode.DataNode - PacketResponder: BP-1442477155-10.28.12.10-1391025835784:blk_1096748808_1099611172702, type=HAS_DOWNSTREAM_IN_PIPELINE terminating
2014-11-03 13:40:15,279 INFO datanode.DataNode - opWriteBlock BP-1442477155-10.28.12.10-1391025835784:blk_1096748808_1099611172702 received exception java.io.IOException: Premature EOF from inputStream
2014-11-03 13:40:15,280 ERROR datanode.DataNode - task4-14.sj2.net:50010:DataXceiver error processing WRITE_BLOCK operation src: /10.29.15.23:51767 dest: /10.29.15.23:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:194)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:446)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:702)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:711)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:124)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:229)

Any pointers on this type of issue? Thanks.
