Yup, that makes sense. But when I try opening that file with:

hadoop fs -text
/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2

I can see the file's contents. So what exactly is wrong with that file? And is
there any way I can fix it using some script?




On Mon, Aug 6, 2012 at 5:27 PM, Bejoy KS <bejoy...@yahoo.com> wrote:

>
> It could be that the file corresponding to the partition dt='20120731' got
> corrupted.
>
> This file, as pointed out in the error logs, should be the culprit.
>
> hdfs://ares-nn/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2
>
>
>
>
> Regards
> Bejoy KS
>
> Sent from handheld, please excuse typos.
> ------------------------------
> *From: * Techy Teck <comptechge...@gmail.com>
> *Date: *Mon, 6 Aug 2012 14:53:57 -0700
> *To: *<user@hive.apache.org>
> *ReplyTo: * user@hive.apache.org
> *Subject: *Caused by: java.io.EOFException
>
> I am writing a simple query on our hive table and I am getting some
> exception-
>
> select count(*) from table1 where dt='20120731';
>
>
>
> java.io.IOException: IO error in map input file
> hdfs://ares-nn/apps/hdmi-technology/b_apdpds/real-time_new/20120731/PDS_HADOOP_REALTIME_EXPORT-part-3-2
>
>         at
> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:220)
>
>         at
> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:197)
>
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
>
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:403)
>
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:337)
>
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:242)
>
>         at java.security.AccessController.doPrivileged(Native Method)
>
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
>
>         at org.apache.hadoop.mapred.Child.main(Child.java:236)
>
> *Caused by: java.io.EOFException*
>
>         at java.io.DataInputStream.readFully(DataInputStream.java:180)
>
>         at
> org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
>
>         at
> org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
>
>         at
> org.apache.hadoop.io.SequenceFile$Reader.readBuffer(SequenceFile.java:1646)
>
>         at
> org.apache.hadoop.io.SequenceFile$Reader.seekToCurrentValue(SequenceFile.java:1712)
>
>         at
> org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:1787)
>
>         at
> org.apache.hadoop.mapred.SequenceFileRecordReader.getCurrentValue(SequenceFileRecordReader.java:103)
>
>         at
> org.apache.hadoop.mapred.SequenceFileRecordReader.next(SequenceFileRecordReader.java:78)
>
>         at
> org.apache.hadoop.hive.ql.io.HiveRecordReader.next(HiveRecordReader.java:67)
>
>         at
> org.apache.hadoop.hive.ql.io.HiveRecordReader.next(HiveRecordReader.java:33)
>
>         at
> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:215)
>
>         ... 9 more
>
>
>
>
> Can anyone tell me what *Caused by: java.io.EOFException* means
> here? And when I run the same query for a different date (dt), it works
> fine.
>
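The *Caused by: java.io.EOFException* in the trace above comes from DataInputStream.readFully: the length recorded for the current value says N bytes follow, but the file ends before N bytes arrive, which is the classic signature of a truncated file. That would also explain why `hadoop fs -text` appears to work: it streams records until it hits the bad spot, while the MapReduce scan must read the entire split. A minimal, Hadoop-free illustration of the failure mode, again using a hypothetical length-prefixed layout rather than the real SequenceFile format:

```python
import io
import struct

def read_fully(stream, n):
    """Mimic java.io.DataInputStream.readFully: return exactly n bytes
    or raise EOFError, as readFully raises EOFException."""
    buf = stream.read(n)
    if len(buf) < n:
        raise EOFError("expected %d bytes, got %d" % (n, len(buf)))
    return buf

# Two complete records, then a record whose declared length exceeds
# the bytes actually present (a truncated tail).
data = (struct.pack(">I", 3) + b"foo"
        + struct.pack(">I", 3) + b"bar"
        + struct.pack(">I", 100) + b"trunc")  # claims 100, has 5

stream = io.BytesIO(data)
records = []
try:
    while True:
        header = stream.read(4)
        if not header:
            break
        (length,) = struct.unpack(">I", header)
        records.append(read_fully(stream, length))
except EOFError:
    pass  # the leading records were readable; the tail was not

# records is now [b"foo", b"bar"] even though reading the file "failed"
```

So seeing contents with `hadoop fs -text` does not rule out corruption; only a full scan of the file (which is what the failing map task does) exercises the damaged tail.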
