Anybody? Alrighty then... back to more debugging :-)
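One cheap thing to rule out first (a guess, not a confirmed diagnosis): completebulkload tries to deserialize a fixed trailer from every file it finds under the column-family directories, so a stray non-HFile there (a `_SUCCESS` marker, a `_logs` directory, or a zero-length part file left by a killed task) can fail with exactly this "Trailer 'header' is wrong" message. A minimal local sketch of that check; the output path and class name here are hypothetical, not part of HBase:

```java
import java.io.File;

public class BulkLoadSanityCheck {

    /**
     * Returns a human-readable reason why this file cannot be a valid HFile,
     * or null if it at least looks plausible. LoadIncrementalHFiles reads a
     * fixed trailer from every file under the family directories, so any
     * stray non-HFile there fails trailer deserialization.
     */
    static String suspect(File f) {
        if (f.isDirectory()) return "directory (e.g. _logs) inside a family dir";
        if (f.getName().startsWith("_")) return "job marker such as _SUCCESS";
        if (f.length() == 0) return "zero-length file";
        return null;
    }

    public static void main(String[] args) {
        // Hypothetical local copy of the HFileOutputFormat output family dir.
        File familyDir = new File(args.length > 0 ? args[0] : "hfile-output/info");
        File[] files = familyDir.listFiles();
        if (files == null) {
            System.out.println("not a directory: " + familyDir);
            return;
        }
        for (File f : files) {
            String why = suspect(f);
            System.out.println(f.getName() + " -> " + (why == null ? "looks ok" : why));
        }
    }
}
```

If everything under the family dirs passes a check like this, the corruption is more likely inside the HFiles themselves rather than from stray files.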

On Thu, May 17, 2012 at 5:06 PM, Something Something <
[email protected]> wrote:

> HBase Version:  hbase-0.90.4-cdh3u3
>
> Hadoop Version:  hadoop-0.20.2-cdh3u2
>
>
> 12/05/17 16:37:47 ERROR mapreduce.LoadIncrementalHFiles: IOException during splitting
> java.util.concurrent.ExecutionException: java.io.IOException: Trailer 'header' is wrong; does the trailer size match content?
>         at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
>         at java.util.concurrent.FutureTask.get(FutureTask.java:83)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:333)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:233)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:696)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:701)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>         at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:49)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
> Caused by: java.io.IOException: Trailer 'header' is wrong; does the trailer size match content?
>         at org.apache.hadoop.hbase.io.hfile.HFile$FixedFileTrailer.deserialize(HFile.java:1527)
>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readTrailer(HFile.java:885)
>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.loadFileInfo(HFile.java:819)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:405)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:323)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:321)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:680)
> Exception in thread "main" java.io.IOException: Trailer 'header' is wrong; does the trailer size match content?
>         at org.apache.hadoop.hbase.io.hfile.HFile$FixedFileTrailer.deserialize(HFile.java:1527)
>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.readTrailer(HFile.java:885)
>         at org.apache.hadoop.hbase.io.hfile.HFile$Reader.loadFileInfo(HFile.java:819)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:405)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:323)
>         at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:321)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:680)
>
>
>
>
>
> On Thu, May 17, 2012 at 4:55 PM, Ted Yu <[email protected]> wrote:
>
>> Can you post the complete message ?
>>
>> What HBase version are you using ?
>>
>> On Thu, May 17, 2012 at 4:48 PM, Something Something <
>> [email protected]> wrote:
>>
>> > Hello,
>> >
>> > I keep getting this message while running the 'completebulkload'
>> process.
>> > I tried the following solutions that I came across while Googling for
>> this
>> > error:
>> >
>> > 1)  setReduceSpeculativeExecution(true)
>> >
>> > 2)  Made sure that none of the tasks are failing.
>> >
>> > 3)  The HFileOutput job runs successfully.
>> >
>> > 4)  The first 2 lines in the output from HFileOutput look like this:
>> >
>> > 2b 39 4c 39 39 2f 2b 2b 4d 57 54 37 66 32 2b 32 2a 31 2a 31 35 33 33 34 37 2a 34 39 39 39       row=+9L99/++MWT7f2+2*1*153347*4999, families={(family=info, keyvalues=(+9L99/++MWT7f2+2*1*153347*4999/info:frequency/9223372036854775807/Put/vlen=1)}
>> > 2b 39 4c 39 39 2f 2b 2b 4d 57 54 37 66 32 2b 32 2a 31 2a 31 35 33 33 34 38 2a 34 39 39 39       row=+9L99/++MWT7f2+2*1*153348*4999, families={(family=info, keyvalues=(+9L99/++MWT7f2+2*1*153348*4999/info:frequency/9223372036854775807/Put/vlen=1)}
>> >
>> >
>> > 5)  My Mapper for HFileOutput looks like this:
>> >
>> >    public static class MyMapper extends MapReduceBase
>> >            implements Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
>> >
>> >        @Override
>> >        public void map(LongWritable key, Text value,
>> >                OutputCollector<ImmutableBytesWritable, Put> output,
>> >                Reporter reporter) throws IOException {
>> >            String[] values = value.toString().split("\t");
>> >            String key1 = values[0];
>> >            String value1 = values[1];
>> >
>> >            ImmutableBytesWritable ibw = new ImmutableBytesWritable(key1.getBytes());
>> >            Put put = new Put(Bytes.toBytes(key1));
>> >            put.add(Bytes.toBytes("info"), Bytes.toBytes("frequency"),
>> >                    Bytes.toBytes(value1));
>> >            output.collect(ibw, put);
>> >        }
>> >    }
>> >
>> >
>> > Any ideas what could be wrong?  Thanks for your help.
>> >
>>
>
>
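One small inconsistency in the quoted mapper, which may or may not relate to the trailer error: the map output key is built with `key1.getBytes()` (JVM default charset) while the `Put` row uses `Bytes.toBytes(key1)` (always UTF-8). For non-ASCII keys these produce different byte arrays, and HFileOutputFormat relies on map output keys sorting exactly like the rows actually written. A tiny standalone demonstration of the mismatch; the class name is made up, and `StandardCharsets.ISO_8859_1` stands in for a non-UTF-8 platform default:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class KeyConsistency {

    // Encode a row key the way org.apache.hadoop.hbase.util.Bytes.toBytes(String)
    // does: explicitly as UTF-8, independent of the platform default charset.
    static byte[] rowKey(String key) {
        return key.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String key = "München";                                    // any non-ASCII row key
        byte[] utf8 = rowKey(key);                                 // what the Put row stores
        byte[] latin1 = key.getBytes(StandardCharsets.ISO_8859_1); // a possible platform default
        // If the JVM default charset were Latin-1, key.getBytes() and
        // Bytes.toBytes(key) would disagree, so the sort key used by
        // HFileOutputFormat would diverge from the row actually written.
        System.out.println(Arrays.equals(utf8, latin1)); // false for non-ASCII keys
    }
}
```

Using `Bytes.toBytes(key1)` for both the `ImmutableBytesWritable` and the `Put` removes the ambiguity either way.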
