AFAIK, the TaskTracker will load your job archive (JAR) automatically while
running the map/reduce task.

On Tue, Sep 9, 2008 at 10:28 PM, Ryan LeCompte <[EMAIL PROTECTED]> wrote:

> Based on some similar problems that I found others were having in the
> mailing lists, it looks like the solution was to list my Map/Reduce
> job JAR in the conf/hadoop-env.sh file under HADOOP_CLASSPATH. After
> doing that and re-submitting the job, it all worked fine! I guess the
> MapWritable class somehow doesn't share the same classpath as the
> program that actually submits the job conf. Is this expected?
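>
> For reference, a minimal sketch of the change, assuming the job JAR
> lives at /path/to/my-job.jar (a hypothetical path):
>
>   # conf/hadoop-env.sh
>   # Put the job JAR on the classpath so task-side deserialization
>   # can find the custom Writable classes. Path is illustrative.
>   export HADOOP_CLASSPATH=/path/to/my-job.jar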
>
> Thanks,
> Ryan
>
>
> On Tue, Sep 9, 2008 at 9:44 AM, Ryan LeCompte <[EMAIL PROTECTED]> wrote:
> > Okay, I think I'm getting closer but now I'm running into another
> problem.
> >
> > First off, I created my own CustomMapWritable that extends MapWritable
> > and invokes AbstractMapWritable.addToMap() to register my custom classes.
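> >
> > Roughly, the subclass looks like this (just a sketch; the value class
> > name matches the one in the stack trace below):
> >
> >   import org.apache.hadoop.io.MapWritable;
> >
> >   public class CustomMapWritable extends MapWritable {
> >       public CustomMapWritable() {
> >           super();
> >           // addToMap() registers the class with a byte id so that
> >           // instances can be (de)serialized inside the map.
> >           addToMap(CustomStatsWritable.class);
> >       }
> >   }
> >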
> > Now the map/reduce phases actually complete and the job as a whole
> > succeeds. However, when I later try to use the SequenceFile API to
> > read the output data, I get a strange exception. First, the code:
> >
> > // Open the job output and read back the first key/value pair.
> > FileSystem fileSys = FileSystem.get(conf);
> > SequenceFile.Reader reader = new SequenceFile.Reader(fileSys, inFile, conf);
> > Text key = new Text();
> > CustomWritable stats = new CustomWritable();
> > reader.next(key, stats);
> > reader.close();
> >
> > And now the exception that's thrown:
> >
> > java.io.IOException: can't find class: com.test.CustomStatsWritable because com.test.CustomStatsWritable
> >        at org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java:210)
> >        at org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:145)
> >        at com.test.CustomStatsWritable.readFields(UserStatsWritable.java:49)
> >        at org.apache.hadoop.io.SequenceFile$Reader.getCurrentValue(SequenceFile.java:1751)
> >        at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1879)
> > ...
> >
> > Any ideas?
> >
> > Thanks,
> > Ryan
> >
> >
> > On Tue, Sep 9, 2008 at 12:36 AM, Ryan LeCompte <[EMAIL PROTECTED]>
> wrote:
> >> Hello,
> >>
> >> I'm attempting to use a SortedMapWritable with a LongWritable as the
> >> key and a custom implementation of org.apache.hadoop.io.Writable as
> >> the value. My program works fine when I use a built-in Writable
> >> wrapper (e.g. Text) as the value, but fails with the following
> >> exception when I use my custom Writable instance (a rough usage
> >> snippet follows the trace):
> >>
> >> 2008-09-08 23:25:02,072 INFO org.apache.hadoop.mapred.ReduceTask: Initiating in-memory merge with 1 segments...
> >> 2008-09-08 23:25:02,077 INFO org.apache.hadoop.mapred.Merger: Merging 1 sorted segments
> >> 2008-09-08 23:25:02,077 INFO org.apache.hadoop.mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 5492 bytes
> >> 2008-09-08 23:25:02,099 WARN org.apache.hadoop.mapred.ReduceTask: attempt_200809082247_0005_r_000000_0 Merge of the inmemory files threw an exception: java.io.IOException: Intermedate merge failed
> >>        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$InMemFSMergeThread.doInMemMerge(ReduceTask.java:2133)
> >>        at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$InMemFSMergeThread.run(ReduceTask.java:2064)
> >> Caused by: java.lang.RuntimeException: java.lang.NullPointerException
> >>        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:80)
> >>        at org.apache.hadoop.io.SortedMapWritable.readFields(SortedMapWritable.java:179)
> >>        ...
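> >>
> >> For reference, the usage is roughly the following (names are
> >> illustrative):
> >>
> >>   SortedMapWritable map = new SortedMapWritable();
> >>   // Writing the map from the mapper works; the failure happens when
> >>   // the reduce side deserializes it via readFields().
> >>   map.put(new LongWritable(1L), new CustomStatsWritable());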
> >>
> >> I noticed that the AbstractMapWritable class has a protected
> >> "addToMap(Class clazz)" method. Do I somehow need to let my
> >> SortedMapWritable instance know about my custom Writable value class?
> >> I've implemented the custom Writable properly as far as I can tell; it
> >> just contains a few primitives, like longs and ints (a minimal sketch
> >> follows).
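> >>
> >> For illustration, a minimal sketch of such a Writable (field names
> >> are made up; the real class just holds a few primitives):
> >>
> >>   import java.io.DataInput;
> >>   import java.io.DataOutput;
> >>   import java.io.IOException;
> >>   import org.apache.hadoop.io.Writable;
> >>
> >>   public class CustomStatsWritable implements Writable {
> >>       private long total;
> >>       private int count;
> >>
> >>       // The implicit no-arg constructor is required so Hadoop can
> >>       // instantiate the class reflectively during deserialization.
> >>
> >>       public void write(DataOutput out) throws IOException {
> >>           out.writeLong(total);
> >>           out.writeInt(count);
> >>       }
> >>
> >>       public void readFields(DataInput in) throws IOException {
> >>           total = in.readLong();
> >>           count = in.readInt();
> >>       }
> >>   }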
> >>
> >> Any insight is appreciated.
> >>
> >> Thanks,
> >> Ryan
> >>
> >
>



-- 
朱盛凯 (Jash Zhu)
Software School, Fudan University