Hello W.P.,

Could you paste your exact exception/strace?

I tried to reproduce your issue and failed both on 0.20 and trunk.
Here is the code I used to imitate yours (mostly dummy stuff):
https://gist.github.com/1210817

Perhaps I am missing some step that you are doing.

My suspicion is that your ItemSet import may be wrong. Or perhaps a
different, merely Serializable class with a similar name is getting
loaded instead of the WritableComparable one when you run your job, due
to an improper import statement.
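
To illustrate, here is a contrived sketch of the kind of mix-up I mean
(the package names and the driver class are made up; only ItemSet
matches your code):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    // Wrong: a similarly named, plain java.io.Serializable class that a
    // careless auto-import can pull in from another package.
    // import com.example.model.ItemSet;

    // Right: the class that actually implements WritableComparable<ItemSet>.
    import com.example.writables.ItemSet;

    public class ItemSetJobDriver {
      public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "itemset-job");
        // With the wrong import above this still compiles, but anything
        // that later instantiates the key reflectively (including
        // "hadoop fs -text") gets a class that is not a WritableComparable.
        job.setOutputKeyClass(ItemSet.class);
      }
    }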

On Mon, Sep 12, 2011 at 12:49 AM, W.P. McNeill <[email protected]> wrote:
> With a bit more debugging I may have partially answered question (3) for
> myself.
>
> When I run a Hadoop job (as opposed to running "hadoop fs -text"), an
> ItemSet key is created by the org.apache.hadoop.io.WritableComparator.newKey
> method, which looks like this:
>
>  public WritableComparable newKey() {
>    return ReflectionUtils.newInstance(keyClass, null);
>  }
>
> Here keyClass is equal to ItemSet.class. Compare this to the call in the
> TextRecordInputStream constructor.
>
>      key = ReflectionUtils.newInstance(
>          r.getKeyClass().asSubclass(WritableComparable.class), getConf());
>
> This does look related to HADOOP-4466
> (https://issues.apache.org/jira/browse/HADOOP-4466), but that issue is
> listed as fixed in 0.19.0.
>
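
For what it's worth, here is a tiny standalone sketch of the difference
between the two newInstance() calls you quoted (the stub class is made
up; the null-vs-Configuration behavior of ReflectionUtils is real):

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.WritableComparable;
    import org.apache.hadoop.util.ReflectionUtils;

    // Minimal stand-in for your ItemSet; only the no-arg constructor and
    // the WritableComparable contract matter for this demonstration.
    public class ItemSetDemo implements WritableComparable<ItemSetDemo> {
      public void write(DataOutput out) throws IOException {}
      public void readFields(DataInput in) throws IOException {}
      public int compareTo(ItemSetDemo other) { return 0; }

      public static void main(String[] args) {
        // WritableComparator.newKey() style: with a null Configuration the
        // key is constructed, but ReflectionUtils skips its setConf() step.
        ItemSetDemo viaComparator =
            ReflectionUtils.newInstance(ItemSetDemo.class, null);

        // TextRecordInputStream style: with a real Configuration, a key
        // that implements Configurable would also get setConf() called.
        ItemSetDemo viaText =
            ReflectionUtils.newInstance(ItemSetDemo.class, new Configuration());
      }
    }

Either way, both paths need the resolved key class to be the
WritableComparable ItemSet, which brings me back to the import suspicion
above.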



-- 
Harsh J
