Awesome! I was going to recommend checking out the code last night so that
you could put some logging statements in there. You've probably noticed
this already, but MapWritable does not have static type parameters, so it
writes out the fully qualified class name and uses it to instantiate the
value again via readFields() when deserializing.
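
The round trip looks roughly like this (a minimal, hand-written sketch of
the pattern, not the actual Hadoop AbstractMapWritable code). One likely
reason your trace reads "can't find class: X because X" is that
ClassNotFoundException's message is usually just the class name:

import java.io.*;

// Illustrative only: the writer emits the fully qualified class name, and
// the reader resolves it back via reflection on its own classloader.
public class ClassNameRoundTrip {

  static void write(DataOutput out, Class<?> clazz) throws IOException {
    out.writeUTF(clazz.getName());   // serialize the FQCN, not the type itself
  }

  static Class<?> read(DataInput in) throws IOException {
    String className = in.readUTF();
    try {
      return Class.forName(className);   // reflection happens here
    } catch (ClassNotFoundException e) {
      // ClassNotFoundException's message is usually just the class name,
      // hence the "can't find class: X because X" wording
      throw new IOException(
          "can't find class: " + className + " because " + e.getMessage());
    }
  }

  public static void main(String[] args) throws IOException {
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    write(new DataOutputStream(buf), String.class);
    DataInput in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
    System.out.println("resolved: " + read(in).getName());
  }
}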

That error is happening when the reflection occurs, though it doesn't make
much sense: the Accumulo mapreduce packages are obviously on the classpath.
If you are still having this issue, I'll keep looking into it as well.
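
If it helps narrow things down, here is a quick diagnostic you could drop
into your driver (or a mapper's setup()) to log whether the class resolves
at all. The class name comes from your stack trace; the class and method
names below are just hypothetical plumbing:

// Hypothetical check: is InputTableConfig visible to the classloaders the
// job might use?
public class ClassVisibilityCheck {
  private static final String TARGET =
      "org.apache.accumulo.core.client.mapreduce.InputTableConfig";

  public static void main(String[] args) {
    check("app classloader", ClassVisibilityCheck.class.getClassLoader());
    check("thread context classloader",
        Thread.currentThread().getContextClassLoader());
  }

  private static void check(String label, ClassLoader loader) {
    try {
      Class.forName(TARGET, false, loader);
      System.out.println(label + ": found " + TARGET);
    } catch (ClassNotFoundException e) {
      System.out.println(label + ": NOT found (" + e.getMessage() + ")");
    }
  }
}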


On Aug 23, 2014 2:37 PM, "JavaHokie" <soozandjohny...@gmail.com> wrote:

> I checked out 1.6.0 from git and updated the exception handling for the
> getInputTableConfigs method, rebuilt, and tested my M/R jobs that use
> Accumulo as a source or sink just to ensure everything is still working
> correctly.
>
> I then updated the InputConfigurator.getInputTableConfig exception handling
> and I see the root cause is as follows:
>
> java.io.IOException: can't find class: org.apache.accumulo.core.client.mapreduce.InputTableConfig because org.apache.accumulo.core.client.mapreduce.InputTableConfig
>         at org.apache.hadoop.io.AbstractMapWritable.readFields(AbstractMapWritable.java:212)
>         at org.apache.hadoop.io.MapWritable.readFields(MapWritable.java:169)
>         at org.apache.accumulo.core.client.mapreduce.lib.impl.InputConfigurator.getInputTableConfigs(InputConfigurator.java:563)
>         at org.apache.accumulo.core.client.mapreduce.lib.impl.InputConfigurator.validateOptions(InputConfigurator.java:644)
>         at org.apache.accumulo.core.client.mapreduce.AbstractInputFormat.validateOptions(AbstractInputFormat.java:342)
>         at org.apache.accumulo.core.client.mapreduce.AbstractInputFormat.getSplits(AbstractInputFormat.java:537)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:491)
>         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:508)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
>
> The IOException "can't find class <classname> because <classname>" is a new
> one for me, but at least I have something specific to research.
>
> --John
