Sorry I missed this email.
Harsh's answer is apt. Please check the error logs for the failed tasks
(mapper/reducer) in the JobTracker web UI to find the exact reason.
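For what it's worth, a map-only job doesn't need a reducer or combiner at all. A minimal sketch of that configuration, reusing the class names from the job below (FormMLProcessor, Map, etc. are assumed to exist in your code):

```java
// Sketch of a map-only job configuration (old mapred API).
// Class names are taken from the thread and assumed; adjust to your project.
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.*;

public class FormMLProcessor {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(FormMLProcessor.class);
        conf.setJobName("mlprocessor");
        conf.setNumReduceTasks(0);      // map-only: no reducer or combiner set
        conf.setMapperClass(Map.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);
        conf.setInputFormat(SequenceFileAsTextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        // With zero reduce tasks, map output goes straight to the OutputFormat.
        JobClient.runJob(conf);
    }
}
```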

On Tue, Feb 28, 2012 at 10:23 AM, Mohit Anchlia <mohitanch...@gmail.com>wrote:

> Does it matter if a reducer is set even though the number of reducers is 0?
> Is there a way to get a more detailed reason?
>
> On Mon, Feb 27, 2012 at 8:23 PM, Subir S <subir.sasiku...@gmail.com>
> wrote:
>
> > On Tue, Feb 28, 2012 at 4:30 AM, Mohit Anchlia <mohitanch...@gmail.com
> > >wrote:
> >
> > > For some reason I am getting invocation exception and I don't see any
> > more
> > > details other than this exception:
> > >
> > > My job is configured as:
> > >
> > >
> > > JobConf conf = new JobConf(FormMLProcessor.class);
> > > conf.addResource("hdfs-site.xml");
> > > conf.addResource("core-site.xml");
> > > conf.addResource("mapred-site.xml");
> > > conf.set("mapred.reduce.tasks", "0");
> > > conf.setJobName("mlprocessor");
> > > DistributedCache.addFileToClassPath(new Path("/jars/analytics.jar"), conf);
> > > DistributedCache.addFileToClassPath(new Path("/jars/common.jar"), conf);
> > > conf.setOutputKeyClass(Text.class);
> > > conf.setOutputValueClass(Text.class);
> > > conf.setMapperClass(Map.class);
> > > conf.setCombinerClass(Reduce.class);
> > > conf.setReducerClass(IdentityReducer.class);
> > >
> >
> > Why would you set the Reducer when the number of reducers is set to zero?
> > Not sure if this is the real cause, though.
> >
> >
> > >
> > > conf.setInputFormat(SequenceFileAsTextInputFormat.class);
> > > conf.setOutputFormat(TextOutputFormat.class);
> > > FileInputFormat.setInputPaths(conf, new Path(args[0]));
> > > FileOutputFormat.setOutputPath(conf, new Path(args[1]));
> > > JobClient.runJob(conf);
> > >
> > > ---------
> > >
> > > java.lang.RuntimeException: Error in configuring object
> > > at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
> > > at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
> > > at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> > > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:387)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> > > at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > > at javax.security.auth.Subject.doAs(Subject.java:396)
> > > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
> > > at org.apache.hadoop.mapred.Child.main(Child.java:264)
> > > Caused by: java.lang.reflect.InvocationTargetException
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.jav
