It looks like adding this line causes the invocation exception. I looked in
HDFS and I can see that file at that path:

DistributedCache.addFileToClassPath(new Path("/jars/common.jar"), conf);

I have similar code for another jar,
DistributedCache.addFileToClassPath(new Path("/jars/analytics.jar"), conf);
and that one works just fine.


On Tue, Feb 28, 2012 at 11:44 AM, Mohit Anchlia <mohitanch...@gmail.com> wrote:

> I commented out both the reducer and the combiner and I still see the same
> exception. Could it be because I have 2 jars being added?
>
>  On Mon, Feb 27, 2012 at 8:23 PM, Subir S <subir.sasiku...@gmail.com> wrote:
>
>> On Tue, Feb 28, 2012 at 4:30 AM, Mohit Anchlia <mohitanch...@gmail.com> wrote:
>>
>> > For some reason I am getting an invocation exception and I don't see
>> > any more details other than this exception:
>> >
>> > My job is configured as:
>> >
>> > JobConf conf = new JobConf(FormMLProcessor.class);
>> > conf.addResource("hdfs-site.xml");
>> > conf.addResource("core-site.xml");
>> > conf.addResource("mapred-site.xml");
>> > conf.set("mapred.reduce.tasks", "0");
>> > conf.setJobName("mlprocessor");
>> > DistributedCache.addFileToClassPath(new Path("/jars/analytics.jar"), conf);
>> > DistributedCache.addFileToClassPath(new Path("/jars/common.jar"), conf);
>> > conf.setOutputKeyClass(Text.class);
>> > conf.setOutputValueClass(Text.class);
>> > conf.setMapperClass(Map.class);
>> > conf.setCombinerClass(Reduce.class);
>> > conf.setReducerClass(IdentityReducer.class);
>> >
>>
>> Why would you set the Reducer when the number of reducers is set to zero?
>> Not sure if this is the real cause.
>>
>>
>> >
>> > conf.setInputFormat(SequenceFileAsTextInputFormat.class);
>> > conf.setOutputFormat(TextOutputFormat.class);
>> > FileInputFormat.setInputPaths(conf, new Path(args[0]));
>> > FileOutputFormat.setOutputPath(conf, new Path(args[1]));
>> > JobClient.runJob(conf);
>> >
>> > ---------
>> > java.lang.RuntimeException: Error in configuring object
>> > at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>> > at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>> > at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>> > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:387)
>> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>> > at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> > at java.security.AccessController.doPrivileged(Native Method)
>> > at javax.security.auth.Subject.doAs(Subject.java:396)
>> > at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1157)
>> > at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> > Caused by: java.lang.reflect.InvocationTargetException
>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.jav
>> >
>>
>
>
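
One more debugging idea, since the trace stops at the InvocationTargetException:
as far as I can tell, "Error in configuring object" is the framework calling the
mapper's configure(JobConf) via reflection (that is what ReflectionUtils.setJobConf
is doing in the trace), so the interesting part is the nested cause. A rough
sketch of how the Map class could surface it (hypothetical class body; the real
Map/FormMLProcessor code isn't shown in this thread):

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Hypothetical stand-in for the Map class passed to conf.setMapperClass(Map.class).
public class Map extends MapReduceBase implements Mapper<Text, Text, Text, Text> {

    @Override
    public void configure(JobConf job) {
        try {
            // ... original setup that touches classes from analytics.jar / common.jar ...
        } catch (Throwable t) {
            // Print the real cause into the task's stderr before the framework
            // rewraps it as "Error in configuring object".
            t.printStackTrace();
            throw new RuntimeException(t);
        }
    }

    @Override
    public void map(Text key, Text value, OutputCollector<Text, Text> output,
                    Reporter reporter) throws IOException {
        // ... original map logic ...
    }
}

With something like that in place, the failed task attempt's logs
(stdout/stderr/syslog in the JobTracker web UI) should show the real cause,
e.g. a NoClassDefFoundError if a class from one of the cached jars isn't on
the task classpath.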
