If you're using the old API, you might want to use mapred.input.format.class instead.
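
For example, something like this in the action's <configuration> might work
(the class name below is just a placeholder for your own input format):

<property>
    <name>mapred.input.format.class</name>
    <value>com.example.MyInputFormat</value>
</property>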

Thanks
Ryota

On 3/29/13 9:19 AM, "Chris Sigman" <[email protected]> wrote:

>Thanks, Ryota, for the reply.  I'm not using the new API, but I tried it
>anyway just in case, and had no luck.  mapreduce.inputformat.class is the
>correct property to set the input format class for the old API, right?
>
>--
>Chris
>
>
>On Thu, Mar 28, 2013 at 4:48 PM, Ryota Egashira <[email protected]>
>wrote:
>> Hi Chris
>>
>> I've seen a case before where an empty input path was passed, causing
>> "No input paths specified in job".
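>>
>> If that's the issue, a minimal sketch of setting the input path in the
>> action's <configuration> could look like this (the path is just an example):
>>
>> <property>
>>     <name>mapred.input.dir</name>
>>     <value>/user/chris/input</value>
>> </property>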
>>
>> Also, if you're trying to use the new API (mapreduce.*) instead of the old
>> API (mapred.*), adding the following properties to the workflow might be needed.
>> =====
>> <property>
>>     <name>mapred.mapper.new-api</name>
>>     <value>true</value>
>> </property>
>>
>> <property>
>>     <name>mapred.reducer.new-api</name>
>>     <value>true</value>
>> </property>
>>
>> =====
>>
>> Thanks
>> Ryota
>>
>> On 3/28/13 8:41 AM, "Chris Sigman" <[email protected]> wrote:
>>
>>>Hi all,
>>>
>>>I've set mapreduce.inputformat.class in the configuration section of
>>>my map-reduce action, but I don't think it's getting picked up, because
>>>I get the following exception:
>>>
>>>java.io.IOException: No input paths specified in job
>>>at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:153)
>>>at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:205)
>>>at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1026)
>>>at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1018)
>>>at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
>>>at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:929)
>>>at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:882)
>>>at java.security.AccessController.doPrivileged(Native Method)
>>>at javax.security.auth.Subject.doAs(Subject.java:415)
>>>at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
>>>at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:882)
>>>at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:856)
>>>at org.apache.oozie.action.hadoop.MapReduceMain.submitJob(MapReduceMain.java:88)
>>>at org.apache.oozie.action.hadoop.MapReduceMain.run(MapReduceMain.java:54)
>>>at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:26)
>>>at org.apache.oozie.action.hadoop.MapReduceMain.main(MapReduceMain.java:37)
>>>at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>at java.lang.reflect.Method.invoke(Method.java:601)
>>>at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:391)
>>>at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>>>at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:391)
>>>at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>>>at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
>>>at java.security.AccessController.doPrivileged(Native Method)
>>>at javax.security.auth.Subject.doAs(Subject.java:415)
>>>at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
>>>at org.apache.hadoop.mapred.Child.main(Child.java:260)
>>>
>>>I've also tried setting mapreduce.job.inputformat.class per a comment
>>>I saw somewhere, but it doesn't seem to make a difference.  I'm using
>>>CDH3.5, so Hadoop 0.20.2 and Oozie 2.3.2.  Am I just doing something
>>>wrong?
>>>
>>>Thanks for any help,
>>>
>>>--
>>>Chris
>>
