Hi,
1. map.input.file in the new API is a known problem area. It does not appear to be
serialized in 0.20 ( https://issues.apache.org/jira/browse/HADOOP-5973 ). As of now you can
use ((FileSplit) context.getInputSplit()).getPath() ; there was a post on this
some time back.
2. For your own variables in the conf, please ensure you set them first and then
pass the Configuration to the Job. For example, this works for me:
Configuration conf = new Configuration();
conf.set("myprop","i_set_this");
conf.set("mapred.job.queue.name","queue");
Job myjob = new Job(conf);
And then in the mapper's setup():
System.out.println("property is " + context.getConfiguration().get("myprop", "failed"));
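Putting both points together, here is a minimal sketch of a mapper that recovers the input file name from the InputSplit (working around HADOOP-5973) and reads a custom property. The class name, key/value types, and the "myprop" key are just illustrative; the property is only visible in setup() if it was set on the Configuration before new Job(conf) was called:

```java
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class MyMapper extends Mapper<LongWritable, Text, Text, Text> {

    private String inputFile;
    private String myProp;

    @Override
    protected void setup(Context context) {
        // map.input.file is not reliably set in the new (mapreduce) API on 0.20,
        // so pull the path off the FileSplit instead.
        inputFile = ((FileSplit) context.getInputSplit()).getPath().toString();

        // Returns "i_set_this" only if conf.set("myprop", ...) ran
        // before the Job was constructed; otherwise the default "failed".
        myProp = context.getConfiguration().get("myprop", "failed");
    }
}
```

Note this cast assumes the job uses a file-based input format; with other InputFormats the split will not be a FileSplit.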
Hope this helps
Amogh
On 1/4/10 6:52 AM, "Farhan Husain" <[email protected]> wrote:
Hello all,
I am still stuck with the following problem. Can anyone please help me?
Thanks,
Farhan
On Wed, Dec 30, 2009 at 11:21 AM, Farhan Husain
<[email protected]>wrote:
> Hello,
>
> I am using hadoop-0.20.1. I need to know the input file name in my map
> processes and pass an integer and a string to my reduce processes. I used
> the following method calls for that:
>
> config.set("tag1", "string_value");
> config.setInt("tag2", int_value);
>
> In the setup method of my mapper:
> String filename = context.getConfiguration().get("map.input.file"); // returns null
>
> In the setup method of my reducer:
> String val = context.getConfiguration().get("tag1"); // returns null
> int n = context.getConfiguration().getInt("tag2", def_val); // returns def_val
>
> Can anyone please tell me what may be wrong with this code or anything
> related to it?
>
> Thanks,
> Farhan
>