I think you can use job.getInt("mapred.task.partition", -1) to get the
mapper ID, which should be the same for a given map task across reruns of
its attempts.
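
A minimal sketch of that approach, using the old org.apache.hadoop.mapred
API (the class name and the idea of seeding a Random from the partition are
illustrative; only "mapred.task.partition" comes from the tip above):

import java.io.IOException;
import java.util.Random;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class SeededMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {

  private Random rng;

  @Override
  public void configure(JobConf job) {
    // The partition number identifies this map task and is preserved
    // when a failed attempt is rerun on another node.
    int partition = job.getInt("mapred.task.partition", -1);
    rng = new Random(partition);  // same task => same random sequence
  }

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // ... use rng here ...
  }
}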

-----Original Message-----
From: Piotr Praczyk [mailto:piotr.prac...@gmail.com] 
Sent: 18 June 2009 15:19
To: core-user@hadoop.apache.org
Subject: Re: Getting Task ID inside a Mapper

Hi
Why don't you provide a seed for the random generator that is generated
outside the task? Then if the task fails, you can supply the same stored
value again. You could use the task configuration to do so.
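
A minimal sketch of that idea, assuming the old JobConf API and a made-up
property name ("lda.random.seed"):

import java.util.Random;

import org.apache.hadoop.mapred.JobConf;

public class SeedExample {

  // Driver side: choose the seed once, outside any task,
  // and store it in the job configuration.
  public static void setSeed(JobConf conf, long seed) {
    conf.setLong("lda.random.seed", seed);
  }

  // Task side (e.g. in Mapper.configure): every attempt of every task
  // reads back the same stored value, so a rerun sees the same seed.
  public static Random seededRng(JobConf job) {
    long seed = job.getLong("lda.random.seed", 0L);
    // Mix in the partition so each mapper gets a distinct but
    // rerun-stable sequence.
    int partition = job.getInt("mapred.task.partition", 0);
    return new Random(seed + partition);
  }
}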

I don't know anything about obtaining the task ID from within.


regards
Piotr

2009/6/18 Mark Desnoyer <mdesno...@gmail.com>

> Hi,
>
> I was wondering if it's possible to get hold of the task ID inside a
> mapper? I can't seem to find a way by trawling through the API reference.
> I'm trying to implement a MapReduce version of Latent Dirichlet Allocation
> and I need to be able to initialize a random number generator in a
> task-specific way, so that if the task fails and is rerun elsewhere, the
> results are the same. Thanks in advance.
>
> Cheers,
> Mark Desnoyer
>


