As I understand it, the JobConf getters/setters are best suited to data that is static for the entire job.
What's the recommended way to pass a variable to the mappers/reducers that might be different for each InputSplit? For example, say I'm using Hadoop's grep example to extract information from a collection of logfiles, but each logfile has some header info (which may or may not be part of the current InputSplit) that every mapper/reducer needs access to. How should I approach this? Is overriding FileInputFormat so that the header data is tacked onto each InputSplit the best way to do this?

Thanks,
Norbert

On Dec 25, 2007 12:34 PM, Ted Dziuba <[EMAIL PROTECTED]> wrote:
> You can get and set variables in the JobConf. The map task's
> configure() method takes a JobConf as a parameter, and you can keep the
> reference as an instance variable.
>
> Ted
>
> helena21 wrote:
> > Hi everybody,
> >
> > please explain the steps to pass user parameters to the mapper class.
> > thanks.
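For the job-wide case Ted describes, the pattern is: set a key on the JobConf at submit time, then read it back in the mapper's configure() method. Below is a minimal sketch against the old org.apache.hadoop.mapred API (the one current in this thread); the property key "my.grep.pattern" and the class name GrepMapper are made-up names for illustration, and the code assumes the Hadoop core jar is on the classpath.

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class GrepMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, LongWritable> {

  private String pattern;

  // configure() is called once per task attempt with the job's JobConf,
  // so any key set on the JobConf at submit time is visible here.
  @Override
  public void configure(JobConf job) {
    // "my.grep.pattern" is a hypothetical key; the second argument
    // is the default used when the key was never set.
    pattern = job.get("my.grep.pattern", ".*");
  }

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, LongWritable> output,
                  Reporter reporter) throws IOException {
    // Emit each line that matches the configured pattern.
    if (value.toString().matches(pattern)) {
      output.collect(value, new LongWritable(1));
    }
  }
}

// At submit time, the driver would set the key before running the job:
//   JobConf conf = new JobConf(GrepMapper.class);
//   conf.set("my.grep.pattern", "ERROR.*");
```

Note this only covers data that is the same for every task. Per-split data, as in the header question above, doesn't fit this mechanism, since the JobConf is fixed before splits are computed; that's why carrying the header inside a custom InputSplit (or re-reading it from the start of the file in the RecordReader) comes up as an alternative.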
