Hi Sayali,

Yes, you can submit a collection of files from HDFS as input to the
job: FileInputFormat accepts multiple input paths, and each path may
be a glob pattern like file*. Please take a look at the WordCount
example in the Map/Reduce tutorial:

http://hadoop.apache.org/core/docs/r0.18.0/mapred_tutorial.html#Example%3A+WordCount+v1.0
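
As a rough sketch using the 0.18-era org.apache.hadoop.mapred API
(the HDFS paths below are just made-up examples), the job setup could
look like this:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;

public class MultiInputExample {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(MultiInputExample.class);

        // Option 1: add each input file explicitly.
        FileInputFormat.addInputPath(conf, new Path("/user/sayali/file000"));
        FileInputFormat.addInputPath(conf, new Path("/user/sayali/file001"));

        // Option 2: use a glob pattern instead; Hadoop expands it
        // against HDFS when input splits are computed.
        FileInputFormat.setInputPaths(conf, new Path("/user/sayali/file*"));

        FileOutputFormat.setOutputPath(conf, new Path("/user/sayali/output"));

        // ...then set your mapper/reducer classes and submit with
        // JobClient.runJob(conf), as in the WordCount example.
    }
}
```

Note that setInputPaths replaces any previously added paths, so use
one style or the other, not both.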

Ryan


On Sat, Sep 6, 2008 at 9:03 AM, Sayali Kulkarni
<[EMAIL PROTECTED]> wrote:
> Hello,
> When starting a Hadoop job, I need to specify an input file and an output
> file. Can I instead specify a list of input files?
> For example, I have the input distributed across the files:
> file000,
> file001,
> file002,
> file003,
> ...
> So can I specify the input files as file*? I can add all my files to HDFS.
>
> Thanks in advance!
> --Sayali
>
>