You could use the -cacheFile or -file option for this. Check the streaming documentation for examples.
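A minimal sketch of what that invocation might look like, assuming 0.16-era streaming syntax; the paths, the output directory, and the namenode address are hypothetical:

```shell
# -file ships a local file into each task's working directory,
# so the script can open its configs by bare filename.
hadoop jar hadoop-streaming.jar \
    -input /user/me/datafile.txt \
    -output /user/me/out \
    -mapper "myscript.pl myfile1.txt myfile2.txt" \
    -file myscript.pl \
    -file folder/myfile1.txt \
    -file folder/myfile2.txt

# Alternatively, -cacheFile symlinks a file that already lives on the
# DFS into the task's working directory under the name after the '#':
#   -cacheFile hdfs://namenode:9000/user/me/folder/myfile1.txt#myfile1.txt
```

Either way the perl script just looks in its current working directory, since streaming runs each task in a directory containing the shipped files.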
On Mar 6, 2008, at 2:32 PM, "Theodore Van Rooy" <[EMAIL PROTECTED]> wrote:

> I would like to convert a perl script that currently uses argument variables to run with Hadoop Streaming.
>
> Normally I would use the script like
>
> 'cat datafile.txt | myscript.pl folder/myfile1.txt folder/myfile2.txt'
>
> where the two argument variables are actually the names of configuration files for myscript.pl.
>
> The question I have is: how do I get the perl script to either look in the local directory for the config files, or how would I go about getting it to look on the DFS for them? Once the configurations are passed in, there is no problem using STDIN to process the datafile passed in by hadoop.