Assuming you are using ToolRunner, your -conf param should come before your 
application-specific params. 
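The reason the ordering matters: ToolRunner hands the command line to GenericOptionsParser, which consumes the generic options (-conf, -D, -fs, -jt, ...) from the front of the argument list and applies them to the job's Configuration; whatever is left over is what your Tool's run() method sees. Here is a minimal, hypothetical sketch of that consume-from-the-front behavior (not Hadoop's actual parser, which is built on commons-cli):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of how generic options are consumed from the FRONT of the
// argument list. Parsing stops at the first token that is not a
// recognized generic option; everything after it is left for the app.
public class GenericOptsSketch {
    public static String[] remainingArgs(String[] args) {
        List<String> rest = new ArrayList<>();
        int i = 0;
        // consume leading -conf <file> pairs only
        while (i < args.length && args[i].equals("-conf")) {
            i += 2; // skip the flag and its value
        }
        for (; i < args.length; i++) {
            rest.add(args[i]);
        }
        return rest.toArray(new String[0]);
    }

    public static void main(String[] argv) {
        // -conf first: it is consumed, the app sees only its two paths
        String[] ok = remainingArgs(new String[]{
            "-conf", "conf/hadoop-cluster.xml", "in", "out"});
        // -conf last: parsing stops at "in", so -conf is never applied
        // and all four tokens are passed through to the app
        String[] bad = remainingArgs(new String[]{
            "in", "out", "-conf", "conf/hadoop-cluster.xml"});
        System.out.println(ok.length + " " + bad.length); // prints "2 4"
    }
}
```

So with the command in the original mail, something like the following ordering (the `jar` subcommand and a driver class are also needed; `MyDriver` here is a placeholder for whatever main class the jar uses) should let ToolRunner pick up the -conf file: `bin/hadoop jar my-lib.jar MyDriver -conf conf/hadoop-cluster.xml /user/hadoop/input-dir /user/hadoop/output-dir`.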

Sent from my iPad

On Aug 15, 2010, at 12:55 PM, Jake Vang <[email protected]> wrote:

> I seem to be having problems submitting jobs from Hadoop using Cygwin on 
> Windows (Windows 7) to a Hadoop multi-node cluster (Ubuntu).
> 
> In Windows/Cygwin, I have created a user called hadoop. This same user, 
> hadoop, is also created on the master/slave nodes. Furthermore, I have set 
> dfs.permissions to false.
> 
> I have an XML file called hadoop-cluster.xml with the following name-value 
> pairs:
> * fs.default.name=hdfs://hadoop-0:54310
> * mapred.job.tracker=hadoop-0:54311
> 
> When I run Hadoop (using Cygwin), I try a command like this:
> bin/hadoop my-lib.jar /user/hadoop/input-dir /user/hadoop/output-dir -conf 
> conf/hadoop-cluster.xml
> 
> But it always fails: the command never passes the input and output 
> arguments to the program, and it exits. Why is this? When I upload this jar 
> file to a data node (more precisely, the master node / name node) and run 
> the command there, it works (of course, then I don't need to pass the -conf 
> param).
> 
> Any help is appreciated.
