Hi, thanks for your help. I tried the script mentioned above (the one from Raghu), but whenever I execute it, the following message is displayed: *datanode running as process <process_id>. Stop it first*. I start the single-node cluster with bin/start-dfs.sh first, and then execute the script to start a second datanode.
I also tried supplying a separate, changed configuration from its own directory by running *bin/hadoop-daemons.sh --config <config-directory-path> start datanode*, but it still gives the same message as above. Earlier in this thread Ramya also mentioned DataNodeCluster.java. That would help, but I am not sure how to execute this class. Can you please help with this?

thanks,
-Ajit.

On Thu, Feb 26, 2009 at 6:43 PM, Raghu Angadi <[email protected]> wrote:

> You can run with a small shell script. You need to override a couple of
> environment and config variables.
>
> Something like:
>
> run_datanode () {
>   DN=$2
>   export HADOOP_LOG_DIR=logs$DN
>   export HADOOP_PID_DIR=$HADOOP_LOG_DIR
>   bin/hadoop-daemon.sh $1 datanode \
>     -Dhadoop.tmp.dir=/some/dir/dfs$DN \
>     -Ddfs.datanode.address=0.0.0.0:5001$DN \
>     -Ddfs.datanode.http.address=0.0.0.0:5008$DN \
>     -Ddfs.datanode.ipc.address=0.0.0.0:5002$DN
> }
>
> You can start the second datanode like: run_datanode start 2
>
> Pretty useful for testing.
>
> Raghu.
>
> Ajit Ratnaparkhi wrote:
>
>> Raghu,
>>
>> Can you please tell me how to run multiple datanodes on one machine?
>>
>> thanks,
>> -Ajit.
>>
>> On Thu, Feb 26, 2009 at 9:23 AM, Pradeep Fernando <[email protected]> wrote:
>>
>>> Raghu,
>>>
>>>> I guess you are asking if it would be more convenient if one had access
>>>> to a larger cluster for development.
>>>
>>> Exactly.
>>>
>>>> I have access to many machines and clusters, but about 99% of my
>>>> development happens using a single machine for testing. I would guess
>>>> that is true for most of the Hadoop developers.
>>>
>>> Well, this is the answer I was looking for. :D
>>> It seems I have enough resources to contribute to this project.
>>> Thanks a lot, Raghu.
>>>
>>> regards,
>>> Pradeep Fernando.
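[Editor's note] The *Stop it first* message comes from the daemon launcher's pid-file check: hadoop-daemon.sh records the daemon's pid in a file under HADOOP_PID_DIR and refuses to start if that file names a live process. A second datanode started with only --config still uses the same pid directory, which is why Raghu's script overrides HADOOP_PID_DIR per instance. A minimal sketch of that check (paths and file names here are illustrative assumptions, not the real Hadoop defaults):

```shell
#!/bin/sh
# Sketch of the pid-file guard that prints "Stop it first".
# PID_DIR and the pid file name are assumptions for illustration.
PID_DIR=$(mktemp -d)
pidfile="$PID_DIR/hadoop-datanode.pid"

start_datanode() {
  # Refuse to start if the pid file points at a live process.
  if [ -f "$pidfile" ] && kill -0 "$(cat "$pidfile")" 2>/dev/null; then
    echo "datanode running as process $(cat "$pidfile"). Stop it first."
    return 1
  fi
  echo $$ > "$pidfile"   # record a pid (stand-in for the real daemon)
  echo "datanode started"
}

first=$(start_datanode)           # no pid file yet: starts
second=$(start_datanode) || true  # same pid dir: refused
echo "$first"
echo "$second"
```

Giving each datanode instance its own HADOOP_PID_DIR (as in the script above) makes each instance check a different pid file, so the guard no longer fires.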
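[Editor's note] On DataNodeCluster.java: it lives in the HDFS test sources (package org.apache.hadoop.hdfs in the test tree), so it is normally launched through the hadoop script with the test jar on the classpath. The exact flags vary by version, so treat the class name and the -n (number of datanodes) option below as assumptions and check the usage message the class prints when run without arguments. A tiny helper that only assembles the command line:

```shell
#!/bin/sh
# Hypothetical helper: assemble a DataNodeCluster launch command.
# The class name and -n flag are assumptions taken from the Hadoop
# test sources; verify them against your version before running.
dnc_cmd() {
  num_nodes=$1
  echo "bin/hadoop org.apache.hadoop.hdfs.DataNodeCluster -n $num_nodes"
}

cmd=$(dnc_cmd 2)
echo "$cmd"
```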
