Hey Jim

What do you mean by 'local'?

This wiki page defines two 'local' modes.
http://wiki.apache.org/lucene-hadoop/QuickStart

One with a local cluster, and one with all jobs running standalone.

The conf dir that ships with Hadoop is set up for standalone. In standalone mode you don't start any daemons at all (everything runs in a single local JVM), which is why the start scripts stack dump when they see 'local' where a host:port pair is expected.
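For reference, the hadoop-site.xml overrides for each mode look roughly like this (the localhost ports are just the examples the wiki uses, nothing special about them):

standalone (the shipped default):

  <property>
    <name>fs.default.name</name>
    <value>local</value>  <!-- local filesystem, no daemons -->
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>local</value>  <!-- jobs run in-process -->
  </property>

local cluster:

  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>  <!-- single-node HDFS -->
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>  <!-- single-node job tracker -->
  </property>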

I find it useful to keep a copy of 'conf' named 'conf-localhost' that has the local cluster version of 'hadoop-site.xml'. Then use

> ./bin/hadoop --config ./conf-localhost ....

to start her up (after following the other wiki steps).
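(fwiw, the daemon scripts take the same --config flag, so if I remember right the full startup for the local cluster is something like:

> ./bin/hadoop --config ./conf-localhost namenode -format
> ./bin/start-all.sh --config ./conf-localhost

with the format step only needed the first time.)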

cheers
ckw

On Nov 1, 2007, at 12:35 PM, Jim the Standing Bear wrote:

Hello,

I am in need of some clarification on how to run a Hadoop job locally.

The cluster was originally set up to have two nodes, where one of them
also acts as the master node and job tracker.

According to the wiki, I can run a job locally by setting the
"mapred.job.tracker" and "fs.default.name" properties to "local" in
hadoop-site.xml.  But when I started the server, it dumped a stack trace:

localhost: starting secondarynamenode, logging to /home/blahblahblah
localhost: Exception in thread "main" java.lang.RuntimeException: Not
a host:port pair: local

Apparently it didn't like the value "local"?

Also, the wiki noted that all these XML configuration files should be
included somewhere in the class path of the job. Does that mean I need
to include the XMLs the same way I do jars?

Thanks

-- Jim

Chris K Wensel
[EMAIL PROTECTED]
http://chris.wensel.net/
