Hello,

What am I doing wrong?

I have set up two machines to run a MapReduce Nutch crawler, but the jobs always run on the same host ("localhost") when I look at http://server:7845

seeds/urls is a list of 500,000 URLs

# put seed directory in ndfs
bin/nutch ndfs -put seeds seeds

# crawl a bit
bin/nutch crawl seeds -depth 10

I have tried mapred.map.tasks set to 2, 4, and 8,
and mapred.reduce.tasks set to 1, 2, and 3.
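
For reference, the relevant part of my config looks roughly like this (I set these in nutch-site.xml; depending on the Nutch version the file may be named differently, so treat this as a sketch of what I have, not the exact file):

  <property>
    <name>mapred.map.tasks</name>
    <value>4</value>
  </property>
  <property>
    <name>mapred.reduce.tasks</name>
    <value>2</value>
  </property>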

But I always get the same result:

Name    Host    # running tasks Secs since heartbeat
tracker_29414   srv34   0       1
tracker_36968   srv21   1       2

srv21 is the master.

Thanks,

Paul





