On the http://wiki.apache.org/nutch/NutchHadoopTutorial page, in the
'Distributed Searching' section, I read:

"On each of the search servers you would use the startup the distributed search 
server by using the nutch server command like this:
bin/nutch server 1234 /d01/local/crawled"
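
For context, I also set up the search frontend as the tutorial describes, so
(if I understand it correctly) conf/search-servers.txt on the web server
should list every search node, something like this (the hostnames below are
just placeholders, not from the tutorial):

    devcluster01 1234
    devcluster02 1234
    devcluster03 1234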

The problem is that /d01/local/crawled has been created only on the first
server. How can I create it on all of the servers?
If I run "bin/hadoop dfs -copyToLocal crawled /d01/local/" on every server,
each search returns N identical results (where N is the number of servers in
the cluster).
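
To be concrete, this is what I currently run on every one of the N search
nodes (the path is the one used in the tutorial):

    # same two commands repeated on each search node
    bin/hadoop dfs -copyToLocal crawled /d01/local/
    bin/nutch server 1234 /d01/local/crawled

so every node ends up serving an identical copy of the whole index, which I
guess is why every hit shows up N times.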

