I am a little confused. The Nutch wiki has chapters on clustering, but I
have never tried them. So what is clustering about? Is it running the
crawler on multiple nodes and building the crawldb on multiple nodes,
and then finally merging all of these onto a local system and running
the Nutch web GUI from that?
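To make my question concrete, here is roughly what I imagine the last
step looks like, sketched with the Hadoop FileSystem API. The paths and
the searcher.dir property are just my guesses, not something I have
actually run:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.nutch.util.NutchConfiguration;

public class CopyIndexLocal {
  public static void main(String[] args) throws Exception {
    Configuration conf = NutchConfiguration.create();

    // The crawl steps (inject/generate/fetch/updatedb/index) run as
    // MapReduce jobs across the cluster, writing the crawldb, segments
    // and index into HDFS.
    FileSystem dfs = FileSystem.get(conf);

    // After indexing, pull the finished index down to the local disk of
    // the machine that will serve searches. "crawl/index" and
    // "/data/search/crawl/index" are placeholder paths.
    dfs.copyToLocalFile(new Path("crawl/index"),
                        new Path("/data/search/crawl/index"));

    // The search web-app would then have searcher.dir (nutch-site.xml)
    // pointing at the local copy, so queries never touch HDFS.
  }
}

Is that more or less the idea, or am I misreading what the wiki means by
clustering?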


On Dec 16, 2007 10:17 AM, Dennis Kubes <[EMAIL PROTECTED]> wrote:
> Technically you can, but the speed for most search applications would be
> unacceptable.  Searching of indexes is best done on local file systems
> for speed.
>
> Dennis Kubes
>
>
> hzhong wrote:
> > Hello,
> >
> > Why can't we search on the Hadoop DFS?
> >
> > Thanks
>
