I assume that in most NDFS-based configurations the production search
system will not run searches directly out of NDFS. Rather, indexes will
be created offline for a deployment (i.e., merged to create one index per
search node), then copied out of NDFS to the local filesystem of each
production search node and placed in production. This can be done
incrementally: new indexes are deployed without re-deploying old ones.
In this scenario, new indexes are rotated in to replace old indexes, and
the .del file of every index is updated to reflect deduplication. There
is no code yet which implements this.
Is this what you were asking?
Doug
Jay Lorenzo wrote:
I'm pretty new to Nutch, but in reading through the mailing lists and
other papers, I don't think I've seen any discussion of using NDFS to
automate the end-to-end workflow for data that is going to be searched
(fetch->index->merge->search).
The few crawler designs I'm familiar with typically have spiders
(fetchers) and indexers on the same box. Once pages are crawled and
indexed, the indexes are pipelined to merge/query boxes to complete the
workflow.
When I look at the Nutch design and NDFS, I'm assuming the design intent
for a 'pure NDFS' workflow is for the webdb to generate segments on an
NDFS partition, and once the webdb update is complete, the segments are
processed 'on-disk' by the subsequent fetcher/index/merge/query
mechanisms. Is this a correct assumption?
Automating this kind of continuous workflow usually depends on some kind
of control mechanism to ensure that the correct sequence of operations is
performed.
Are there any recommendations on the best way to automate this workflow
when using NDFS? I've prototyped a continuous workflow system using a
traditional pipeline model with per-stage work queues, and I can see how
that could be applied to a clustered filesystem like NDFS, but I'm
curious to hear what design intent or best practice is envisioned for
automating NDFS-based implementations.
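(For concreteness, the per-stage work-queue pipeline I prototyped can be
sketched with standard java.util.concurrent pieces. The stage names
mirror the fetch->index->merge steps above, but the stage bodies here
are placeholders, not the real Nutch tools:)

```java
import java.util.concurrent.*;
import java.util.function.Function;

/** Sketch of a continuous pipeline with a work queue per stage.
 *  Each stage runs on its own thread, pulling items from its input
 *  queue and pushing results to the next stage's queue. */
public class Pipeline {
    /** Marker item that tells each stage to shut down in order. */
    static final String POISON = "__DONE__";

    /** Connects `in` to `out` through `work`, run on its own thread. */
    static Thread stage(BlockingQueue<String> in, BlockingQueue<String> out,
                        Function<String, String> work) {
        Thread t = new Thread(() -> {
            try {
                for (String item = in.take(); !item.equals(POISON);
                     item = in.take()) {
                    out.put(work.apply(item));  // hand result downstream
                }
                out.put(POISON);                // propagate shutdown marker
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        t.start();
        return t;
    }
}
```

Bounded queues between stages give you backpressure for free: a slow
merge stage simply blocks the indexers feeding it.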
Thanks,
Jay
_______________________________________________
Nutch-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nutch-developers