At the moment I start index manually, but I have to set up a job so that indexing runs every week or so.
But first I have to find a good and easy way to add new URLs to the database/URL list to process...
So "someday" this will work nearly automatically, with less interaction...
Thanks for your help.
Markus Rietzler
* kommunikation & online service
* RZF NRW
* Tel: 0211.4572-130
-----Original Message-----
From: Kir Kolyshkin [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, January 15, 2002 13:25
To: [EMAIL PROTECTED]
Subject: Re: AW: [aseek-users] several questions to index
Alexander F Avdonkin wrote:
>
> [EMAIL PROTECTED] wrote:
>
> >
> >
> > When I call index -D, deltas are merged with my database. Now the
> > exciting question:
> >
> > I had to stop indexing, so not all pages were re-indexed; only about
> > 20% had been processed so far.
> > When I now run index -D, what happens to the other, not yet indexed,
> > 80%?
> > When will old pages be deleted...?
>
> During "index -D" nothing happens to pages which are not yet indexed; they
> will possibly be processed at the next indexing session.
So that means you can _safely_ stop index, run index -D, and start index again
at any time. Actually, we use cron jobs that do that weekly (or bi-weekly).
index -E is usable here as well.
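A rough sketch of such a weekly cron script, assuming index is installed as
/usr/local/aspseek/sbin/index (the path is just an example, adjust it to your
setup):

    #!/bin/sh
    # Weekly indexing cycle, as described above.
    INDEX=/usr/local/aspseek/sbin/index

    # Crawl/refresh pages. It is safe to interrupt this step; pages that
    # were not reached will simply be picked up on the next run.
    $INDEX

    # Merge the deltas from this crawl into the main database.
    $INDEX -D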
--
[EMAIL PROTECTED] ICQ 7551596 Phone +7 903 6722750
Hard work may not kill you, but why take chances?
--
