Hmm, yeah, I tried that, but this is what happens (maybe this is the expected behavior):

I type:

./htdig -vv -m ../conf/start.url

 

and then this shows up:

New server: one.domain.com, 80

New server: two.domain.com, 80

... and the list goes on for every domain I have already crawled, even though I only have 6 URLs in the start.url file. Is that the expected result? Once it lists all those servers, it does seem to start indexing them (I tailed the log file on the server the files are on, and it's definitely hitting a lot of them, presumably all of them).
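
In case it matters, start.url is just a plain text list, one URL per line. The hostnames below are placeholders rather than my real entries, but the file looks roughly like this:

http://one.domain.com/
http://two.domain.com/
http://two.domain.com/docs/index.html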

 

Rog.

>
>Well, you certainly don't want to use -i because that removes the existing
>database and starts over from scratch. Gabriele, your suggestion had
>been the standard advice we gave for this in the past. However, if you're
>running 3.1.6 or a recent 3.2.0b4 snapshot, you can use the -m option to
>htdig to do "minimal" digging. You need to provide a file name argument
>after the -m, and that file must contain a list of URLs to be indexed.
>htdig will index only those URLs, using a hop count of 0 so it doesn't
>follow links. That would be the quickest way to update the index.
>
>In 3.1.6, you must run htmerge after running htdig, which will still
>take a while for a sizable index, but overall it's still faster than
>doing a full update run.
 

