Hi.
I'm experiencing some major difficulties trying to exclude some 6000 URLs
for user pages on my local domain.
The URLs vary, so I generate a list before every dig, which I then include
in the conf I am using. At first I tried putting each URL on a separate
line, so the conf file looked like this:
exclude_urls: cgi-bin
exclude_urls: ~foo
exclude_urls: ~bar
..
exclude_urls: ~jon
Since a dig using this conf only used the last exclude_urls: line, I put
everything on one line so it looked like this:
exclude_urls: cgi-bin ~foo ~bar ... ~jon
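For reference, I build that one-line entry from the generated list with a
small script, roughly like this (the file names here are placeholders, not
my exact setup):

    #!/usr/bin/env python
    # Join the per-line exclude list into a single exclude_urls: line for
    # the htdig conf.  "exclude_list.txt" and "exclude_urls.conf" are
    # placeholder names.
    patterns = [line.strip() for line in open("exclude_list.txt") if line.strip()]
    with open("exclude_urls.conf", "w") as out:
        out.write("exclude_urls: cgi-bin " + " ".join(patterns) + "\n")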
A dig that normally took 4 hours is now still going after 14 hours. A
request to /cgi-bin/htsearch now hogs the system for a long time without
returning anything to the user. I tried removing all the files under
db/, but /cgi-bin/htsearch never even gets far enough to report an error
about the db not being there.
Since everything pointed to a problem with the conf, I swapped it for an
old one that did not have all these excludes, and everything is back to
normal... at least /cgi-bin/htsearch is no longer locking up and hogging
the system.
So I guess my questions are: can I exclude these 6000 user URLs? If so,
how?
Thanks.
--------------------------------
B. Stefansson
Computing Services
University of Iceland
[EMAIL PROTECTED]