On Sun, 16 Apr 2000, Geoff Hutchison wrote:
> At 11:14 AM -0700 4/16/00, Dave Lers wrote:
> >  > your best bet is to have each config file specify a separate
> >  > database and copy over the second database to ensure it's "fresh."

> cp database-one/* database-two/
> htdig -c my_second_configfile

So the second dig is always adding one hop to the local database-one copy;
that works (I assume the local hops to local files/dirs that were already
indexed pose no problem*). Do I have to mess with htdig-dbgen? That file
makes almost no sense to me.

*How does htdig handle those foo/?=D-type auto-generated indexes (an Apache
thing?)? Watching the dig, I seem to remember a long run of *s (one search
script I ran indexed these as separate URLs).
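For what it's worth, the copy-then-dig cycle above can be sketched as a small
shell script. Everything here is an assumption: the database paths, the file
names, and the config file name stand in for whatever your installation uses
(the script sandboxes itself in temp directories so the idea is visible
without a real htdig install):

```shell
#!/bin/sh
# Sketch of the "refresh database two from database one" cycle.
# mktemp dirs stand in for the real database-one/database-two paths.
DB_ONE=$(mktemp -d)   # would be your database-one directory
DB_TWO=$(mktemp -d)   # would be your database-two directory

# pretend database-one already holds a completed dig
echo dummy > "$DB_ONE/db.docdb"

# refresh: wipe the second copy and re-seed it from the first
rm -f "$DB_TWO"/*
cp "$DB_ONE"/* "$DB_TWO"/

# the real run would now continue with:
#   htdig -c my_second_configfile
ls "$DB_TWO"
```

Run before each second dig, this guarantees the second database always starts
"fresh" from the current state of the first.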

> >just realized most/all my external URL's are in a flatfile db, can I do
> >something like:
> >
> >limit_urls_to:  ${start_url} `/path/to/my.db`
> >(assuming they are all links to specific pages, no http://foo/ URLs)?
> 
> Oh sure. As long as there's whitespace between each URL, that'll work 
> just fine.

Pipes on both sides, no spaces, one URL per record/line. I could create a
file, with just the URLs and the spaces added, from the db... OT newbie
question: could I put that Perl script in cron?
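Since the records are pipe-delimited with the URL between the first two
pipes, the extraction doesn't even need Perl; a one-line cut does it. The
sample records and the `my.db` layout here are assumptions about the
flatfile's format:

```shell
#!/bin/sh
# Sketch: turn pipe-delimited records into one whitespace-separated URL
# per line, ready for htdig's backquote file inclusion.
# Sample data stands in for the real flatfile (layout assumed: |URL|...|).
cat > my.db <<'EOF'
|http://example.com/one.html|rest of record|
|http://example.com/two.html|rest of record|
EOF

# field 1 is empty (before the first pipe), so the URL is field 2
cut -d'|' -f2 my.db > urls.txt
cat urls.txt
```

And yes, cron runs a Perl script the same way it runs anything else; a
crontab entry such as `0 3 * * 0 /path/to/make-urls.sh` (path hypothetical)
would regenerate the list before a weekly dig.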

Thanks,
Dave


------------------------------------
To unsubscribe from the htdig mailing list, send a message to
[EMAIL PROTECTED]
You will receive a message to confirm this.
