According to Geoff Hutchison:
> On Thu, 10 Jun 1999, Bill Moninger wrote:
> > bogus dns entry (say 'alien.xyz') and run htdig on that.  But this requires
> > dedicating a computer, and may have other problems, such as acceptability
> > to our systems staff.
> 
> There isn't a great way to do this, unless it's possible to group the URLs
> into restricted and unrestricted. For those who might complain that there
> should be an elegant solution, remember that the client (i.e. htdig) has
> no way of knowing which pages it received because it was in the right
> domain!
> 
> If you can group the URLs, even slightly, then you can use restrict and
> exclude to set the pages you want.

The restrict and exclude input parameters to htsearch can easily be
overridden by the user.  If you're worried about users seeing restricted
data in the excerpts of search results, it would be safer to build
separate restricted and unrestricted databases.  Use a config file known
only to authorised users to access the restricted database, and keep the
search form that references that config in a restricted area.  In this
case, you'd use the limit_urls_to and exclude_urls attributes to tell
htdig which documents to index for each of the two databases.
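
For illustration, the pair of config files might look something like
this (the paths, URLs and file names are made up for the example; adjust
them to your site):

    # restricted.conf -- digs and searches the restricted database
    database_dir:   /opt/www/htdig/db.restricted
    start_url:      http://www.example.com/private/
    limit_urls_to:  http://www.example.com/private/

    # unrestricted.conf -- digs and searches the public database
    database_dir:   /opt/www/htdig/db.public
    start_url:      http://www.example.com/
    limit_urls_to:  http://www.example.com/
    exclude_urls:   /private/

Run htdig separately with each file (htdig -c restricted.conf, then
htdig -c unrestricted.conf), and have the search form in the protected
area pass config=restricted to htsearch (e.g. a hidden "config" input),
while the public form uses the unrestricted config.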

-- 
Gilles R. Detillieux              E-mail: <[EMAIL PROTECTED]>
Spinal Cord Research Centre       WWW:    http://www.scrc.umanitoba.ca/~grdetil
Dept. Physiology, U. of Manitoba  Phone:  (204)789-3766
Winnipeg, MB  R3E 3J7  (Canada)   Fax:    (204)789-3930
------------------------------------
To unsubscribe from the htdig mailing list, send a message to
[EMAIL PROTECTED] containing the single word "unsubscribe" in
the SUBJECT of the message.
