On Fri, 25 Mar 2005 20:35:13 -0700
Joseph <[EMAIL PROTECTED]> wrote:

> How to restrict outsider to use "wget" on my web-page? 
You can add a robots.txt file and put the following in it:
User-agent: Wget
Disallow: /

This will keep wget from working on your site while still allowing search
engines to catalog the pages. You can specify multiple Disallow lines for
different directories.
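
For example, to block Wget from only a couple of directories (the
directory names below are placeholders, not anything from your site):

User-agent: Wget
Disallow: /private/
Disallow: /images/

A plain "Disallow: /" as above blocks the whole site for that agent,
while a trailing path restricts only that subtree.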

This isn't foolproof, as someone could add "robots = off" to their
/etc/wgetrc, or pass the -e robots=off option to wget to turn off
robots.txt processing. As some of the other responders pointed out,
you can't stop everyone, but this should be good enough to stop most
people from using wget on your site.
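
To illustrate the bypass mentioned above (a sketch; the URL is just a
placeholder):

# In ~/.wgetrc or /etc/wgetrc:
robots = off

# Or per invocation:
# wget -e robots=off -r http://your-site.example/

Either one makes wget ignore robots.txt entirely, which is why the
robots.txt approach only deters casual use.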


-- 
Eric Olinger

Public Key: http://pgp.mit.edu:11371/pks/lookup?op=get&search=0xF90FBBC1
Key fingerprint: B678 9E22 1161 51CF 6664 7591 6767 5BDB F90F BBC1

Give a man a password, he'll log in for a day. Teach him to code, and he
will hack his way in...
