hi Suresh

> 
> > what i did i searched for robots.txt in /etc/wgetrc file and found 
> > that robots = on 
> 
> correctly so.  robots.txt is the web admin's way of saying "keep off" to
> webcrawlers etc.
> 
> > so what is it exactly?
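
For context, robots.txt is just a plain-text file served from the site root (e.g. http://example.com/robots.txt); a minimal sketch, with illustrative paths:

```
# robots.txt -- a request, not an enforcement mechanism
User-agent: *        # these rules apply to all crawlers
Disallow: /private/  # ask crawlers to stay out of /private/
```

Compliance is entirely voluntary on the client side, which is exactly the point raised below.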


If I can set "robots = off" on my local machine and download whatever I want,
then robots.txt is not a proper way to protect a webserver from the load of
bulk downloading through tools like wget (which is what robots.txt is meant for).
So what would be a foolproof method to prevent such downloads,
while still providing access to the pages through browsers?
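
There is no truly foolproof method, since a determined client can mimic a browser. A common partial mitigation, assuming an Apache 2.2-style server (an assumption, not something stated in this thread), is to reject requests whose User-Agent header identifies a bulk downloader:

```apache
# Hypothetical Apache config snippet (mod_setenvif + mod_access).
# Flags requests whose User-Agent begins with "Wget"...
SetEnvIfNoCase User-Agent "^Wget" bulk_downloader

# ...and denies them, while ordinary browser requests pass through.
<Limit GET POST>
    Order Allow,Deny
    Allow from all
    Deny from env=bulk_downloader
</Limit>
```

Note the obvious hole: wget can spoof its User-Agent (its --user-agent option), so in practice this is combined with rate limiting or connection throttling rather than relied on alone.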

shubh


----------------------------------------------
An alpha version of a web based tool to manage
your subscription with this mailing list is at
http://lists.linux-india.org/cgi-bin/mj_wwwusr
