shubhendu rearranged electrons thusly:

> if i can set "robots = off" on my local machine and get whatever i want,
> it's not a proper way to protect the webserver from the load of
> downloading through tools like wget (which is what robots.txt is for).
> so what would be a foolproof method to prevent such downloads,
> while still providing access to the pages through browsers?

not protection - robots.txt is a "please keep off" courtesy request :)
for protection you'd use real access control - e.g. HTTP auth via .htaccess -
or keep the files in some other directory and use rsync or password-protected
ftp to access them.
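
a minimal sketch of both halves, assuming apache and wget - the path,
directory name and user below are just illustrative:

    # robots.txt - a courtesy request only; wget honours it by default,
    # but "robots = off" in ~/.wgetrc (or wget -e robots=off) walks right past it
    User-agent: *
    Disallow: /private/

    # .htaccess inside the directory you actually want to protect -
    # HTTP basic auth, which wget can't get around without the password
    AuthType Basic
    AuthName "members only"
    AuthUserFile /path/to/.htpasswd
    Require valid-user

create the password file with "htpasswd -c /path/to/.htpasswd someuser";
browsers still get in after a login prompt, while an anonymous wget run
just gets a 401.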

        -s

 
-- 
Suresh Ramasubramanian  <-->  mallet <at> efn <dot> org
EMail Sturmbannfuhrer, Lower Middle Class Unix Sysadmin


