On Fri, Dec 31, 2010 at 8:12 AM, gajendra khanna <[email protected]> wrote:
> The only potential problem is on sites which don't let wget download
> due to the robot exclusion standard, which wget respects.
>
> --
> l...@iitd - http://tinyurl.com/ycueutm

That shouldn't be a problem. I quote from the wget manual:

    If you know what you are doing and really really wish to turn off the
    robot exclusion, set the robots variable to 'off' in your .wgetrc. You
    can achieve the same effect from the command line using the -e switch,
    e.g. 'wget -e robots=off url...'.

For more info, see
http://www.gnu.org/software/wget/manual/wget.html#Robot-Exclusion

--
Mihir Mehta,
B. Tech. student,
Department of Computer Science and Engineering,
Indian Institute of Technology, Delhi.
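Concretely, the .wgetrc setting the manual refers to is a single line (assuming the default per-user config file at ~/.wgetrc):

    # in ~/.wgetrc -- make wget ignore robots.txt for all invocations
    robots = off

The -e switch just applies the same wgetrc-style command once, so the one-off form and the config-file form are equivalent.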
