>> the only potential problem is on sites that don't let wget download
>> due to the robot exclusion standard, which wget respects.
>>
>> --
> That shouldn't be a problem. I quote from the wget manual,
>
> If you know what you are doing and really really wish to turn off the robot
> exclusion, set the robots variable to ‘off’ in your .wgetrc. You can achieve
> the same effect from the command line using the ‘-e’ switch, e.g. ‘wget -e
> robots=off url...’.
>
I was talking about ethical issues, not technical issues.
I already know about that switch. If a site doesn't wish this to be
done, shouldn't one respect that? That's what I was talking about.
Gajendra

-- 
l...@iitd - http://tinyurl.com/ycueutm
