On Tuesday 25 November 2014 16:36:27 Dun Peal wrote:
> I'm using wget to mirror an old website with all of its
> --page-requisites. Unfortunately, one of the domains which used to
> serve some of these requisites is now offline. Every time wget tries
> to get anything from that domain, it blocks for 10+ seconds until the
> connection finally times out.
>
> This issue is substantially exacerbated by the 20 retries. I want to
> keep the default retry count at 20, but I know this one particular
> domain is never going to respond.
>
> Is there a way to tell wget to ignore that domain entirely, i.e. not
> try to fetch anything from there?
--exclude-domains domain-list
    Specify the domains that are not to be followed.
Also, see 'man wget'.
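For example, assuming the dead host were oldcdn.example.com (a placeholder name, not from the original message), the mirror command might look like:

```shell
# Mirror the site with page requisites, but never follow links
# to the dead domain. oldcdn.example.com and www.example.com are
# hypothetical hosts standing in for the real ones.
wget --mirror --page-requisites --convert-links \
     --exclude-domains oldcdn.example.com \
     https://www.example.com/
```

Note that --exclude-domains accepts a comma-separated list, so several dead hosts can be skipped in one option.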
Tim
