Hello,

This is a kludge, but you could temporarily short-circuit the problematic
domain/hostname to localhost in your /etc/hosts file or, on Windows, the
equivalent file at C:\Windows\System32\drivers\etc\hosts.
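
For example, assuming the offline domain is dead.example.com (a
placeholder; substitute the real hostname), one added line would do it:

```
# Placeholder hostname: replace dead.example.com with the actual offline domain.
127.0.0.1   dead.example.com
```

Assuming nothing on your machine is listening on port 80/443, each
request to that host should then fail almost instantly with "connection
refused" instead of hanging until the connection times out, so even 20
retries finish quickly. Remember to remove the line when you're done.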

/Pär



2014-11-26 1:36 GMT+01:00 Dun Peal <[email protected]>:

> I'm using wget to mirror an old website with all of its
> --page-requisites. Unfortunately, one of the domains that used to
> serve some of these requisites is now offline. Every time wget tries
> to get anything from that domain, it blocks for 10+ seconds until the
> connection finally times out.
>
> This issue is substantially exacerbated by the 20 retries. I want to
> keep the default retry count at 20, but I know this one particular
> domain is never going to respond.
>
> Is there a way to tell wget to ignore that domain entirely, i.e. not
> try to fetch anything from there?
>
> If not, can I at least apply different configurations to request from
> that domain, for example specify 0 retries for it?
>
> Otherwise, any other suggested solutions for this problem?
>
> Thanks, D.
>
>
