On Sat, 18 Jul 2009, Karl Vogel wrote:
> Date: Sat, 18 Jul 2009 19:34:24 -0400 (EDT)
> From: Karl Vogel
> To: freebsd-questions@freebsd.org
> Subject: Re: OT: wget bug
>
> >> On Sat, 18 Jul 2009 09:41:00 -0700 (PDT),
> >> "Joe R. Jah" said:
>
> J> Do you know of any workaround in wget, or an alternative tool to ONLY
> J> download newer files by http?
>
> "curl" can help for things like this. For example, if you're getting
> just a few files, fetch only the header and check [...]
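
A minimal sketch of the header-only check Karl describes, assuming the
example URL from Joe's original post: curl -s suppresses progress output
and -I sends a HEAD request, so no page body is transferred.

    #!/bin/sh
    # Ask the server for headers only and report when the page last changed.
    url="http://host.domain/Directory/file.html"
    curl -sI "$url" | grep -i '^Last-Modified:'

Comparing that date against a local file's mtime by hand is clumsy in sh;
curl's built-in time condition (sketched below) does the comparison itself.
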
On Sat, 18 Jul 2009, Andrew Brampton wrote:
> Date: Sat, 18 Jul 2009 18:09:54 +0100
> From: Andrew Brampton
> To: Joe R. Jah
> Cc: freebsd-questions@freebsd.org
> Subject: Re: OT: wget bug
>
> 2009/7/18 Joe R. Jah :
> > Thank you Andrew. Yes, the server is truly returning 401. I have already
> > reconfigured wget to download everything regardless of timestamps,
> > but it's a waste of bandwidth, because most of the site is unchanged.
> >
> > Do you know of any workaround in wget, or an alternative tool to ONLY
> > download newer files by http?
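
Joe's question has a direct answer in curl itself. A sketch, assuming the
same example URL and that file.html already exists locally from an earlier
full fetch: -z/--time-cond makes curl send If-Modified-Since based on the
named file's mtime, and -R/--remote-time stamps the saved file with the
server's Last-Modified date, so later runs compare against the right time.

    #!/bin/sh
    # Re-download file.html only if the server's copy is newer than ours.
    curl -R -z file.html -o file.html \
        "http://host.domain/Directory/file.html"

Unlike wget -m this does not crawl links, so for a whole site it would
have to be run once per known URL.
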
On Sat, 18 Jul 2009, Andrew Brampton wrote:
> Date: Sat, 18 Jul 2009 12:52:07 +0100
> From: Andrew Brampton
> To: Joe R. Jah
> Cc: freebsd-questions@freebsd.org
> Subject: Re: OT: wget bug
>
> 2009/7/17 Joe R. Jah :
> >
> > Hello all,
> >
> > I want to wget a site at regular intervals and only get the updated pages,
> > so I use this wget command line:
> >
> > wget -b -m -nH http://host.domain/Directory/file.html
> >
> > It works fine on the first try, but it fails on subsequent tries with the
> > following [...]
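
For reference, what Joe's command line asks wget to do, per wget(1):

    # -b   detach and run in the background, logging to wget-log
    # -m   mirror mode, shorthand for -r -N -l inf --no-remove-listing;
    #      -N (timestamping) makes the second and later runs check the
    #      server's timestamps first, the step this server answers with 401
    # -nH  don't create a top-level directory named after the host
    wget -b -m -nH http://host.domain/Directory/file.html
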
On Fri, 17 Jul 2009, John Nielsen wrote:
> Date: Fri, 17 Jul 2009 18:52:46 -0400
> From: John Nielsen
> To: freebsd-questions@freebsd.org
> Cc: Joe R. Jah
> Subject: Re: OT: wget bug
>
> On Friday 17 July 2009 06:12:33 pm Joe R. Jah wrote:
> > I want to wget a site at regular intervals and only get the updated
> > pages, so I use this wget command line:
> >
> > wget -b -m -nH http://host.domain/Directory/file.html
> >
> > It works fine on the first try, but it fails on subsequent tries with
> > the following [...]