The difference in timestamps is due to a bug related to the handling of
daylight saving time. It's already been fixed in the current development
version of wget in the CVS archive. If you are interested, consult
http://sunsite.dk/wget/wgetdev.html for more info.
Dominic Caffey asked:
Is there a wget option to specify which HTTP method (GET or POST)
wget should use?
Wget doesn't support POST yet (only GET). It will soon, I hope. If you
search the wget mailing list archive, you'll find a couple of patches which
implemented POST, but they have not
Marc Stephenson asked:
I've built wget 1.7.1 with ssl, but don't really know how to test it.
Anybody know an easy way to test that combination?
Personally, I just tried to connect to https://www.apache-ssl.org/. I picked
that site because you don't need (as far as I know) a userid or password
Eugene Lee wrote:
For some reason, Apple's GCC does not like the '"' in its assert().
The problem is not with Apple's gcc, nor with the Darwin definition of the
assert() macro; it's with cpp-precomp, Apple's C pre-processor that
implements support for pre-compiled headers.
There are two
Excerpts from mail: (07-Aug-01) Range Request by Stefan Saroiu
I need to retrieve certain ranges of documents. For this, I'm using the
following wget flags:
wget --header='Range: bytes=100-' ip_address
The problem is that wget mistakes this command for a partial download and
Compiling wget on a Mac G3, OS 8.6, MachTen 4.1.1.
After converting the un-tarred files from Mac to UNIX, and running
./configure, typing make yields the following:
Did you use Stuffit Expander or some other Mac program to do your un-tarring?
If so, that's probably the root of your problem.
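If the source arrives as a gzipped tarball, extracting it with the standard Unix tools avoids any Mac-style conversion of the file contents. A minimal sketch (the archive name below is illustrative, not taken from the original report):

```shell
# Extract the wget source tarball with gunzip and tar rather than a
# Mac archive utility, so file contents are left byte-for-byte intact.
# The file name is illustrative.
gunzip -c wget-1.7.tar.gz | tar xf -
cd wget-1.7
```

After that, ./configure and make should see the files exactly as they were packaged.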
Is there a way to use HTTP POST with wget?
Unfortunately, no. Wget doesn't support POST yet (only GET). It will
someday, I hope. If you search the wget mailing list archive, you'll find a
couple of source code patches which have implemented the POST capability in
wget, but they have not been
Nathan J. Yoder wrote:
Please fix this soon,
wget -k http://reality.sgi.com/fxgovers_houst/yama/panels/panelsIntro.html
02:30:05 (23.54 KB/s) - `panelsIntro.html' saved [3061/3061]
Converting panelsIntro.html... zsh: segmentation fault (core dumped)
Anders Rosendal asked:
Could you make an option to only fetch from other hosts what is directly
referenced from the orig page?
Have you tried the --page-requisites (a.k.a. -p) command line option?
The info documentation says this:
Actually, to download a single page and all its
Denis Ahrens wrote:
In line 435 of html-parse.c there is an unescaped double quote (").
That's perfectly valid code.
I cannot compile this file under MacOSX without escaping this char.
That's a bug in cpp-precomp, Apple's C pre-processor that implements support
for pre-compiled headers. The way to
I'd like to use wget to create a static version of my
wget -m works great, except that it creates files like
Can I tell wget to convert those links, replacing all
'?' and '&' with '_'?
You'll have to modify the source code to wget, I think.
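That said, a post-processing pass over the mirrored files can do the renaming without touching wget's source. A hypothetical sketch, assuming a POSIX shell; note it renames files only and does not update the links inside the saved HTML:

```shell
# Rename mirrored files so that '?' and '&' in their names become '_'.
# The quoted '?' in the glob matches a literal question mark.
for f in ./*'?'* ; do
  [ -e "$f" ] || continue            # skip if the glob matched nothing
  new=$(printf '%s' "$f" | tr '?&' '__')
  mv -- "$f" "$new"
done
```

Run it in each directory of the mirror (or wrap it in a find loop) after wget -m finishes.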
On Mon, 24 Sep 2001, Edward J. Sabol wrote:
We've started to accumulate a fair number of patches which fix serious
problems in wget 1.7. It would be really nice to apply them to the CVS
archive so that they don't get lost.
Daniel Stenberg replied:
Based on previous mails on this list
Many thanks for your wget. I use a win32 port of the 1.7 version
(gnuwin32.sf.net/wget). I have tried to download a German version of the
Bible with wget -m http://biblewerk.de/bible and I got only the first