Hi,
I accidentally tried to recursively get files from the local file system
rather than the web. This resulted in a segmentation fault, not the
"Unsupported scheme" error message I get with -r.
$ echo : > /tmp/test.html
$ wget -r /tmp/test.html
Segmentation fault (core dumped)
wget version
Adam Klobukowski [EMAIL PROTECTED] writes:
If wget is used with the --input-file option, it gets a directory
listing for each file specified in the input file (if the ftp
protocol is used) before downloading each file,
This is not specific to --input-file, it
By the way, can you please clarify the intention behind AI_V4MAPPED
and AI_ALL, which the configure script tests for, but which nothing uses?
Tony Lewis [EMAIL PROTECTED] writes:
antonio taylor wrote:
http://firstname lastname:[EMAIL PROTECTED]
Have you tried http://firstname%20lastname:[EMAIL PROTECTED] ?
Or simply quotes, as in wget "http://firstname lastname:[EMAIL PROTECTED]".
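To make both suggestions concrete, here is a small shell sketch (the URL, username, and password are placeholders, not the poster's real credentials). Quoting keeps the shell from splitting the URL into two arguments; percent-encoding makes the URL itself valid:

```shell
# Placeholder URL with a space in the username.
url='http://firstname lastname:secret@example.com/'

# Fix 1: percent-encode the space so the URL is well-formed.
encoded=$(printf '%s' "$url" | sed 's/ /%20/g')
echo "$encoded"
# → http://firstname%20lastname:secret@example.com/

# Fix 2: quote the whole URL so the shell passes it as one argument.
# wget "$url"        # actual fetches commented out; example.com is a placeholder
# wget "$encoded"
```

Note that quoting alone only solves the shell-splitting problem; whether the server accepts a raw space in the userinfo part is a separate question, so the percent-encoded form is the safer of the two.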
Interestingly, I can't reproduce this. Still, to be on the safe side, I
added some additional constraints to the code that make it behave more
like the previous code, which worked. Please try again and see if it
works now. If not, please provide some form of debugging output as
well.
This
Does someone have access to a BeOS machine with a compiler? I'd like
to verify whether the current CVS works on BeOS, i.e. whether it's
still true that BeOS doesn't support MSG_PEEK.
Speaking of testing, please be sure to test the latest CVS on Windows
as well, where MSG_PEEK is said to be
On Wed, 26 Nov 2003, Hrvoje Niksic wrote:
Speaking of testing, please be sure to test the latest CVS on Windows as
well, where MSG_PEEK is said to be flaky. HTTPS is another thing that might
work strangely because SSL_peek is undocumented (!).
Out of curiosity, why are you introducing this
Daniel Stenberg [EMAIL PROTECTED] writes:
Out of curiosity, why are you introducing this peeking? I mean,
what's the gain?
Simplifying the code. Getting rid of the unfinished and undocumented
rbuf abstraction layer. Buffering is unnecessary when downloading
the body, and is mostly
Sample Windows MSVC binary compiled and a basic test performed (download of the same
site with HTTP and HTTPS; got exactly the same files).
Binary at the usual place; unfortunately my crappy ISP webserver seems to be
in Guru Meditation just now and refuses access (not the first problem after
the recent
Peter Kohts [EMAIL PROTECTED] writes:
4) When I'm doing a straightforward wget -m -nH http://www.gnu.org,
everything is excellent, except the redirections: the files which we
get because of the redirections overwrite any currently existing
files with the same filenames.
I see your point.
Peter GILMAN [EMAIL PROTECTED] writes:
first of all, thanks for taking the time and energy to consider this
issue. i was only hoping to pick up a pointer or two; i never
realized this could turn out to be such a big deal!
Neither did we. :-)
1) Jens' observation that the user will think