Markus Buchhorn [EMAIL PROTECTED] writes:
Reading back, that was itojun's proposal, and I suspect probably a
good choice, even if it seems less clean. Itojun is one of the leading
lights in IPv6 development, along with the whole WIDE group in Japan,
and heavily involved in the v6 stacks for
Daniel Stenberg [EMAIL PROTECTED] writes:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what my code extracts. By
extension, the idea was for
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what my code extracts.
Well, why extract the
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
I'd suggest that you instead pass around a 'struct hostent *' on
IPv4 only platforms
Why? The rest of the code never needs anything from `struct hostent'
except the list of addresses, and this is what
On 15 Jan 2002 at 0:27, Hrvoje Niksic wrote:
Brent Morgan [EMAIL PROTECTED] writes:
The -d debug option crashes wget just after it reads the input file.
Huh? Ouch! Wget on Windows is much less stable than I imagined. Can
you run it under a debugger and see what causes the crash?
I
Hi,
I worked on it :-)
The good thing is that HTTP now works with both IPv4 and IPv6 sites if compiled
with IPv6.
3 things have to be done now:
1. make a command line switch to change the default 4/6
2. if IPv6 is enabled and an IPv4 address is found, make an IPv6 address from it
( clean caching )
3. Make the
This is an initial proposal for naming the files and directories
that Wget creates, based on the URLs of the retrieved documents.
At the moment there are many complaints about Wget failing to save
documents which have '?' in their URLs when running under Windows,
for example. In general, the set
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
Well, why extract the addresses when you can just leave them in the
struct and pass a pointer to that?
Because I'm caching the result of the lookup, and making a deep copy of
`struct hostent' is not exactly easy. (Yes, I know libcurl does it, but
Daniel Stenberg [EMAIL PROTECTED] writes:
On Tue, 15 Jan 2002, Hrvoje Niksic wrote:
Well, why extract the addresses when you can just leave them in the
struct and pass a pointer to that?
Because I'm caching the result of the lookup, and making a deep
copy of `struct hostent' is not
Thomas Lussnig [EMAIL PROTECTED] writes:
Ok, first we don't need this difference. I think it's not as easy as it
first seems.
Because IPv6 is a superset of IPv4, there is a representation for IPv4
addresses.
But is it desirable to use it in preference to native IPv4 calls?
I apologize if
Rami Lehti [EMAIL PROTECTED] writes:
Wget should try to honor
Content-disposition: filename=foobar
HTTP-response header.
It is really a pain to try to download a file that is created by a script.
Usually the server gives the Content-disposition: header
You would have to save the server
Hi,
Now the socket part should work fine.
inet_pton and gethostbyname2 only get used if IPV6 is defined.
If IPV6 is defined and no v6 address is found, it also uses the v4 address :-)
What's left now is the Makefile, possibly a new command line switch, and :-( ftp.
And address printing. That I can do.
Cu Thomas
p.s. Is it now
Jonathan Davis [EMAIL PROTECTED] writes:
I recently successfully compiled and installed wget 1.8.1 on my box.
The new OS and architecture reads as follows: Mac OS X
(powerpc-apple-darwin5.2)
Thanks for the report; I've now updated MACHINES.
Boris [EMAIL PROTECTED] writes:
As proposed by Hrvoje, I have tried the retry option, but no change; every
time I get 'read error'.
I also tested with the new release for Windows (1.8.1), but it's the same thing
:(
I have no idea what could be going on. Perhaps a Windows person might
help? On
Dan Lavie [EMAIL PROTECTED] writes:
I have just downloaded and installed WGET on my OS-X.
You didn't say where you downloaded it from or how you installed it,
so I'll assume you're using the standard build process.
1- I can't find any documentation.
The documentation is in Info format,
praveen sirivolu [EMAIL PROTECTED] writes:
I have a doubt: when we use wget to recursively retrieve pages from the
internet, it's not bringing files with .shtml and .jhtml
extensions. Is this feature not implemented, or if it is there, could
somebody explain to me how to get those HTML pages?
They should
Thanks to everyone for looking at this problem. I am not a developer
and at my wits end with this problem. I did determine with a different
cookie required site that it is still not working.
I will keep my eye for future windows compilations and keep trying.
Brent Morgan
Oceaneering Space
Hello
The '%' character is valid within Win32 filenames. The '*' and '?' are not
valid filename characters.
The '*' and '?' are wildcard characters, which is probably why they were
excluded in previous versions.
There will always be problems mapping strings between namespaces, such as
URLs
Hi!
Once again I think this doesn't belong on the bug list, but there you
go:
I've toyed with the idea of making a flag to allow `-p' span hosts
even when normal download doesn't.
Funny you mention this.
When I first heard about -p (1.7?) I thought exactly that it would default
to that
I compiled wget 1.8.1 from source
However, running wget www.yahoo.com for example
(or just any URL) doesn't do anything. There is no
output on the next screen.
The machine I am running it on is:
SunOS iecsv 5.7 Generic_106541-16 sun4u sparc SUNW,Ultra-Enterprise
wget 1.8 works just fine.