Andre Majorel [EMAIL PROTECTED] writes:
And weird ones, too. These arguments are of type pointer to
fd_set. What would HPUX like to see there?
HP-UX 10 wants (int *). However it defines fd_set as
struct
{
long[];
}
so it works anyway.
HP-UX 10 is wrong. SUS2 (and
Maciej W. Rozycki [EMAIL PROTECTED] writes:
Better yet:
#if !HAVE_DECL_H_ERRNO
extern int h_errno;
#endif
and use AC_CHECK_DECLS(h_errno,,,[#include <netdb.h>]) somewhere in
configure.in.
My version of Autoconf does not have an AC_CHECK_DECLS macro.
Maciej W. Rozycki [EMAIL PROTECTED] writes:
On Mon, 3 Dec 2001, Hrvoje Niksic wrote:
and use AC_CHECK_DECLS(h_errno,,,[#include <netdb.h>]) somewhere in
configure.in.
My version of Autoconf does not have an AC_CHECK_DECLS macro.
Hmm, how about considering autoconf 2.52?
Yes
Bugfixes since 1.8-beta2. Please test it from a clean compilation on
Unix. (Windows and MacOS are known not to compile without modifications
when SSL is used.)
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8-beta3.tar.gz
(The `.betas' directory is intentionally
Stefan Bender [EMAIL PROTECTED] writes:
... but at home I get
gen-md5.c:31: md5.h: No such file or directory
when I try to compile the newer cvs versions.
(debian/potato, openssl installed in /usr/local)
That's weird. Apparently HAVE_SOLARIS_MD5 gets misautodetected for
you.
Do you have a
Herold Heiko [EMAIL PROTECTED] writes:
Resubmit as attachment since my previous posts wrapped. Applies to
1.8-beta3 .
Thanks. I've applied this, because it's a good stop-gap solution.
But I still have some remarks...
* We're now using lrand48 on Unix and RAND_screen on Windows. This
to the callers that are interested, so it will be usable with
binary data.
2001-12-06 Hrvoje Niksic [EMAIL PROTECTED]
* utils.c (read_whole_line): Handle lines beginning with \0.
Index: src/utils.c
===
RCS file: /pack/anoncvs
William H. Gilmore [EMAIL PROTECTED] writes:
I have recently tripped across a bug with the version of wget shipped
with RedHat 7.2. When I attempt to recursively retrieve a web tree
starting with an html link that contains a base href, wget apparently
limits all href to base href even if
, which is then required to restart the va_list and call it
again. It doesn't rely on non-portable features like va_copy, and it
tries to retain readability.
If somebody has access to Linux/PPC, please try this patch and let me
know if Wget works.
2001-12-06 Hrvoje Niksic [EMAIL PROTECTED
Hrvoje Niksic [EMAIL PROTECTED]
* progress.c (progress_handle_sigwinch): Set up the signal again.
* utils.c: Include sys/termios.h, where Solaris defines
TIOCGWINSZ.
* progress.c (bar_create): Don't use the last column on the screen.
(create_image): Pad ETA
Herold Heiko [EMAIL PROTECTED] writes:
would insert `make distclean' before the second configure, or *at
least* `rm config.cache'.
Exactly my point - shouldn't everything either be handled gracefully, or
be cleaned up automatically when necessary (in ./configure?),
I don't think so.
Bettucci Fabrizio [EMAIL PROTECTED] writes:
Excuse me for sending you this mail, but I want to know how it is
possible to create a local sitemap, i.e. the directories and files
(with 0 length) of a web site.
I don't think you can do that with Wget. The best you can do is,
download everything and
I'm seriously considering a change in how Wget translations are
distributed: I want to uncouple translations from the source. First
I'll explain what I have in mind, and then I'll give my reasons. I
would like to hear whether you think this is a good idea, and why.
What:
Uncouple the
Martin v. Loewis [EMAIL PROTECTED] writes:
Please reconsider this change. I believe it will result in no
translation being distributed at all to users for some distributors,
since they will fail to integrate the translations.
Is there a precedent for this? And even when distributions make
Martin v. Loewis [EMAIL PROTECTED] writes:
If they were, anybody with write access could integrate them
But not after the release, because after the release the strings
have already changed.
Are you saying you will not integrate catalogs that you receive after
the release into the CVS?
No,
2001-12-08 Hrvoje Niksic [EMAIL PROTECTED]
* wget.texi (HTTP Options): Provide more specific information
about how --load-cookies is meant to be used.
Index: wget.texi
===
RCS file: /pack/anoncvs/wget/doc
Wget 1.8 is released. It should appear on ftp.gnu.org soon; until it
does, you can get it from:
ftp://ftp.gnjilux.hr/pub/unix/util/wget/wget-1.8.tar.gz
MD5 checksum of the archive is:
000caf43722b46df1f58b6fb2deb5b58
Please send bug reports to [EMAIL PROTECTED].
I will announce the
Hrvoje Niksic [EMAIL PROTECTED] writes:
Wget 1.8 is released. It should appear on ftp.gnu.org soon; until it
does, you can get it from:
ftp://ftp.gnjilux.hr/pub/unix/util/wget/wget-1.8.tar.gz
This is a typo: the actual URL is:
ftp://ftp.srk.fer.hr/pub/unix/util/wget/wget-1.8
for the
report!
2001-12-09 Hrvoje Niksic [EMAIL PROTECTED]
* main.c (main): Remove stray debugging message.
Index: src/main.c
===
RCS file: /pack/anoncvs/wget/src/main.c,v
retrieving revision 1.66
diff -u -r1.66 main.c
--- src
Karl Eichwalder [EMAIL PROTECTED] writes:
Hrvoje Niksic [EMAIL PROTECTED] writes:
Why is unpacking two tar files instead of one a complication?
At the moment, it just won't work (gettext limitation);
Can you elaborate on this? How can gettext know how many tar files
were involved
Roger L. Beeman [EMAIL PROTECTED] writes:
I would suggest that future tarballs could have the form
wget-1.8t1.0.tar.gz
The problem here is that a new release of translation requires a
new master Wget tarball. Which invites the possibility of
introducing bugfixes, changing messages, etc.
war [EMAIL PROTECTED] writes:
gcc -I. -I. -I/app/openssl-0.9.6b/include -DHAVE_CONFIG_H
-DSYSTEM_WGETRC=\"/app/wget-1.8/etc/wgetrc\"
-DLOCALEDIR=\"/app/wget-1.8/share/locale\" -O2 -Wall -Wno-implicit -c
gen-md5.c
In file included from gen-md5.c:31:
/usr/include/md5.h:27:
Could you send the
Erik Sigra [EMAIL PROTECTED] writes:
I have compiled the previous versions of Wget without any
problem. But version 1.8 introduced a problem; it can't find md5.h
when compiling gen-md5.c.
I have the file md5.h in /usr/local/ssl/include/openssl but the Wget
compilation seems to look in
[EMAIL PROTECTED] writes:
Today I downloaded the new wget release (1.8) (I'm a huge fan of the
util btw ;p ) and have been trying out the rate-limit feature.
[...]
assertion p - bp->buffer <= bp->width failed: file progress.c,
line 673
Thanks for the report. The bug shows with downloads whose
war [EMAIL PROTECTED] writes:
This file contains any messages produced by compilers while running
configure, to aid debugging if configure makes a mistake.
This configure run looks totally hosed.
The line we're looking for is the one that attempts to detect
MD5Update in libmd5:
Raúl Núñez de Arenas Coronado [EMAIL PROTECTED] writes:
I've downloaded, compiled and installed the new wget 1.8, and I have
a problem with the new progress bar (the default one).
When downloading, at some random point, an assertion fails at
file progress.c, line 673 (function
Raúl Núñez de Arenas Coronado [EMAIL PROTECTED] writes:
'p - bp->buffer <= bp->width'
Yes; thanks for the report. This patch should fix the problem:
Thanks a lot for the patch and for the speed :)
You're welcome. I'm usually not that fast, but as you can imagine,
you're not the
Raúl Núñez de Arenas Coronado [EMAIL PROTECTED] writes:
I've submitted a bug report minutes ago, about the progress bar,
with a failed assertion.
Note that the assertion itself is not at fault. The assertion just
makes sure that we don't silently print over the allocated region of
memory.
Summer Breeze [EMAIL PROTECTED] writes:
I want to know if Wget is a program similar to Mozilla, and if so is
there any way to make my pages available to Wget? I use Netscape to
create my web pages.
Wget is a command-line downloading utility; it allows you to download
a page or a part of the
[EMAIL PROTECTED] writes:
But it's as documented in the man page. The option is meant for
concatenating several pages into one big file, and you can't
meaningfully compare timestamps or file sizes in that case.
Ah, so this behaviour is by design. Even so, the behaviour is
slightly
Jun-ichiro itojun Hagino [EMAIL PROTECTED] writes:
hello, a patch to support IPv6 in wget 1.7 can be found at:
ftp://ftp.kame.net/pub/kame/misc/
Thanks. I must admit that I'm a bit overwhelmed by this patch.
Several questions:
* Would you be willing to take a look at the sources
[EMAIL PROTECTED] writes:
yes, I can redo patch for the latest CVS tree, but I'd need
more info on anoncvs access.
Take a look at http://wget.sunsite.dk/. It explains how to download
the latest sources from CVS.
[EMAIL PROTECTED] writes:
looking at 1.8, i noticed that host handling needs a total
re-engineering.
I don't believe this. But I could be missing something.
we should be carrying around sockaddrs (not in_addr)
for IPv4/v6 support
The struct address_list was
Fazal Majid [EMAIL PROTECTED] writes:
I just downloaded and compiled wget 1.8 on Solaris 8 Intel MU6 with
gcc 2.95.3 (configured to use Solaris ld and GNU as) and OpenSSL
0.9.6b. Thanks for folding the PRNG code into this release.
Excellent -- glad it works for you.
One problem I have
:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8.1-pre1.tar.gz
(The `.betas' directory is intentionally unreadable, but the file is
there.)
ChangeLog since last release:
2001-12-13 Hrvoje Niksic [EMAIL PROTECTED]
* version.c: Wget 1.8.1-pre1 is released.
2001-12-13
Pavel Stepchenko [EMAIL PROTECTED] writes:
Hello bug-wget,
$ wget --version
GNU Wget 1.8
$ wget
ftp://password:[EMAIL PROTECTED]:12345/Dir%20One/This.Is.Long.Name.Of.The.Directory/*
Warning: wildcards not supported in HTTP.
Oooops! But this is FTP url, not HTTP!
Are you using a
Pavel Stepchenko [EMAIL PROTECTED] writes:
Warning: wildcards not supported in HTTP.
Oooops! But this is FTP url, not HTTP!
HN Are you using a proxy?
Yes.
This means that HTTP is used for retrieval, and '*' won't work --
which is what Wget is trying to warn you about.
--17:26:58--
[EMAIL PROTECTED] writes:
sockaddr_in and sockaddr_in6 carry needless baggage such as the
port number and the family. There is no reason to cache all that.
to me, they are not needless, it is easier when we carry them
together.
I can't believe extracting the address would be
Zvi Har'El [EMAIL PROTECTED] writes:
Setting https_proxy to some proxy, and doing wget https://..., wget
dumps core:
Thanks for the report. This patch should remove the crash.
2001-12-13 Hrvoje Niksic [EMAIL PROTECTED]
* http.c (gethttp): Check for conn->scheme, not u->scheme
[EMAIL PROTECTED] writes:
I use a proxy server, and have a line in my .wgetrc that says
something like:
What version of Wget are you using? I believe this bug has been fixed
in Wget 1.7.1 and later.
By the way, your analysis is correct.
Sami Farin [EMAIL PROTECTED] writes:
when leeching one file, wget dumped core..
Thanks for the report. This is a known problem with the 1.8 release,
fixed by this patch:
Index: src/progress.c
===
RCS file:
[EMAIL PROTECTED] writes:
with IPv6-capable API host name resolution is done by
getaddrinfo(3) functions, not gethostbyname(3). they return
sockaddrs, therefore it is more natural/easier to carry around
sockaddrs.
It sounds like a random implementation convenience in
Mike Castle [EMAIL PROTECTED] writes:
I'm not on the list, just following via web archive.
In response to:
http://www.mail-archive.com/wget@sunsite.dk/msg02211.html
On my _homegrown_ Linux system, I'm also having an issue.
I have a libmd5, but no md5.h.
I believe this problem has
Peng GUAN [EMAIL PROTECTED] writes:
Maybe a bug in file fnmatch.c, line 54:
(n == string || ((flags & FNM_PATHNAME) && n[-1] == '/'))
the n[-1] should be changed to *(n-1).
In C n[-1] is exactly the same as *(n-1).
Hrvoje Niksic [EMAIL PROTECTED] writes:
Get it from:
ftp://gnjilux.srk.fer.hr/pub/unix/util/wget/.betas/wget-1.8.1-pre1.tar.gz
Something is wrong with that network. The prerelease is now also
available at:
http://muc.arsdigita.com:2005/wget-1.8.1-pre1.tar.gz
Volker Engels [EMAIL PROTECTED] writes:
command:
wget -o log -S -x --proxy=off ftp://ftp.nai.com/pub/antivirus/datfiles/4.x/
shows me the following log but does not download ANY file ! What's
wrong with it?
Which version of Wget are you using? Recent releases have been taught
to
Vin Shelton [EMAIL PROTECTED] writes:
I have successfully built wget-1.8 under linux-2.4.xx (at home) and
under SunOS-5.5, but under SunOS-4.1.4, I get a compile-time error:
../../../../src/wget-1.8/src/retr.c:682: `RAND_MAX' undeclared (first use in this function)
[...]
Please let me
Volker Engels [EMAIL PROTECTED] writes:
I tried out wget 1.6 and compiled wget 1.8. Both versions will not
work. Can anyone have a look at this command? Is it the right one to
mirror/download the files located in the ftp folder into the local
one?
wget -o log -S -x --proxy=off
Yuriy Markiv [EMAIL PROTECTED] writes:
Please tell me where to get the latest version of wget.
As always, the last released version should be available at
ftp.gnu.org:/pub/gnu/wget/.
Andre Majorel [EMAIL PROTECTED] writes:
5th edition, 6th edition, 7th edition and System III all returned
0-32767. As RAND_MAX didn't exist at the time, plenty of code must
have been written that assumed 0-32767. For that reason I think it
unlikely that anybody ever wrote an implementation
Alan Eldridge [EMAIL PROTECTED] writes:
[...]
DESCRIPTION
These interfaces are obsoleted by random(3).
I wrote the random-wait patch, and it used random(3) if that was
available on the system.
Yes, but if I remember correctly, your patch disabled the feature
completely if random was
Zvi Har'El [EMAIL PROTECTED] writes:
Although wget doesn't dump core, https through a proxy does not
work. Note that wget should send the proxy the HTTP header CONNECT
to establish an SSL tunnel. This doesn't happen, and instead it
sends "GET https://...", which is wrong!
Would CONNECT work with
-pre1:
2001-12-17 Hrvoje Niksic [EMAIL PROTECTED]
* version.c: Wget 1.8.1-pre2 is released.
2001-12-17 Hrvoje Niksic [EMAIL PROTECTED]
* retr.c (sleep_between_retrievals): Simplify indentation.
2001-12-17 Hrvoje Niksic [EMAIL PROTECTED]
* gen_sslfunc.c
Vladimir Volovich [EMAIL PROTECTED] writes:
while downloading some file (via http) with wget 1.8, i got an error:
assertion failed: p - bp->buffer <= bp->width, file progress.c, line 673
Abort (core dumped)
Thanks for the report. It's a known problem in 1.8, fixed by this
patch.
Index:
Holger Klawitter [EMAIL PROTECTED] writes:
I am using wget 1.5.3 under Linux (SuSE 7.1) and I discovered that
wget fails to parse netrc files if some words contain whitespace.
Wget 1.5.3 is old. I've now tried putting a quoted password in my
`.netrc', and it works for me with the latest
Ian Abbott [EMAIL PROTECTED] writes:
Although retrieve_tree() stores and retrieves referring URLs in the
URL queue, it does not pass them to retrieve_url(). This seems to
have got lost during the transition from depth-first to
breadth-first retrieval.
It was an oversight on my part.
://somesite/index.html http://somesite/a.html
Does it? For me this command retrieves only `index.html' and
`a.html', and that's a bug. `-i list' makes no difference.
For me, this patch fixes the bug in both cases:
2001-12-18 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (register_html): Maintain
Hrvoje Niksic [EMAIL PROTECTED] writes:
For me, this patch fixes the bug in both cases:
And introduces a new one. This patch is required on top of the
previous one. Or simply upgrade to the latest CVS.
2001-12-18 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Make
Alexey Aphanasyev [EMAIL PROTECTED] writes:
I got an error (see attachment) during latest CVS Wget
1.8.1-pre2+cvs compilation.
[...]
gcc -O2 -Wall -Wno-implicit -o wget cmpt.o connect.o cookies.o fnmatch.o ftp.o
ftp-basic.o ftp-ls.o ftp-opie.o getopt.o hash.o headers.o host.o html-parse.o
Alan Eldridge [EMAIL PROTECTED] writes:
There's a garbage newline output in http.c. A noticeable effect of
this is when updating a directory using -N, you get a blank line for
each file that is considered for download.
I don't think that's a garbage newline; that newline is intentional,
at
then it
correctly downloads a.html only once.
This is informative; thanks. Does this patch fix the problem:
2001-12-19 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Enqueue the canonical representation of
start_url, so that the test against dl_url_file_map works.
Index: src
Zvi Har'El [EMAIL PROTECTED] writes:
Even so, adding support for connect might be non-trivial in Wget's
hairy old HTTP code. I think it will have to wait for a cleanup of
the HTTP backend.
This is your decision, of course, but it should be understood that
right now you cannot use
is
there.)
ChangeLog since 1.8.1-pre2:
2001-12-19 Hrvoje Niksic [EMAIL PROTECTED]
* version.c: Wget 1.8.1-pre3 is released.
2001-12-19 Hrvoje Niksic [EMAIL PROTECTED]
* recur.c (retrieve_tree): Enqueue the canonical representation of
start_url, so that the test against
Thomas Reinke [EMAIL PROTECTED] writes:
We've noted in a few cases that wget can hang on connect() due to a
lack of any form of timeout management. We've made a change to the
routine connect_to_one in connect.c that will implement a
timeout mechanism on connect without the use of signals or
Mike [EMAIL PROTECTED] writes:
Ok thanks, so the full command sequence to
get all the files which have an extension of '.txt' from
http://www.domain.com/subdir1/subdir2 and place them
in my current directory is:-
wget -A *.txt -r -l 1 -nd http://www.domain.com/subdir1/subdir2
Vladimir Volovich [EMAIL PROTECTED] writes:
this is not strictly speaking a bug, but is an inconsistency.
when i run
wget -x http://some.host/path%20to%20file/file%20name.html
wget saves the result in some.host/path%20to%20file/file name.html
i.e. it decodes %-characters in
Alexey Aphanasyev [EMAIL PROTECTED] writes:
Something is very wrong here. Almost every single line of configure
output is cached. What version of Autoconf are you using?
autoconf-2.13
That version should work. Have you performed `make distclean' before
configuring? It sounds like some
Vladimir Volovich [EMAIL PROTECTED] writes:
Hrvoje The inconsistency is a bug. It is intended that Wget encodes
Hrvoje all the unsafe characters, both in files and directories.
Hrvoje (It is debatable whether that is a bug.) This patch makes it
Hrvoje consistent, but I will not apply it
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 19 Dec 2001, Hrvoje Niksic wrote:
But one problem with this implementation is portability -- I'm pretty sure
that some systems don't support FIONBIO.
Correct. Ancient ones it seems, I couldn't find a single modern
(eh, no don't ask me
Thomas Reinke [EMAIL PROTECTED] writes:
Again, I just never saw the point.
FWIW, as I mentioned to Hrvoje earlier off-line, it can be a reliability
issue. Without it, wget can hang and require some form of intervention
to terminate properly,
I guess I was just lucky never to encounter
Jens Rösner [EMAIL PROTECTED] writes:
I noticed that -nh (no host look-up) seems to be gone in 1.8.1.
Is that right?
That is correct.
At first I thought, Oh, you fool, it is -nH, you mixed it up
But, obviously, these are two different options.
Again, correct.
I read the news file and
Jens Rösner [EMAIL PROTECTED] writes:
I already posted this on the normal wget list, to which I am subscribed.
Problem:
-nh does not work in 1.8 latest windows binary.
By not working I mean that it is not recognized as a valid parameter.
(-nh is no-host look-up and with it on,
two domain
gfa2c [EMAIL PROTECTED] writes:
So wget 1.7.1 is inserting an error in my URL (it is dropping a
slash). How can I convince it to stop?
I'm afraid you can't.
Please do not tell me to put http_proxy in wgetrc:
You can also use the environment variable of the same name.
it does not work.
Jens Rösner [EMAIL PROTECTED] writes:
1. Is there then now a way to turn off -nh?
So that wget does not distinguish between domain names of the same
IP?
No; there is no longer a way to do that.
Or is this option irrelevant given the net's current structure?
I don't think that option was
FORSAGE [EMAIL PROTECTED] writes:
For some strange reason recent wget win32 compiles (1.6 and up) ignore -w
and -t keys in command line for me :((
ie: wget18.exe -w 60 -t 0 URL acts like wget18.exe URL,
waits and retries are left at default.
That's weird, because it seems to work for me
Wget 1.8.1 is released. As usual, it should appear on ftp.gnu.org after
a while; until it does, you can get it from:
ftp://ftp.srk.fer.hr/pub/unix/util/wget/wget-1.8.1.tar.gz
MD5 checksum of the archive is:
6ca8e939476e840f0ce69a3b31c13060
Please send bug reports to [EMAIL PROTECTED].
[ Please mail bug reports to [EMAIL PROTECTED], not to me directly. ]
Nuno Ponte [EMAIL PROTECTED] writes:
I get a segmentation fault when invoking:
wget -r
http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html
My Wget version is 1.7-3, the one which is
Edward Manukovsky [EMAIL PROTECTED] writes:
Excuse me, please, but I've got a question.
I cannot set retry timeout for 30 seconds by doing:
wget -w30 -T600 -c -b -t0 -S -alist.log -iurl_list
For me, Wget waits for 30 seconds between each retrieval. What
version are you using?
Jean-Edouard BABIN [EMAIL PROTECTED] writes:
I found a little bug when we download from a deleted directory:
[...]
Thanks for the report.
I wouldn't consider it a real bug. Downloading things into a deleted
directory is bound to produce all kinds of problems.
The diagnostic message could
Jiang Wei [EMAIL PROTECTED] writes:
I tried to download a whole directory on an FTP site by using `-r -np'
options, and I have to go through a firewall
via http_proxy/ftp_proxy. But I failed: wget-1.8.1 only retrieved the
first indexed ftp file list and stopped working, while wget-1.5.3 can
Thomas Reinke [EMAIL PROTECTED] writes:
Neat... not sure that I really know enough to start digging to easily
figure out what went wrong, but it can be reproduced by running the
following:
$ wget -d -r -l 5 -t 1 -T 30 -o x.lg -p -s -P dir -Q 500
--limit-rate=256000 -R mpg,mpeg
Robin B. Lake [EMAIL PROTECTED] writes:
I'm using wget to save a tick chart of a stock index each night.
wget -nH -q -O /QoI/working/CHARTS/$myday+OEX.html
'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24OEX&sid=0&o_symb=%24OEX&x=60&y=15&freq=9&time=1'
[...]
What is saved
Jochen Roderburg [EMAIL PROTECTED] writes:
This release 1.8.1 still has the problem (bug/feature?) that *unsafe*
characters are hex-encoded in local filenames.
Yes.
Any plans to repair this ?
For 1.9, hopefully.
Ryan Daniels [EMAIL PROTECTED] writes:
The following command line causes a Segfault on my system:
wget -spider http://www.yahoo.com
Note that the correct syntax is `--spider', and that this (currently
defunct) option does not accept arguments.
But the bug you've uncovered is real: you can
Thomas Reinke [EMAIL PROTECTED] writes:
Ok, either I've completely misread wget, or it has a problem
mirroring SSL sites. It appears that it is deciding that the
https:// scheme is something that is not to be followed.
That's a bug. Your patch is close to how it should be fixed, with two
Herold Heiko [EMAIL PROTECTED] writes:
But that's not the real issue here - why -i for input but not for others
? A consistent interface should allow something like --file-char=@
-@Rfilename -@Aotherfilename etc., i.e. accept a filename everywhere an
option is allowed.
This is a neat idea,
Peter Gucwa @ IIS-RTP [EMAIL PROTECTED] writes:
option -k does not work in following call:
wget -k -r -l 1 http://www.softcomputer.com/cgi/jobs.cgi
What version of Wget are you using?
How exactly does it not work? What did you expect to happen, and what
happened instead?
Robin B. Lake [EMAIL PROTECTED] writes:
In a prior posting, I asked about saving an image from a Web page
instead of just saving the information necessary to re-retrieve that
image. I was advised to try -p -k --html-extension
Using wget-1.8.1-pre2, I still don't see the image data saved
Brendan Ragan [EMAIL PROTECTED] writes:
This is the problem i'm having with an older wget (1.5.3) when i
enter the url
'http://www.tranceaddict.com/cgi-bin/songout.php?id=1217-dirty_dirty&month=dec'
it goes
Connecting to www.tranceaddict.com:80... connected!
HTTP request sent,
Ian Abbott [EMAIL PROTECTED] writes:
On 4 Jan 2002 at 12:22, Bastiaan Stougie wrote:
wget -P $LOCALDIR -m -np -nH -p --cut-dirs=2
http://host/dir1/dir2/
This works fine, except that wget does not follow all the urls. It
skips urls like:
<A HREF="//host/dir1/dir2/file">text</A>
Wow, I
Robin B. Lake [EMAIL PROTECTED] writes:
Someone kindly suggested the -k switch. Here's what I've done:
wget -nH -p -k -E -O OEX
'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24oex&sid=0&o_symb=%24oex&x=33&y=24'
Please note that `-O' does not work with `-p'.
Why is the
Jens Rösner [EMAIL PROTECTED] writes:
Can I use -P (Directory prefix) to save files in a user-determinded
folder on another drive under Windows?
You should be able to do that. Try `-P C:/temp/'. Wget doesn't know
anything about windows backslashes, so maybe that's what made it fail.
If it
Bastiaan Stougie [EMAIL PROTECTED] writes:
Executing rpm -ta --clean wget-1.8.1.tgz gives an error; after some
searching I discovered this is because the version in util/wget.spec is
incorrect: `Version: 1.7' should be `Version: 1.8.1'.
Furthermore, executing rpm -Fvh wget-1.8.1-1.i686.rpm
John Levon [EMAIL PROTECTED] writes:
moz wget-1.7 188 wget http://www.movementarian.org/oprofile-0.0.8.tar.gz
--20:35:51-- http://www.movementarian.org/oprofile-0.0.8.tar.gz
= `oprofile-0.0.8.tar.gz'
Connecting to www.movementarian.org:80... connected!
HTTP request sent,
Ivan Buttinoni [EMAIL PROTECTED] writes:
- for recursive retrieval, multiple simultaneous gets
This is very hard to do, not easy at all.
- last but not least: javascript support (eheheh)
And this is even harder. Javascript is a full programming language
which, as used by the sites,
Fred Holmes [EMAIL PROTECTED] writes:
Is there a syntax such that I can connect to the host once, transfer
the four files, and then disconnect?
Unfortunately, no, not yet.
LWS MAY be
removed without changing the semantics of the field value. Any LWS
that occurs between field-content MAY be replaced with a single SP
before interpreting the field value or forwarding the message
downstream.
Ok, how about this patch:
2002-01-14 Hrvoje Niksic [EMAIL
Brent Morgan [EMAIL PROTECTED] writes:
But I have a problem. I upgraded to 1.8.1 for win9x. I found the
cookie file for netscape 4 and 6 which are different from one
another. I made sure that each had the correct cookie set for the
website in question. I tried both and got the same error
Brent Morgan [EMAIL PROTECTED] writes:
The -d debug option crashes wget just after it reads the input file.
Huh? Ouch! Wget on Windows is much less stable than I imagined. Can
you run it under a debugger and see what causes the crash?