On Thursday 07 April 2005 06:42 am, you wrote:
What would it take to get my VMS changes into the main code stream?
http://antinode.org/dec/sw/wget.html
hi steven,
well, as much as i really appreciate your work on VMS support for wget and
feel guilty for not having enough time to
On Monday 04 April 2005 09:48 am, Juhana Sadeharju wrote:
Hello.
The following document could not be downloaded at all:
http://www.greyc.ensicaen.fr/~dtschump/greycstoration/
If you succeed, please tell me how. I want all the HTML files
and the images.
could you please tell us which
Two points:
o some junk files are archived. (po/*.gmo and windows/*~)
o string_t remains in src/Makefile.in (does not build)
Otherwise it looks OK.
~/cvs/wget$ diff -xCVS -ur . /tmp/wget-1.10-alpha1/
Only in /tmp/wget-1.10-alpha1/: Branches
Only in /tmp/wget-1.10-alpha1/: configure.bat
Only in
Jörn Nettingsmeier wrote:
hi everyone !
i'm trying to set up a website monitoring tool for a university research
project. the idea is to use wget to archive politicians' websites once a
week to analyse their campaigns in the last 4 weeks before the election.
i have hit a few snags, and i would
On Tuesday 05 April 2005 03:16 am, FUJISHIMA Satsuki wrote:
Two points:
o some junk files are archived. (po/*.gmo and windows/*~)
sorry. i am really spoiled by automake, which automatically deletes junk files
from the final distribution.
o string_t remains in src/Makefile.in (does not build)
[EMAIL PROTECTED] writes:
I found a problem downloading pages from https://... URLs when I
have a connection only through a proxy. For non-HTTP protocols I use the
CONNECT method, but wget seems not to use it and accesses https
URLs directly. For http:// URLs wget downloads fine.
Can you tell me, is
On Tuesday 08 March 2005 05:36 pm, John R. Vanderpool wrote:
--proxy=on/off
Turn proxy support on or off. The proxy is on by default if the
appropriate environment variable is defined.
For more information about the use of proxies with Wget,
leaves ya hanging
On Sat, 05 Mar 2005 Hrvoje Niksic wrote:
-D filters the URLs encountered with -r. Specifying an input file is
the same as specifying those URLs on the command-line. If you need to
exclude domains from the input file, I guess you can use something
like `grep -v'.
Hi Hrvoje,
thanks - but
Martin Trautmann [EMAIL PROTECTED] writes:
I'm afraid URLs read from an input file can't be passed
through a -D filter? What's a reasonable behavior for combining -i
and -D?
-D filters the URLs encountered with -r. Specifying an input file is
the same as specifying those URLs on
Seb_kramm,
Is it just 0-success/1-failure,
So far, for many cases it is just that.
How can the script know if the page it's trying to fetch exists
or not?
If wget returned code 0, the page was downloaded and therefore
exists. :-) If it returned code 1, the page is not
Unfortunately, in some cases at least, wget return codes are nearly
meaningless. When trying to mirror an ftp site for example, the return
code appears to only indicate if the server could be contacted and has
nothing to do with the success of the requested operation.
The situation is probably
On Sunday 20 February 2005 22:47, Noèl Köthe wrote:
On Wednesday, 09.02.2005, 10:40 +0200, Nick Shaforostoff wrote:
hi, i've translated wget.1 to Russian
what should i do to get it added to the official wget distro?
Just send your ru.po or a patch/diff to the po/ru.po in the wget cvs to
On Monday 21 February 2005 01:27 pm, you wrote:
On Sunday 20 February 2005 22:47, Noèl Köthe wrote:
On Wednesday, 09.02.2005, 10:40 +0200, Nick Shaforostoff wrote:
hi, i've translated wget.1 to Russian
what should i do to get it added to the official wget distro?
Just send your
Mauro Tortonesi [EMAIL PROTECTED] writes:
i don't know what the correct procedure is to add a new translation
to a GNU project (hrvoje, do you have any ideas?),
I used to add translations for Croatian, both for Wget and for other
programs, so I should know, but I must admit that the details now
On Wednesday, 09.02.2005, 10:40 +0200, Nick Shaforostoff wrote:
hi, i've translated wget.1 to Russian
what should i do to get it added to the official wget distro?
Just send your ru.po or a patch/diff to the po/ru.po in the wget cvs to
[EMAIL PROTECTED]
--
Noèl Köthe noel
Sorry for the Dual post Steven, just realised I hadn't sent it to the list.
On Sat, 19 Feb 2005 11:26:16 +, Jonathan Share [EMAIL PROTECTED] wrote:
On Fri, 18 Feb 2005 22:43:50 -0600 (CST), Steven M. Schweda
[EMAIL PROTECTED] wrote:
In case it might be useful, I've included the -d
On Feb 18, 2005, at 9:16 PM, Mauro Tortonesi wrote:
On Saturday 12 February 2005 10:29 am, Chris Ross wrote:
The wget web page at www.gnu.org has a link for the mailing list that
doesn't work. So I'm emailing here.
which link? could you please tell me so that i can fix it?
Under Request an
On Friday 11 February 2005 07:26 am, Marco Colombo wrote:
yo!!!
I have just updated the italian translation (the first update was two
months ago). since the previous update didn't make its way into cvs, I
thought I would post a notice here.
you can find it here:
On Saturday 12 February 2005 10:29 am, Chris Ross wrote:
The wget web page at www.gnu.org has a link for the mailing list that
doesn't work. So I'm emailing here.
which link? could you please tell me so that i can fix it?
--
Aequam memento rebus in arduis servare mentem...
Mauro
Normand Savard wrote:
I have a question about wget. Is it possible to download attribute
values other than the hardcoded ones?
No, at least not in the existing versions of wget. I have not heard that
anyone is working on such an enhancement.
On Tue, Jan 25, 2005 at 02:44:16PM +, Andrew Robb wrote:
First of all, many thanks for a rock-solid utility!
OS: SuSE Linux 9.1:
rpm -q wget
wget-1.9.1-45
I am resuming the download of a DVD image but the sizes seem to overflow
a 32-bit signed integer.
wget -c
Quoting Tony O'Hagan [EMAIL PROTECTED]:
Original path: abc def/xyz pqr.gif
After wget mirroring: abc%20def/xyz pqr.gif (broken link)
wget --version is GNU Wget 1.8.2
This was a well-known error in the 1.8 versions of wget, which is already
corrected in the 1.9
On 11/01/2005, at 17:28, Daniel Stenberg wrote:
On Tue, 11 Jan 2005, Leonid wrote:
curl does not survive losing connection. Since the probability to
lose connection when you download 2Gb+ files is very high even if you
have a fast connection,
This mailing list is for wget, not curl. We
On Wed, 12 Jan 2005, Wincent Colaiuta wrote:
Daniel really needs to do one of two things:
Thanks for telling me what to do.
Your listing wasn't 100% accurate though. Am I not allowed to discuss
technical solutions for wget if that involves a term from a different Free
Software project I am
On 12/01/2005, at 14:33, Daniel Stenberg wrote:
On Wed, 12 Jan 2005, Wincent Colaiuta wrote:
Daniel really needs to do one of two things:
Thanks for telling me what to do.
I was just pointing out your hypocrisy because I found it offensive.
When you told Leonid to shut up, did he write back
At 14:06 on Wednesday 12 January 2005, Wincent Colaiuta wrote:
On 11/01/2005, at 17:28, Daniel Stenberg wrote:
On Tue, 11 Jan 2005, Leonid wrote:
curl does not survive losing connection. Since the probability to
lose connection when you download 2Gb+ files is very high even if
On Tue, 11 Jan 2005, Leonid wrote:
curl does not survive losing connection. Since the probability to lose
connection when you download 2Gb+ files is very high even if you have a fast
connection,
This mailing list is for wget, not curl. We can talk about what curl does and
does not on the curl
Daniel,
I apologize if I hurt your feelings about curl. Last summer I had
to download several 10Gb+ files and I tried to use curl and ncftp.
After a day or so of work curl was stopping, freezing forever and
I was unable to force it to retry and to resume. Maybe I misused curl,
did not understand
At 17:28 on Tuesday 11 January 2005, you wrote:
On Tue, 11 Jan 2005, Leonid wrote:
curl does not survive losing connection. Since the probability to lose
connection when you download 2Gb+ files is very high even if you have a
fast connection,
This mailing list is for wget, not curl.
On Tue, 11 Jan 2005, Mauro Tortonesi wrote:
oh, come on. let's not fall into the "my software is better than yours"
childish attitude.
I'm sorry if it came out that way, it was not my intention. I just wanted to
address the misinformation posted here.
I have not said and do not think that X is
At 21:47 on Tuesday 11 January 2005, Daniel Stenberg wrote:
On Tue, 11 Jan 2005, Mauro Tortonesi wrote:
oh, come on. let's not fall into the "my software is better than yours"
childish attitude.
I'm sorry if it came out that way, it was not my intention. I just wanted
to address the
At 13:41 on Friday 31 December 2004, you wrote:
Hello, here is the modification I've made to the code to prompt
the user when --passwd=- is set.
ChangeLog Modification
2004-12-31 Pere Ramos Bosch [EMAIL PROTECTED]
* main.c: Added parameter --passwd=PASS to let
On Sun, 9 Jan 2005, Denis Doroshenko wrote:
*size = strtol (respline + 4, NULL, 0);
where size is defined as long int * in the function's declaration. BTW,
why is the base given to strtol 0, not 10? Isn't that too flexible for
a defined protocol?
Yes it is, SIZE returns a base-10 number.
The
Sent: Friday, December 31, 2004 2:38 PM
To: Herold Heiko
Cc: Mudliar, Anand; [EMAIL PROTECTED]; 'Mauro Tortonesi'
Subject: RE: wGET - NTLM Support
On Fri, 31 Dec 2004, Herold Heiko wrote:
Daniel, could you resend that code to the current co-maintainer Mauro
Tortonesi [EMAIL PROTECTED] ? Maybe
On Mon, 3 Jan 2005, Mudliar, Anand wrote:
I would appreciate if you please let me know as soon as the code is posted
on the site.
It would be great help if you can point me to some site/link where the old
version is posted.
Let me just point out loud and clear that my files were not complete
There have been some discussions on that in the past.
IIRC there is no current implementation; however, please see the following messages in
the archives:
From: Daniel Stenberg [EMAIL PROTECTED]
Subject: RE: Is NTLM authentication with wget possible?
Date: 25/05/2004 13.14
On Tue, 25 May
Sent: Friday, December 31, 2004 2:09 PM
To: Mudliar, Anand; [EMAIL PROTECTED]; 'Daniel Stenberg'
Cc: 'Mauro Tortonesi'
Subject: RE: wGET - NTLM Support
There have been some discussions on that in the past.
IIRC there is no current implementation; however, please see the following messages
On Fri, 31 Dec 2004, Herold Heiko wrote:
Daniel, could you resend that code to the current co-maintainer Mauro
Tortonesi [EMAIL PROTECTED] ? Maybe sooner or later he finds some
time for this.
The files I prepared are no longer around (I put them up on a site in November
2003 and I fixed the FSF
Quoting Jan Minar [EMAIL PROTECTED]:
(2) Use alternative retrieval programs, such as pavuk, axel, or
ncftpget.
FWIW pavuk is much worse securitywise than wget. I've been working on patching
pavuk for a few months, and it has lots of strcpy() and sprintf() calls that
lead to buffer overflows,
On 09/12/2004, at 10:14, Jan Minar wrote:
(0) Wget authors are/were incompetent. Everything else is a corollary.
That's a very aggressive stance to take, and not likely to be
productive. Patches, for example, would be more productive.
-- Mauro Tortonesi in a private mail exchange with me
At 18:49 on Friday 3 December 2004, Aaron S. Hawley wrote:
There are old links to Wget on the web pointing to:
http://www.gnu.org/software/wget/wget.html
The FSF people have a nice symlink system for package web sites. Simply
add a file called `.symlinks' to Wget's CVS web repository
On Wed, 1 Dec 2004 [EMAIL PROTECTED] wrote:
Is there a way to use SSL authentication with ftp in wget?
AFAIK, wget doesn't support it.
But curl does: curl.haxx.se
--
-=- Daniel Stenberg -=- http://daniel.haxx.se -=-
ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol
It's one of those days =) I'm not on the list, so when I replied to my
own post I was the only one that got it.
-miah
- Forwarded message from miah [EMAIL PROTECTED] -
Date: Wed, 17 Nov 2004 11:51:26 -0500
From: miah [EMAIL PROTECTED]
To: miah [EMAIL PROTECTED]
Subject: Re: wget
Hi Leonid (and list!)
Thanks for the response, but there was nothing at the bottom of wget.h
that explained what the numeric error codes refer to. For example, a
common return code I get when wget fails is 256, but I do not know what
that means... I'd be happy to contribute by compiling a
At 14:04 on Wednesday 10 November 2004, Sergei Smirnov wrote:
Nice to meet you! [EMAIL PROTECTED],
wget dumps core; the core file is attached.
Following this step I get the core:
wget
ftp://download.fedora.redhat.com/pub/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso
length of the file --
Robert,
Is there a list somewhere that defines the numbered error codes
returned by wget?
Look at the bottom of src/wget.h
The names are rather descriptive. More details can be found in the
source code. If you could create a descriptive table and add it to the
documentation, other users will
Hi Gerriet!
Only three images, which were referenced in styles.css, were missing.
Yes, wget does not parse CSS or JavaScript.
I thought that the -p option causes Wget to download all the files
that are necessary to properly display a given HTML page. This includes
such things as inlined
On Mon, 18 Oct 2004, Gerriet M. Denkmann wrote:
So - is this a bug, did I misunderstand the documentation, did I use
the wrong options?
Reasonable request. You just couldn't find the archives:
http://www.mail-archive.com/[EMAIL PROTECTED]/msg06626.html
more:
Quoting Graham Leggett [EMAIL PROTECTED]:
In v1.9.1 of wget, it is not possible to retrieve files from an ftp
server that requires a username and password.
Hmm, worked always fine here (anonymous and non-anonymous) with
ftp://user:[EMAIL PROTECTED]/path-to-file
Best regards,
Jochen
% wget -q -O - http://www.gnu.org/software/wget/manual/wget-1.8.1/html_mono/wget.html
| html2text | head -15
** GNU Wget **
* The noninteractive downloading utility *
* Updated for Wget 1.8.1, December 2001 *
by Hrvoje Nikšić and the developers
there is typically plenty across all platforms)
-Original Message-
From: Jim Wright [mailto:[EMAIL PROTECTED]
Sent: Friday, October 01, 2004 07:23 PM
To: Jeff Holicky
Cc: [EMAIL PROTECTED]
Subject: Re: wget operational question
% wget -q -O -
http://www.gnu.org/software/wget/manual/wget-1.8.1
Hi Helmut!
I suspect there is a robots.txt that says index, no follow
Try
wget -nc -r -l0 -p -np -erobots=off
http://www.vatican.va/archive/DEU0035/_FA.HTM;
it works for me.
-l0 says: infinite recursion depth
-p means page requisites (not really necessary)
-erobots=off orders wget to ignore any
Hi Helmut!
Try
wget -nc -r -l0 -p -np -erobots=off
http://www.vatican.va/archive/DEU0035/_FA.HTM;
it works for me.
It also works for me. Thank you,
Helmut
--
On Wed, 21 Jan 2004 23:07:30 -0800, you wrote:
Hello,
I think I've come across a little bug in wget when using it to get a file
via ftp.
I did not specify the passive option, yet it appears to have been used
anyway. Here's a short transcript:
Passive FTP can be specified in /etc/wgetrc or
That's a bug in all released versions of Wget, sorry. In the next
release downloading files larger than 2G might become possible.
Eric Domenjoud [EMAIL PROTECTED] writes:
Under Mandrake linux 9.2, the command
wget -r -k http://www.cplusplus.com/ref/iostream
terminates with
wget: retr.c:263: calc_rate: Assertion `msecs >= 0' failed.
Aborted
This problem has been fixed in 1.9.1.
Maik,
As you could see from
http://www.mail-archive.com/wget%40sunsite.dk/msg06652.html and other
postings, this problem will be fixed in the next release of the official
version. If you need to download files larger than 2 GB now, get a
patched, unofficial version of wget from
[EMAIL PROTECTED] writes:
I think I have come across a bug with wget( ), although I cannot be
100% certain. Here is a simple instance of it (assuming it is
indeed a bug):
[results in core dump]
wget -r --no-parent -A.ptt --no-directories -nv
Hmm, sorry, I have just discovered that it was reported about a week
ago (http://www.mail-archive.com/wget%40sunsite.dk/msg06527.html). I really
did try to search for some overwrite, etc. in the archive, honestly. :-)
But that e-mail does not use the word overwrite at all...
Regards,
Manuel [EMAIL PROTECTED] writes:
http1.1
why not?
Because implementing it seemed like additional work for little or no
gain. This is changing as more server software assumes HTTP/1.1 and
bugs out on 1.0 clients.
If it works once, I would make a shell script running wget with the -r option to
replace the file, and use cron or at to run the shell script you make at 30-minute
intervals.
It looks like you're running Windows, so as long as it's NT 4.0 or better (i.e.
non-DOS-based) you'll have to use at
Subject: RE: wget hangs or downloads end up incomplete in Windows 2000 XP.
FYI. I noticed if I ctrl-c to get out of the hanging part and try to
resume, my FTP seems to be broken and hangs. I tried manually with ftp.exe
command in command line and it froze with dir command
Are you behind a firewall or proxy of some kind? If so, you might want to
try using passive FTP mode.
Mark Post
-Original Message-
From: Phillip Pi [mailto:[EMAIL PROTECTED]
Sent: Thursday, May 20, 2004 3:08 PM
To: [EMAIL PROTECTED]
Subject: RE: wget hangs or downloads end up
3:08 PM
To: [EMAIL PROTECTED]
Subject: RE: wget hangs or downloads end up incomplete in Windows 2000 XP.
FYI. I noticed if I ctrl-c to get out of the hanging part and try to
resume, my FTP seems to be broken and hangs. I tried manually with ftp.exe
command in command line and it froze
Thanks for forwarding this. The idea was for Wget to print the file
name it will write to, and yet to refrain from creating the file until
the data arrives.
One way to solve this is to use O_EXCL when opening the file, and
refusing to write to files that cannot be so opened. Essentially,
Wget
Doh, sorry for the double post, didn't realize that my email to
wget@sunsite.dk would show up on the gmane.comp.web.wget.general group.
Mike
Mike Hanby [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]
Howdy,
I've tried the following with wget 1.9.1 and 1.8.2
I have a scenario where
From: Phillip Pi [mailto:[EMAIL PROTECTED]
OK, I did more tests. I noticed -v is already enabled by
default since the
you probably have verbose=on in your wgetrc file.
5250K .. .. .. .. ..
The timestamp was from almost an hour ago (I was in a
Did you ever run the download with -v ?
What did the log say when wget seemed to hang, or regarding the missing or
corrupt files, or regarding the parsing of the directory index (or whatever
it was) linking to those files?
If nothing useful is logged, try again with -d (but be prepared, a huge
OK, I did more tests. I noticed -v is already enabled by default since the
results looked the same when I used -v switch. Next, I tried -d and -o
switches. The download got stuck. Here's the end of the log for the last
file being downloaded:
[snipped -- please note I changed some IP addresses,
David Fritz wrote:
Hmm, you might try upgrading to a newer version of mingw (see
http://www.mingw.org/).
Thanks, I wasn't aware of the fact that there's a newer version. I
downloaded MinGW from http://gnuwin.epfl.ch/apps/mingw/en/index.html.
They offer version 2.0.0.3.
Alternatively, you
Hrvoje Niksic wrote:
This patch should fix the problem. Please let me know if it works for
you:
2004-05-08 Hrvoje Niksic [EMAIL PROTECTED]
* ftp-basic.c (ftp_pwd): Handle PWD response not containing
double quote.
Index: src/ftp-basic.c
Axel Pettinger [EMAIL PROTECTED] writes:
I added the five lines to ftp-basic.c and recompiled Wget. Now I can say
that your patch is indeed working for me![1] Thank you very much.
BTW, I was a little bit confused because of the last line in your patch.
Instead of FREE_MAYBE (*pwd); my
Axel Pettinger wrote:
Hrvoje Niksic wrote:
This patch should fix the problem. Please let me know if it works
for you:
I would like to check it out, but I'm afraid I'm not able to compile
it.
Why not? What error are you getting?
I have not that much experience with compiling source code ...
David Fritz wrote:
Axel Pettinger wrote:
I have not that much experience with compiling source code ...
When I try to build WGET.EXE (w/o SSL) using MinGW then I get many
warnings and errors in utils.c and log.c,
Forgot to mention that the source is 1.9+cvs-dev-200404081407 ...
Axel Pettinger wrote:
David Fritz wrote:
Axel Pettinger wrote:
I have not that much experience with compiling source code ...
When I try to build WGET.EXE (w/o SSL) using MinGW then I get many
warnings and errors in utils.c
Forgot to mention that the source is 1.9+cvs-dev-200404081407 ...
Axel Pettinger [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
Axel Pettinger [EMAIL PROTECTED] writes:
Is there a reason for (or a solution to avoid it) the following
message: wget: strdup: Not enough memory. [1]
Does Wget exit after the error, or does it keep running?
Wget
Hrvoje Niksic wrote:
Perhaps it would be a good idea to provide `-d' output and recompile
Wget with memory debugging, and then do the same.
The following is Wget's debug output:
---
DEBUG output created by Wget 1.9.1 on Windows.
Axel Pettinger [EMAIL PROTECTED] writes:
The following is Wget's debug output:
[...]
Logged in!
== SYST ...
-- SYST
226 OK
done.
== PWD ...
-- PWD
226 OK
wget: strdup: Not enough memory.
I think I understand where the bug is. The server doesn't seem to
send PWD in the format
Hrvoje Niksic wrote:
I think I understand where the bug is. The server doesn't seem to
send PWD in the format the code expects (in fact, it doesn't seem to
be sending it at all).
So one could indeed say that it is a strange ftp server, and not only a
bug in the wget code ...
Axel Pettinger [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
I think I understand where the bug is. The server doesn't seem to
send PWD in the format the code expects (in fact, it doesn't seem to
be sending it at all).
So one could indeed say that it is a strange ftp server, and not
Hrvoje Niksic wrote:
This patch should fix the problem. Please let me know if it works
for you:
I would like to check it out, but I'm afraid I'm not able to compile
it.
Why not? What error are you getting?
I have not that much experience with compiling source code ... When I
try
Axel Pettinger [EMAIL PROTECTED] writes:
Is there a reason for (or a solution to avoid it) the following
message: wget: strdup: Not enough memory. [1]
Does Wget exit after the error, or does it keep running?
Arno Schuring [EMAIL PROTECTED] writes:
The manual (man wget) doesn't say anything about redirecting the logs to
stdout; but since -O - is explicitly mentioned I figured I could
use the same for -o.
Sorry about that. Since -o prints to stdout (ok, stderr) by default,
I didn't
Hrvoje Niksic wrote:
Axel Pettinger [EMAIL PROTECTED] writes:
Is there a reason for (or a solution to avoid it) the following
message: wget: strdup: Not enough memory. [1]
Does Wget exit after the error, or does it keep running?
Wget terminates itself after the error. Is it possible
::BUMP::
Anyone there?
From: Heywood Jablome [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Subject: Wget Has Wgotten away from me...heheh..
Date: Thu, 22 Apr 2004 15:06:51 -0600
After man'ing and RTFM'ing for a while, I thought I'd post this question to
the group. I'm having some trouble keeping
I'm afraid Wget doesn't understand JavaScript. As your example
demonstrates, it is impossible to extract URLs from JavaScript by
merely parsing it -- you need to actually execute it.
It seems the site has an anti-leech protection designed to throw off
Wget. Adding something like `-U Mozilla' seems to make it work.
I see, thanks.
- Software is like sex, it's better when it's free. -- Linus Torvalds -
On Sun, 18 Apr 2004, Hrvoje Niksic wrote:
It seems the site has an anti-leech
filesize limitation of fat32 is 2gb afaik
- Original Message -
From: Sebastian Armbrust [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, March 21, 2004 1:35 PM
Subject: Wget 1.9.1 resume problems
I discovered a problem with wget 1.9.1 (Windows version). I try to
download the
Rick Goyette [EMAIL PROTECTED] writes:
The local and remote files have different sizes, which I thought
(after reading the man page) should flag wget to grab it. But it
does not.
It should. Do you use HTTP or FTP to get the file? Can you post a
debug log (possibly edited for confidential
PROTECTED], [EMAIL PROTECTED]
Subject: Re: wget-cvs-ifmodsince.patch
Date: Sat, 28 Feb 2004 02:52:07 +0100
Craig Sowadski [EMAIL PROTECTED] writes:
My only concern about only checking modification date is when there
is an incomplete download, the local modification date is set to the
current time
No way, sorry.
wget does not support JavaScript, so there is no way to have it follow those
kinds of links.
Heiko
--
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED] [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax
-Original Message-
From: Raydeen A.
It surely would be nice if some day WGET could support JavaScript. Is that something
to put on the wish list, or is it substantially impossible to implement? Do folks
use JavaScript to load images in order to thwart 'bots such as WGET?
I run into the same problem regularly, and simply create a
Message-
From: Fred Holmes [mailto:[EMAIL PROTECTED]
Sent: Monday, March 15, 2004 3:09 PM
To: Herold Heiko; 'Raydeen A. Gallogly'; [EMAIL PROTECTED]
Subject: RE: Wget - relative links within a script call
aren't followed
It surely would be nice if some day WGET could support
On Thu, 4 Mar 2004, Hrvoje Niksic wrote:
Glen Sanft [EMAIL PROTECTED] writes:
This is not a problem which has just arisen in the recentest
release, but # in a path/file name, when properly escaped in a
URI, becomes itself in the filesystem, thus breaking the subsequent
local fetch.
David Fritz wrote:
Michael Bingel wrote:
Hi there,
I was looking for a tool to retrieve web pages and print them to
standard out. As a Windows user I tried wget from Cygwin, but it
created a file and I could not find the option to redirect output to
standard out.
Then I browsed through the
Michael Bingel [EMAIL PROTECTED] writes:
I thought great, problem solved, but Cygwin wget version 1.9 does
not accept -O, although the NEWS file does not state removal of
this feature.
-O is still there. How exactly are you invoking Wget and what error
message is it printing?
Your official
Michael Bingel wrote:
Hi there,
I was looking for a tool to retrieve web pages and print them to
standard out. As a Windows user I tried wget from Cygwin, but it created a
file and I could not find the option to redirect output to standard out.
Then I browsed through the online documentation
David Fritz [EMAIL PROTECTED] writes:
But, I'd guess you probably had a non-option argument before -O.
For a while now, the version of getopt_long() included with Cygwin
has had argument permutation disabled by default.
What on Earth were they thinking?! I've never considered the
possibility
Hrvoje Niksic wrote:
David Fritz writes:
But, I'd guess you probably had a non-option argument before -O.
For a while now, the version of getopt_long() included with Cygwin
has had argument permutation disabled by default.
What on Earth were they thinking?!
:) Well, ultimately, I can only