HARPREET SAWHNEY wrote:
Hi,
I am getting a strange bug when I use wget to download a binary file
from a URL versus when I manually download.
The attached ZIP file contains two files:
05.upc --- manually downloaded
dum.upc---
HARPREET SAWHNEY wrote:
Hi,
Thanks for the prompt response.
I am using
GNU Wget 1.10.2
I tried a few things on your suggestion but the problem remains.
1. I exported the cookies file in Internet Explorer and specified
that in the Wget
---BeginMessage---
Hi, I am using wget 1.10.2 on Windows 2003, and I have the same problem as Cantara.
The file system is NTFS.
Well, I find my problem is, I wrote the command in Scheduled Tasks like this:
wget -N -i D:\virus.update\scripts\kavurl.txt -r -nH -P
d:\virus.update\kaspersky
well, after
Hrvoje Niksic wrote:
Subject: Re: Wget Bug: recursive get from ftp with a port in the url fails
From: baalchina [EMAIL PROTECTED]
Date: Mon, 17 Sep 2007 19:56:20 +0800
To: [EMAIL PROTECTED]
Message-ID: [EMAIL PROTECTED]
MIME-Version: 1.0
Content-Type:
On Mon, 9 Jul 2007 15:06:52 +1200
[EMAIL PROTECTED] wrote:
wget under win2000/win XP
I get "No such file or directory" error messages when using the following
command line.
wget -s --save-headers
http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc
%1 = 212BI
Any ideas?
hi
Mauro Tortonesi wrote:
On Mon, 9 Jul 2007 15:06:52 +1200
[EMAIL PROTECTED] wrote:
wget under win2000/win XP
I get "No such file or directory" error messages when using the following
command line.
wget -s --save-headers
http://www.nndc.bnl.gov/ensdf/browseds.jsp?nuc=%1&class=Arc
%1 = 212BI
Any ideas?
thank you
Dr Nikolaus Hermanspahn
Advisor (Science)
National Radiation
Highlord Ares wrote:
it tries to download web pages named similar to
http://site.com?variable=yes&mode=awesome
Since '&' is a reserved character in many command shells, you need to quote
the URL on the command line:
wget
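A minimal illustration of the quoting advice above, using the hypothetical URL from the thread. Unquoted, the shell treats '&' as a background operator and splits the command line; quoted, the full query string reaches wget intact:

```shell
# Hypothetical URL from the thread. Single quotes keep the '&' inside
# the argument instead of letting the shell interpret it:
url='http://site.com?variable=yes&mode=awesome'
printf '%s\n' "$url"
# The real invocation would then be:  wget "$url"
```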
when I run wget on certain sites, it tries to download web pages named
similar to http://site.com?variable=yes&mode=awesome. However, wget isn't
saving any of these files, no doubt because of some file-naming issue? This
problem exists in both the Windows and Unix versions.
hope this helps
PROTECTED] On Behalf Of Highlord Ares
Sent: Thursday, May 24, 2007 11:41
To: [EMAIL PROTECTED]
Subject: wget bug
when I run wget on certain sites, it tries to download web pages named
similar to http://site.com?variable=yes&mode=awesome. However, wget isn't
saving any of these files, no doubt
Hi,
I am trying to download a Wiki category for off-line browsing,
and am using a command-line like this:
wget http://wiki/Category:Fish -r -l 1 -k
Wiki categories contain colons in their filenames, for example:
Category:Fish
If I request that wget convert absolute paths to relative links,
Paul Bickerstaff [EMAIL PROTECTED] wrote in
news:[EMAIL PROTECTED]:
I'm using wget version GNU Wget 1.10.2 (Red Hat modified) on a fedora
core5 x86_64 system (standard wget rpm). I'm also using version 1.10.2b
on a WinXP laptop. Both display the same faulty behaviour which I don't
believe
Well, this really isn't a bug per se... but whenever you set -q for no output, it still creates a wget log file on the desktop.
From dev:
I checked and the .wgetrc file has continue=on. Is there any way to
suppress sending of the byte-range request? I will read through the
email and see if I can gather some more information that may be needed.
Remove continue=on from .wgetrc?
Consider:
-N, --timestamping
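A sketch of the suggested fix, demonstrated on a throwaway copy (the real file would be ~/.wgetrc): commenting out the continue setting stops wget from issuing byte-range requests for files it thinks are partial.

```shell
# Demonstration on a temporary file standing in for ~/.wgetrc:
rc=$(mktemp)
printf 'continue=on\ntries=3\n' > "$rc"
# Comment the offending line out in place (writes a .bak backup):
sed -i.bak 's/^continue=on/# continue=on/' "$rc"
cat "$rc"
```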
I was running wget to test mirroring an internal development site, and
using large database dumps (binary format) as part of the content to
provide me with a large number of binary files for the test. For the
test I wanted to see if wget would run and download a quantity of 500K
files with
1. It would help to know the wget version (wget -V).
2. It might help to see some output when you add -d to the wget
command line. (One existing file should be enough.) It's not
immediately clear whose fault the 416 error is. It might also help to
know which Web server is running on the
Hello,
We are using version 1.10.2 of wget under Ubuntu and Debian. We have many
scripts that get some images from a cacti site. These scripts ran perfectly
with version 1.9 of wget, but they cannot get images with version 1.10.2.
Here you can find an example of our
Jesse Cantara [EMAIL PROTECTED] writes:
A quick resolution to the problem is to use the -nH command line
argument, so that wget doesn't attempt to create that particular
directory. It appears as if the problem is with the creation of a
directory with a ':' in the name, which I cannot do
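The suggested invocation would then look like the sketch below (URL hypothetical, needs network, so it is shown as a comment); later wget versions also offer --restrict-file-names=windows to map such characters. The underlying issue, which can be checked locally, is that Windows filesystems reject ':' in file and directory names:

```shell
# Suggested invocation from the thread (hypothetical URL):
#   wget -r -l 1 -k -nH 'http://wiki/Category:Fish'
# A simple pre-flight check for names NTFS cannot store:
name='Category:Fish'
case $name in
  *:*) verdict="unsafe on Windows: $name" ;;
  *)   verdict="safe: $name" ;;
esac
printf '%s\n' "$verdict"
```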
I've encountered a bug when trying to do a recursive get from an ftp site with a non-standard port defined in the url, such as ftp.somesite.com:1234. An example of the command I am typing is:
wget -r ftp://user:[EMAIL PROTECTED]:4321/Directory/*
Where Directory contains multiple subdirectories, all
Hi folks,
I think I have found a bug in wget where it fails to change the working
directory when retrying a failed ftp transaction. This is wget 1.10.2 on
FreeBSD-6.0/amd64.
I was trying to use wget to get files from a broken ftp server which
occasionally sends garbled responses, causing
[EMAIL PROTECTED] (Steven M. Schweda) writes:
and adding it fixed many problems with FTP servers that log you in
a non-/ working directory.
Which of those problems would _not_ be fixed by my two-step CWD for
a relative path? That is: [...]
That should work too. On Unix-like FTP servers,
From: Hrvoje Niksic
[...] On Unix-like FTP servers, the two methods would
be equivalent.
Right. So I resisted temptation, and kept the two-step CWD method in
my code for only a VMS FTP server. My hope was that someone would look
at the method, say "That's a good idea," and change the if
Hello,
current wget seems to have the following bug in the ftp retrieval code:
When called like:
wget user:[EMAIL PROTECTED]/foo/bar/file.tgz
and foo or bar is a read/execute protected directory while file.tgz is
user-readable, wget fails to retrieve the file because it tries to CWD
into the
Arne Caspari [EMAIL PROTECTED] writes:
When called like:
wget user:[EMAIL PROTECTED]/foo/bar/file.tgz
and foo or bar is a read/execute protected directory while file.tgz is
user-readable, wget fails to retrieve the file because it tries to CWD
into the directory first.
I think the correct
Hrvoje Niksic wrote:
Arne Caspari [EMAIL PROTECTED] writes:
I believe that CWD is mandated by the FTP specification, but you're
also right that Wget should try both variants.
i agree. perhaps when retrieving file A/B/F.X we should try to use:
GET A/B/F.X
first, then:
CWD A/B
GET F.X
if
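The proposed order can be sketched as shell control flow (stub functions only; wget itself is C, and the stubs simulate a server where CWD into a protected directory is denied but a direct RETR of the full path succeeds):

```shell
# Stubs simulating a protected-directory FTP server:
ftp_cwd() { return 1; }                  # CWD into A/B is denied
ftp_get() { echo "RETR $1"; return 0; }  # direct RETR is allowed
fetch() {
  # try a direct GET of the full path first ...
  if ftp_get "A/B/F.X"; then
    result="direct GET worked"
    return 0
  fi
  # ... and only fall back to CWD + GET if that fails
  ftp_cwd "A/B" && ftp_get "F.X" && result="fallback CWD+GET worked"
}
fetch
printf '%s\n' "$result"
```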
Thank you all for your very fast response. As a further note: When this
error occurs, wget bails out with the following error message:
"No such directory foo/bar."
I think it should instead be "Could not access foo/bar: Permission
denied" or similar in such a situation.
/Arne
Mauro Tortonesi
Mauro Tortonesi [EMAIL PROTECTED] writes:
Hrvoje Niksic wrote:
Arne Caspari [EMAIL PROTECTED] writes:
I believe that CWD is mandated by the FTP specification, but you're
also right that Wget should try both variants.
i agree. perhaps when retrieving file A/B/F.X we should try to use:
GET
Hrvoje Niksic [EMAIL PROTECTED] writes:
That might work. Also don't prepend the necessary prepending of $CWD
to those paths.
Oops, I meant don't forget to prepend
From: Hrvoje Niksic
Also don't [forget to] prepend the necessary [...] $CWD
to those paths.
Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to
those paths.
As you might recall from my changes for VMS FTP servers (if you had
ever looked at them), this scheme causes no end
On Fri, 25 Nov 2005, Steven M. Schweda wrote:
Or, better yet, _DO_ forget to prepend the trouble-causing $CWD to those
paths.
I agree. What good would prepending do? It will most definitely add problems
such as those Steven describes.
--
-=- Daniel Stenberg -=-
From: Hrvoje Niksic
Prepending is already there,
Yes, it certainly is, which is why I had to disable it in my code for
VMS FTP servers.
and adding it fixed many problems with
FTP servers that log you in a non-/ working directory.
Which of those problems would _not_ be fixed by my
Begin forwarded message:
From: [EMAIL PROTECTED]
Date: October 4, 2005 4:36:09 AM GMT+02:00
To: [EMAIL PROTECTED]
Subject: failure notice
Hi. This is the qmail-send program at sunsite.dk.
I'm afraid I wasn't able to deliver your message to the following
addresses.
This is a permanent
Sorry for the crosspost, but the wget Web site is a little confusing on the
point of where to send bug reports/patches.
Just installed wget 1.10 on Friday. Over the weekend, my scripts failed with
the
following error (once for each wget run):
Assertion failed: wget_cookie_jar != NULL, file
Arndt Humpert [EMAIL PROTECTED] writes:
wget, win32 rel. crashes with huge files.
Thanks for the report. This problem has been fixed in the latest
version, available at http://xoomer.virgilio.it/hherold/ .
Hello,
wget, win32 rel. crashes with huge files.
regards
[EMAIL PROTECTED]
== Command Line
wget -m
Title: WGET Bug?
#
C:\Grabtest\wget.exe -r --tries=3 http://www.xs4all.nl/~npo/ -o C:/Grabtest/Results/log
#
--16:23:02-- http://www.xs4all.nl/%7Enpo
OS = Solaris 8
Platform = Sparc
Test command = /usr/local/bin/wget -r -t0 -m ftp://root:[EMAIL PROTECTED]/usr/openv/var
The directory contains some sub-directories and files to synchronize.
Example:
# ls -la /usr/openv/
total 68462
drwxr-xr-x 14 root bin 512 set 1 17:52
Quoting Tony O'Hagan [EMAIL PROTECTED]:
Original path: abc def/xyz pqr.gif
After wget mirroring: abc%20def/xyz pqr.gif (broken link)
wget --version is GNU Wget 1.8.2
This was a well-known error in the 1.8 versions of wget, which is already
corrected in the 1.9
Recently I used the following wget command under a hosted linux account:
$ wget -mirror url -o mirror.log
The web site contained files and virtual directories that contained spaces
in the names.
URL encoding translated these spaces to %20.
wget correctly URL decoded the file names (creating
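A local illustration of the mismatch being described (paths made up, no wget involved): the file lands on disk with a real space, while a saved link that still uses the %20 form points at a name that does not exist.

```shell
d=$(mktemp -d)
cd "$d"
mkdir 'abc def'
echo img > 'abc def/xyz pqr.gif'
# A saved page linking to the encoded form misses the real file:
[ -e 'abc%20def/xyz pqr.gif' ] && link_ok=yes || link_ok=no
[ -e 'abc def/xyz pqr.gif' ]   && file_ok=yes || file_ok=no
printf 'link resolves: %s, file exists: %s\n' "$link_ok" "$file_ok"
```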
It seems that wget uses a signed 32 bit value for the content-length in HTTP. I
haven't looked at the code, but it appears that this is what is happening.
The problem is that when a file larger than about 2 GB is downloaded, wget
reports negative numbers for its size and quits the download
I got a crash in wget downloading a large iso file (2.4 GB)
newdeal:/pub/isos# wget -c
ftp://ftp.belnet.be/linux/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso
--09:22:17--
ftp://ftp.belnet.be/linux/fedora/linux/core/3/i386/iso/FC3-i386-DVD.iso
=> `FC3-i386-DVD.iso'
Resolving
Hello!
I am very pleased to use wget to crawl pages. It is an excellent tool.
Recently I found a bug in using wget, although I am not sure whether it's a bug
or an incorrect usage. I just want to report it here.
When I use wget to mirror or recursively download a web site with -O
option, I
Hello,
Probably I am just too lazy, haven't spent enough time to read the man, and
wget can actually do exactly what I want.
If so -- I do apologize for taking your time.
Otherwise: THANKS for your time!..:-).
My problem is:
redirects.
I am trying to catch them by using, say, netcat
On Wed, 21 Jan 2004 23:07:30 -0800, you wrote:
Hello,
I think I've come across a little bug in wget when using it to get a file
via ftp.
I did not specify the passive option, yet it appears to have been used
anyway Here's a short transcript:
Passive FTP can be specified in /etc/wgetrc or
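For reference, the setting the reply is pointing at, as a sketch of a wgetrc fragment (either file works; the per-user file overrides the system-wide one):

```
# /etc/wgetrc (system-wide) or ~/.wgetrc (per-user)
passive_ftp = on     # set to 'off' to force active-mode FTP
```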
Hello.
Problem: When downloading all in
http://udn.epicgames.com/Technical/MyFirstHUD
wget overwrites the downloaded MyFirstHUD file with
MyFirstHUD directory (which comes later).
GNU Wget 1.9.1
wget -k --proxy=off -e robots=off --passive-ftp -q -r -l 0 -np -U Mozilla $@
Solution: Use of -E
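-E (--html-extension in the manuals of that era) makes wget save HTML pages with an .html suffix, so the page MyFirstHUD no longer shares a name with the MyFirstHUD/ directory created later. The collision itself can be reproduced locally (simulation only, no wget):

```shell
d=$(mktemp -d)
cd "$d"
echo page > MyFirstHUD                       # page saved under the bare name
mkdir MyFirstHUD 2>/dev/null || clash=yes    # same-named directory collides
rm MyFirstHUD
echo page > MyFirstHUD.html                  # with -E the page gets a suffix ...
mkdir MyFirstHUD && coexist=yes              # ... and the directory fits alongside
printf 'clash=%s coexist=%s\n' "$clash" "$coexist"
```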
I sent this message to [EMAIL PROTECTED] as directed in the wget man page, but it
bounced and said to try this email address.
This bug report is for GNU Wget 1.8.2 tested on both RedHat Linux 7.3 and 9
rpm -q wget
wget-1.8.2-9
When I use a wget with the -S to show the http headers, and I use
Hi,
While downloading a file of about 3,234,550,172 bytes with
wget "http://foo/foo.mpg" I get an error:
HTTP request sent, awaiting response... 200 OK
Length: unspecified [video/mpeg]
[ = ] -1,060,417,124 13.10M/s
don [EMAIL PROTECTED] writes:
I did not specify the passive option, yet it appears to have been used
anyway Here's a short transcript:
[EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip
--21:05:21-- ftp://musicm.mcgill.ca/sim390/sim390dm.zip
Kairos [EMAIL PROTECTED] writes:
$ cat wget.exe.stackdump
[...]
What were you doing with Wget when it crashed? Which version of Wget
are you running? Was it compiled for Cygwin or natively for Windows?
$ cat wget.exe.stackdump
Exception: STATUS_ACCESS_VIOLATION at eip=77F51BAA
eax= ebx= ecx=0700 edx=610CFE18 esi=610CFE08 edi=
ebp=0022F7C0 esp=0022F74C program=C:\nonspc\cygwin\bin\wget.exe
cs=001B ds=0023 es=0023 fs=0038 gs= ss=0023
Stack trace:
Frame Function
Here is debug output
:/FTPD# wget ftp://ftp.dcn-asu.ru/pub/windows/update/winxp/xpsp2-1224.exe -d
DEBUG output created by Wget 1.8.1 on linux-gnu.
--13:25:55--
The problem is that the server replies with login incorrect, which
normally means that authorization has failed and that further retries
would be pointless. Other than having a natural language parser
built-in, Wget cannot know that the authorization is in fact correct,
but that the server
Kempston [EMAIL PROTECTED] writes:
Yeah, I understand that, but lftp handles it fine even without
specifying any additional option ;)
But then lftp is hammering servers when real unauthorized entry
occurs, no?
I`m sure you can work something out
Well, I'm satisfied with what Wget does now.
jayme [EMAIL PROTECTED] writes:
[...]
Before anything else, note that the patch originally written for 1.8.2
will need change for 1.9. The change is not hard to make, but it's
still needed.
The patch didn't make it to canonical sources because it assumes `long
long', which is not available on
I tried the patch from Debian bug report 137989 and it didn't work. Can anybody explain:
1 - why I have to make two directories for the patch to work: one wget-1.8.2.orig and one
wget-1.8.2?
2 - why after compilation wget still can't download files over 2GB?
note : I cut the patch for debian use ( the first
It's probably a bug:
bug: when downloading
wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg,
wget saves it as-is, but when downloading
wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg
--
The human knowledge belongs to the world
Hi Jack :)
* Jack Pavlovsky [EMAIL PROTECTED] dixit:
It's probably a bug:
bug: when downloading
wget -mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg,
wget saves it as-is, but when downloading
wget ftp://somehost.org/somepath/3*, wget saves the files as
3acv14%7Eanivcd.mpg
Jack Pavlovsky [EMAIL PROTECTED] writes:
It's probably a bug: bug: when downloading wget -mirror
ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is,
but when downloading wget ftp://somehost.org/somepath/3*, wget saves
the files as 3acv14%7Eanivcd.mpg
Thanks for the report.
Dear Sir:
I tried to use "wget" to download data from an ftp site but got an error
message as follows:
> wget ftp://ftp.ngdc.noaa.gov/pub/incoming/RGON/anc_1m.OCT
Screen shows:
Using wget 1.8.2:
$ wget --page-requisites http://news.com.com
...fails to retrieve most of the files that are required to properly
render the HTML document, because they are forbidden by
http://news.com.com/robots.txt .
I think that use of --page-requisites implies that wget is being used
Funk Gabor wrote:
HTTP does not provide a dirlist command, so wget parses html to find
other files it should download. Note: HTML not XML. I suspect that
is the problem.
If wget wouldn't download the rest, I'd say that too. But first the dir
gets created, the xml is downloaded (in some other
It seems wget uses a 32 bit integer for the bytes downloaded:
[...]
FINISHED --17:11:26--
Downloaded: 1,047,520,341 bytes in 5830 files
cave /home/suse8.0# du -s
5230588 .
cave /home/suse8.0#
As it's a once per download variable I'd say it's not that performance
critical...
Hi, I have a problem and would
really like you to help me. I'm using wget for downloading a list of file
urls via http proxy. When the proxy server goes
offline, wget doesn't retry downloading of files. Can you fix that, or can you
tell me how I can fix it?
:15003/Dragon => `dragon.004'
Connecting to 195.108.41.140:3128... failed: Connection
refused.
FINISHED
--01:19:23--
Downloaded: 150,000,000 bytes in 10 files
- Original Message -
From:
Kempston
To: [EMAIL PROTECTED]
Sent: Monday, July 08, 2002 12:50
AM
Subject: WGET BUG
I'm afraid that downloading files larger than 2G is not supported by
Wget at the moment.
fbsd1 --- http wget eshop.tar (3.3G) --- fbsd2
command was:
# wget http://kamenica/eshop.tar
at the second GB, I got the following:
2097050K .. .. .. .. .. 431.03 KB/s
2097100K .. .. .. .. ..8.14 MB/s
2097150K
On Monday 18 February 2002 17:52, you wrote:
That would be great. The prob is that I'm using it to retrieve files mostly
on servers that have too many users. No, I don't want to hammer the
server, but I do want to keep on trying with reasonable intervals until I get
the file.
I think the
[The message I'm replying to was sent to [EMAIL PROTECTED]. I'm
continuing the thread on [EMAIL PROTECTED] as there is no bug and
I'm turning it into a discussion about features.]
On 18 Feb 2002 at 15:14, TD - Sales International Holland B.V. wrote:
I've tried -w 30
--waitretry=30
--wait=30
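For reference, the three options do different things; a sketch of the equivalent wgetrc settings (values illustrative, taken from the flags tried above):

```
tries = 0          # 0 means retry each file indefinitely
wait = 30          # pause 30 s between successive retrievals
waitretry = 30     # back off up to 30 s between retries of the same file
```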
[ Please mail bug reports to [EMAIL PROTECTED], not to me directly. ]
Nuno Ponte [EMAIL PROTECTED] writes:
I get a segmentation fault when invoking:
wget -r
http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html
My Wget version is 1.7-3, the one which is
Hi,
When I try to send a page to a Nextel mobile using the following command from a unix box,
"wget http://www.nextel.com/cgi-bin/sendPage.cgi?to01=4157160856%26message=hellothere%26action=send"
The wget returns the following message but the page is not reaching the phone.
"--15:59:16--
Dear sir.
When I put to my browser (NN 3) the line
http://find.infoart.ru/cgi-bin/yhs.pl?hidden=http%3A%2F%2F194.67.26.82&word=FreeBSD
wget works correctly.
When I put this line to wget, wget changes this line;
argument hidden is http:/194.67.26.82word,
argument word is empty. Where am I wrong?
Hack Kampbjørn [EMAIL PROTECTED] writes:
You have hit one of Wget features, it is overzealous in converting
URLs into canonical form. As you have discovered Wget first converts
all encoded characters back to their real value and then encodes all
those that are unsafe sending in URLs.
It's a
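The decode step being described can be illustrated in plain shell (the sed script only handles the two escapes in this particular URL; it is not wget's actual code):

```shell
# '%3A' decodes to ':' and '%2F' to '/'; a later re-encoding pass
# would then escape only characters wget deems unsafe in URLs.
enc='http%3A%2F%2F194.67.26.82'
dec=$(printf '%s' "$enc" | sed 's/%3A/:/g; s,%2F,/,g')
printf '%s\n' "$dec"
```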
Hello,
I am using wget to invoke a CGI script call, while passing it several
variables. For example:
wget -O myfile.txt
"http://user:[EMAIL PROTECTED]/myscript.cgi?COLOR=blueSHAPE=circle"
where myscript.cgi say, makes an image based on the parameters "COLOR" and
"SHAPE". The problem I am
Hello,
I've found a (less important) bug in wget. I was downloading
a file from an FTP server and the control connection of the FTP service
was closed by the server. After that wget started to print incorrectly
progress information (beyond 100%).
The log follows:
Which version of wget do you use? Are you aware that wget 1.6 has been
released and 1.7 is in development (and they contain a workaround for the
"Lying FTP server syndrome" you are seeing)?
--
Csaba Rduly, Software Engineer Sophos Anti-Virus
email: [EMAIL PROTECTED]