Re: [Req #2433] wget project at sunsite.dk
On Wed, 18 Dec 2002 [EMAIL PROTECTED] wrote:

> "Reply to questions and help out" cannot get a wget release out the
> door. My last email simply asked for a change in the way wget gets
> updated and released, whether that is one person or a multi-person
> "wget team".

Now, let's not be totally ignorant here, please. Since the team is only one person and that single person is away, it will be hard to get help from the team. If the team consisted of more people, it would be less likely that the whole team would be away at the same time.

I did not complain about your behaviour or anything; I was only mentioning the need for a wget team bigger than one person. And I've done this before. Multiple times.

-- 
Daniel Stenberg - http://daniel.haxx.se - +46-705-44 31 77
 ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol
Downloaded filenames incorrect when they contain '[' and ']'
I believe I've found a bug in Wget. (Then again, perhaps I'm just being ignorant of something important.) This bug ONLY occurs in FTP downloads; HTTP works fine.

Bug description: downloading a file whose name contains a '[' or ']' will succeed, BUT the local name of the file will be incorrect after the download has occurred. The local name will contain URL encoding of these two "wildcard" characters.

In my case, I'm downloading files with unusual names like '2002-49.[08-Dec].1.mbm' (don't ask!). If this file is downloaded with the command:

    wget --glob=off "ftp://someftp.com/2002-49.[08-Dec].1.mbm"

it works, BUT the name of the local file is '2002-49.@5B08-Dec@5D.1.mbm'. You'll notice that the '[' and ']' characters in the original file name have been replaced with '@5B' and '@5D' respectively.

I did some quick experiments with file names containing other wildcard characters, like "2000-53.?31" and "2000-53.*31", and found the downloads failed completely.

Apart from using HTTP, which would be a pain, is there any obvious workaround? And, no, I can't change the file names. :-)

Many thanks for Wget. It's great.

-Mike.
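One workaround that might do the trick (an untested sketch, reusing the example URL above): give the local name explicitly with wget's -O/--output-document option, which bypasses the automatic encoding of the local file name:

    wget --glob=off -O '2002-49.[08-Dec].1.mbm' \
        "ftp://someftp.com/2002-49.[08-Dec].1.mbm"

Alternatively, simply rename the file once the download has finished.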
Re: Problem with large file (> 2 GB) transfer
On Wed, 2002-12-18 at 17:17, [EMAIL PROTECTED] wrote:

> I have a problem with wget, and hope that you'll be able to help me:
>
> I have to download biological DBs on a weekly basis, and I use wget on
> a PC Linux system. The problem is that one of the DBs has recently
> grown to a size of... 2186384957 bytes!
> (cf. ftp://ftp.ncbi.nih.gov/blast/db/nt.Z)
>
> I noticed an error during the download. In fact, the file was correctly
> downloaded, but wget returned an error status, and the progress status
> showed a negative progress (from 0 to -101%). The header of the log
> file was as follows:
>
> --16:25:38--  ftp://ftp.ncbi.nih.gov/%2Fblast/db/nt.Z
>            => `nt.Z'
> ==> CWD not required.
> ==> PORT ... done.    ==> RETR nt.Z ... done.
> Length: -2,108,582,339
>
> I use the latest version (1.8.2), which I tried to compile with the
> 'large file' option (-D_FILE_OFFSET_BITS=64), without any success...

wget 1.8.2 can't handle files larger than 2 GB. Please try the patch from this URL:

http://bugs.debian.org/137989

I added it to the Debian wget package and haven't received any mail saying it doesn't work.

-- 
Noèl Köthe
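The numbers in the log are consistent with a 32-bit signed overflow: the reported length is the true size minus 2^32. A quick sanity check (assuming a shell whose $(( )) arithmetic is 64-bit, e.g. a recent bash):

    $ echo $(( 2186384957 - 4294967296 ))
    -2108582339

which matches the "Length: -2,108,582,339" in the log exactly.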
Re: [Req #2433] wget project at sunsite.dk
"Reply to questions and help out" cannot get wget release to anywhere else. My last email simply asked for a change in the way wget gets updated and released, no matter it is one person or multiple people "wget team". On Wed, 18 Dec 2002, Daniel Stenberg wrote: > On Tue, 17 Dec 2002 [EMAIL PROTECTED] wrote: > > > I felt the wget devel team wasn't active since 1.82 release. I have sent > > two patches without getting reply or anything from the wget devel team. For > > an important software such as wget, the current situation needs remedy. > > The "wget team" you're talking about is one single person: Hrvoje. > > There are many people subscribed to the general wget mailing list and they > repeatedly and frequently reply to questions and help out. > > It has been said and could be said again: the "wget team" should be extended > to consist of more people to help Hrvoje out during periods when he can't > donate as much of his time as wget needs. > > Just my own opinions of course. > > -- > Daniel Stenberg - http://daniel.haxx.se - +46-705-44 31 77 >ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol >
Re: [Req #2433] wget project at sunsite.dk
> "Esben" == Esben Haabendal Soerensen <[EMAIL PROTECTED]> writes: Esben> I should be removed from that list, so sorry :( I was there for test purposes and initial import - I'm not approved to make changes and should also be removed... Karsten
Problem with large file (> 2 GB) transfer
Hi,

I have a problem with wget, and hope that you'll be able to help me:

I have to download biological DBs on a weekly basis, and I use wget on a PC Linux system. The problem is that one of the DBs has recently grown to a size of... 2186384957 bytes!
(cf. ftp://ftp.ncbi.nih.gov/blast/db/nt.Z)

I noticed an error during the download. In fact, the file was correctly downloaded, but wget returned an error status, and the progress status showed a negative progress (from 0 to -101%). The header of the log file was as follows:

    --16:25:38--  ftp://ftp.ncbi.nih.gov/%2Fblast/db/nt.Z
               => `nt.Z'
    ==> CWD not required.
    ==> PORT ... done.    ==> RETR nt.Z ... done.
    Length: -2,108,582,339

I use the latest version (1.8.2), which I tried to compile with the 'large file' option (-D_FILE_OFFSET_BITS=64), without any success...

Thanks for any help you can give me. Congratulations on wget: really nice tool!

T. Vermat

Thierry Vermat                    Bioinformatics
GENOME Express
11 Chemin des Pres                tel: 04.56.38.11.14
38944 Meylan - France             fax: 04.56.38.11.00
http://www.genomex.com            e-mail: [EMAIL PROTECTED]
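For reference, the usual way to request large-file support at build time looks something like this (a sketch only; as the reply elsewhere in this thread notes, wget 1.8.2 reportedly stores sizes in a plain C long internally, so these flags by themselves are not enough):

    CFLAGS="-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE" ./configure
    make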
A suggestion for `man wget'
Hello,

This is not a bug, but could you please add to the manual, after the sentence "The proxy is on by default if the appropriate environmental variable is defined.", that this variable is called "http_proxy"? It is not easy to guess.

Yours,
U. Elias
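For example, in a Bourne-compatible shell the variable can be set like this before running wget (proxy.example.com:8080 stands in for a real proxy address):

    http_proxy=http://proxy.example.com:8080/
    export http_proxy
    wget http://www.gnu.org/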