Re: Big files

2008-09-24 Thread Michelle Konzack
On 2008-09-16 15:22:22, Cristián Serpell wrote: It is the latest Ubuntu distribution that still comes with the old version. Ehm, even Debian Etch comes with: [EMAIL PROTECTED]:~] apt-cache policy wget wget: Installed: 1.10.2-2 Candidate: 1.10.2-2 Version table: ***

Re: Big files

2008-09-24 Thread Michelle Konzack
There must be another bug, since I can download small (:-) 18 GByte of archive files... Debian Etch: [EMAIL PROTECTED]:~] apt-cache policy wget wget: Installed: 1.10.2-2 Candidate: 1.10.2-2 Version table: *** 1.10.2-2 0 500 file: etch/main Packages 100

Re: Big files

2008-09-24 Thread Michelle Konzack
On 2008-09-16 12:52:16, Tony Lewis wrote: Cristián Serpell wrote: Maybe I should have started with this (I had to change the name of the file shown): [snip] ---response begin--- HTTP/1.1 200 OK Date: Tue, 16 Sep 2008 19:37:46 GMT Server: Apache Last-Modified: Tue, 08 Apr 2008

Big files

2008-09-16 Thread Cristián Serpell
? Is there an option for downloading big files? In this case, I used curl. Please CC replies; I'm not a subscriber. Thanks! C S

Re: Big files

2008-09-16 Thread Doruk Fisek
Tue, 16 Sep 2008 11:19:50 -0400, Cristián Serpell [EMAIL PROTECTED] : I would like to know if there is a reason for using a signed int for the length of the files to download. The thing is that I was trying to download a 2.3 GB file using wget, but then the length was printed as a negative
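
The negative length described above is the classic 32-bit overflow. A minimal sketch in C (not wget's actual code; the 2.3 GB figure is only an example value) of what happens when a size above 2^31-1 bytes is pushed through a signed 32-bit int:

    /* Sketch only, not wget's code: a length above 2^31-1 bytes stored
     * in a signed 32-bit int wraps around and prints as negative. */
    #include <stdio.h>

    int main(void)
    {
        long long real_length = 2469606195LL; /* ~2.3 GB, hypothetical value */
        int as_int = (int) real_length;       /* truncated to 32 signed bits */

        printf("actual length: %lld bytes\n", real_length);
        printf("as signed int: %d bytes\n", as_int); /* negative on typical platforms */
        return 0;
    }

On a typical two's-complement system the second line prints -1825361101, which is exactly the kind of negative size the report describes.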

RE: Big files

2008-09-16 Thread Tony Lewis
Cristián Serpell wrote: I would like to know if there is a reason for using a signed int for the length of the files to download. I would like to know why people still complain about bugs that were fixed three years ago. (More accurately, it was a design flaw that originated from a time when

Re: Big files

2008-09-16 Thread Cristián Serpell
It is the latest Ubuntu distribution that still comes with the old version. Thanks anyway, that was the problem. On 16-09-2008, at 15:08, Tony Lewis wrote: Cristián Serpell wrote: I would like to know if there is a reason for using a signed int for the length of the files to

Re: Big files

2008-09-16 Thread Micah Cowan
Cristián Serpell wrote: It is the latest Ubuntu distribution that still comes with the old version. Thanks anyway, that was the problem. I know that's untrue. Ubuntu comes with 1.10.2 at least, and has for quite some time. If you're using

Re: Big files

2008-09-16 Thread Cristián Serpell
Maybe I should have started with this (I had to change the name of the file shown): [EMAIL PROTECTED]:/tmp# wget --version GNU Wget 1.10.2 Copyright (C) 2005 Free Software Foundation, Inc. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the

RE: Big files

2008-09-16 Thread Tony Lewis
Cristián Serpell wrote: Maybe I should have started with this (I had to change the name of the file shown): [snip] ---response begin--- HTTP/1.1 200 OK Date: Tue, 16 Sep 2008 19:37:46 GMT Server: Apache Last-Modified: Tue, 08 Apr 2008 20:17:51 GMT ETag: 7f710a-8a8e1bf7-47fbd2ef

wget always downloads big files even if already present

2006-06-20 Thread Tony Schreiner
Hi I use wget to mirror some sites, including: ftp://ftp.ncbi.nih.gov/blast/db/FASTA I'm using the CentOS 4/RHEL 4 wget version 1.10.2-0.40E I am finding that big files get downloaded each time even if they are already present, older and of the same size. I think I can trace the problem
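
The report is cut off before its own diagnosis, so the following is only a generic illustration, not a claim about what this 1.10.2-0.40E build actually does: if a remote file size is forced through a 32-bit integer anywhere in the mirror comparison, a file over 2 GiB can never match the size already on disk, so it looks changed and is fetched again. Hypothetical values throughout:

    /* Illustration with made-up sizes, not wget's actual comparison logic:
     * a remote size truncated to 32 bits never equals the real local size. */
    #include <stdio.h>

    int main(void)
    {
        long long local_size  = 3000000000LL;       /* file already on disk */
        int       remote_size = (int) 3000000000LL; /* same file, size truncated */

        if ((long long) remote_size != local_size)
            printf("sizes \"differ\" (%d vs %lld): file would be fetched again\n",
                   remote_size, local_size);
        return 0;
    }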

--timestamping and big files?

2005-05-28 Thread Dan Bolser
I think --timestamping fails for files > 2Gb: wget tries to download the file again with the .1 extension (as if you were not using --timestamping). This only happens to a big file in a list of files I am wgetting.

Re: --timestamping and big files?

2005-05-28 Thread Hrvoje Niksic
Dan Bolser [EMAIL PROTECTED] writes: I think --timestamping fails for files > 2Gb Thanks for the report. Wget 1.9.x doesn't support 2+GB files, not only for timestamping. You can try Wget 1.10-beta from ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2
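
The fix that arrived with the 1.10 series is large-file support (LFS): file sizes are handled as 64-bit off_t values instead of 32-bit types. A rough sketch of the idea (the filename is made up, and real builds normally get the define from the configure machinery rather than the source file):

    /* Sketch of the LFS idea: with 64-bit file offsets, stat() can report
     * sizes above 2 GiB even on a 32-bit system.  Filename is hypothetical. */
    #define _FILE_OFFSET_BITS 64  /* usually set by the build system */

    #include <stdio.h>
    #include <sys/stat.h>

    int main(void)
    {
        struct stat st;

        if (stat("huge-download.iso", &st) != 0) {
            perror("stat");
            return 1;
        }
        /* st_size is a 64-bit off_t here, so a 4 GB ISO is reported correctly */
        printf("size: %lld bytes\n", (long long) st.st_size);
        return 0;
    }

Without that define, a 32-bit build's stat() simply fails with EOVERFLOW on a file larger than 2 GiB, which is one reason pre-LFS tools cannot even see such files correctly.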

Re: --timestamping and big files?

2005-05-28 Thread Dan Bolser
On Sat, 28 May 2005, Hrvoje Niksic wrote: Dan Bolser [EMAIL PROTECTED] writes: I think --timestamping fails for files > 2Gb Thanks for the report. Wget 1.9.x doesn't support 2+GB files, not only for timestamping. You can try Wget 1.10-beta from

bug while handling big files

2004-12-24 Thread Leonid
Hi, Simone, Santa put a patch for you in http://software.lpetrov.net/wget-LFS/ Unwrap carefully and enjoy. Merry Christmas, Leonid 24-DEC-2004 21:02:03

bug while handling big files

2004-12-23 Thread Simone Bastianello
Hello. I was retrieving this iso: ftp://ftp.slackware.no/pub/linux/ISO-images/Slackware/Current-ISO-build/slackware-10.0-DVD.iso I killed wget and then I resumed it with wget -c (file was downloaded for 2285260288 bytes) here's the output: --19:31:47--
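
For reference, 2285260288 is just past the 32-bit signed limit: 2^31 - 1 = 2147483647, and interpreting 2285260288 as a signed 32-bit value gives 2285260288 - 2^32 = -2009707008. So on a wget built without large-file support (plausible for a 2004 build), the resume offset used by -c would already have wrapped negative, which fits a failure showing up right around the 2 GiB mark.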