Re: Big files

2008-09-24 Thread Michelle Konzack
On 2008-09-16 15:22:22, Cristián Serpell wrote:
 It is the latest Ubuntu distribution, which still comes with the old
 version.

Ehm, even Debian Etch comes with:

[EMAIL PROTECTED]:~] apt-cache policy wget
wget:
  Installed: 1.10.2-2
  Candidate: 1.10.2-2
  Version table:
 *** 1.10.2-2 0
500 file: etch/main Packages
100 /var/lib/dpkg/status

So AFAIK Ubuntu uses the latest version, which is 1.11...

Thanks, Greetings and nice Day/Evening
Michelle Konzack
Systemadministrator
24V Electronic Engineer
Tamay Dogan Network
Debian GNU/Linux Consultant


-- 
Linux-User #280138 with the Linux Counter, http://counter.li.org/
# Debian GNU/Linux Consultant #
Michelle Konzack   Apt. 917  ICQ #328449886
+49/177/935194750, rue de Soultz MSN LinuxMichi
+33/6/61925193 67100 Strasbourg/France   IRC #Debian (irc.icq.com)




Re: Big files

2008-09-24 Thread Michelle Konzack
There must be another bug, since I can download (small :-) 18 GByte
archive files... on Debian Etch:

[EMAIL PROTECTED]:~] apt-cache policy wget
wget:
  Installed: 1.10.2-2
  Candidate: 1.10.2-2
  Version table:
 *** 1.10.2-2 0
500 file: etch/main Packages
100 /var/lib/dpkg/status

Thanks, Greetings and nice Day/Evening
Michelle Konzack
Systemadministrator
24V Electronic Engineer
Tamay Dogan Network
Debian GNU/Linux Consultant




Re: Big files

2008-09-24 Thread Michelle Konzack
On 2008-09-16 12:52:16, Tony Lewis wrote:
 Cristián Serpell wrote:
 
 Maybe I should have started with this (I had to change the name of the
 file shown):
 [snip]
  ---response begin---
  HTTP/1.1 200 OK
  Date: Tue, 16 Sep 2008 19:37:46 GMT
  Server: Apache
  Last-Modified: Tue, 08 Apr 2008 20:17:51 GMT
  ETag: 7f710a-8a8e1bf7-47fbd2ef
  Accept-Ranges: bytes
  Content-Length: -1970398217

Interesting headers, since here I get

  HTTP/1.1 200 OK
  Date: Mon, 22 Sep 2008 21:58:11 GMT
  Server: Apache/2.2.3 (Debian) PHP/5.2.0-8+etch10
  X-Powered-By: PHP/5.2.0-8+etch10

which means that server is running the old, crappy Apache 1.3.
 
 The problem is not with wget. It's with the Apache server, which told wget
 that the file had a negative length.

Because it is the old Apache.

Thanks, Greetings and nice Day/Evening
Michelle Konzack
Systemadministrator
24V Electronic Engineer
Tamay Dogan Network
Debian GNU/Linux Consultant




Big files

2008-09-16 Thread Cristián Serpell

Hi

I would like to know if there is a reason for using a signed int for
the length of the files to download. The thing is that I was trying to
download a 2.3 GB file using wget, but the length was printed as a
negative number and wget said "Aborted". Is it a bug or a design
decision? Is there an option for downloading big files? In this case,
I used curl instead.


Please CC replies, I'm not a subscriber.

Thanks!
C S


Re: Big files

2008-09-16 Thread Doruk Fisek
Tue, 16 Sep 2008 11:19:50 -0400, Cristián Serpell
[EMAIL PROTECTED] :

 I would like to know if there is a reason for using a signed int for  
 the length of the files to download. The thing is that I was trying
 to download a 2.3 GB file using wget, but then the length was printed
 as a negative number and wget said Aborted. Is it a bug or a
 design decision?
Which version of wget are you using? It was a bug in older wget
versions. You can check with the output of the wget --version command
(the latest version is 1.11.4).

I'm not having any trouble with downloading files bigger than 2G.

   Doruk

--
FISEK INSTITUTE - http://www.fisek.org.tr


RE: Big files

2008-09-16 Thread Tony Lewis
Cristián Serpell wrote:

 I would like to know if there is a reason for using a signed int for  
 the length of the files to download.

I would like to know why people still complain about bugs that were fixed
three years ago. (More accurately, it was a design flaw that originated from
a time when no computer OS supported files that big, but regardless of what
you call it, the change to wget was made to version 1.10 in 2005.)

Tony




Re: Big files

2008-09-16 Thread Cristián Serpell
It is the latest Ubuntu distribution, which still comes with the old
version.


Thanks anyway, that was the problem.

On 16-09-2008, at 15:08, Tony Lewis wrote:


Cristián Serpell wrote:

I would like to know if there is a reason for using a signed int for
the length of the files to download.

I would like to know why people still complain about bugs that were
fixed three years ago. (More accurately, it was a design flaw that
originated from a time when no computer OS supported files that big,
but regardless of what you call it, the change to wget was made to
version 1.10 in 2005.)

Tony






Re: Big files

2008-09-16 Thread Micah Cowan

Cristián Serpell wrote:
 It is the latest Ubuntu distribution, which still comes with the old
 version.
 
 Thanks anyway, that was the problem.

I know that's untrue. Ubuntu comes with 1.10.2 at least, and has for
quite some time. If you're using that, then it's probably a different
bug than Doruk and Tony were thinking of (perhaps one of the cases of
content-length mishandling that were recently fixed in the 1.11.x series).

IIRC Intrepid Ibex (Ubuntu 8.10) will have 1.11.4.

--
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer.
GNU Maintainer: wget, screen, teseq
http://micah.cowan.name/


Re: Big files

2008-09-16 Thread Cristián Serpell
Maybe I should have started with this (I had to change the name of the
file shown):


[EMAIL PROTECTED]:/tmp# wget --version
GNU Wget 1.10.2

Copyright (C) 2005 Free Software Foundation, Inc.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

Originally written by Hrvoje Niksic [EMAIL PROTECTED].

[EMAIL PROTECTED]:/tmp# wget --debug http://program-linux64.tar.bz2
DEBUG output created by Wget 1.10.2 on linux-gnu.

--15:37:42--  http://program-linux64.tar.bz2
   => `program.tar.bz2'
Resolving www.ai.sri.com... 130.107.65.215
Caching www.ai.sri.com => 130.107.65.215
Connecting to www.ai.sri.com|130.107.65.215|:80... connected.
Created socket 3.
Releasing 0x0064a100 (new refcount 1).

---request begin---
GET /program-linux64.tar.bz2 HTTP/1.0
User-Agent: Wget/1.10.2
Accept: */*
Host: www.ai.sri.com
Connection: Keep-Alive

---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.1 200 OK
Date: Tue, 16 Sep 2008 19:37:46 GMT
Server: Apache
Last-Modified: Tue, 08 Apr 2008 20:17:51 GMT
ETag: 7f710a-8a8e1bf7-47fbd2ef
Accept-Ranges: bytes
Content-Length: -1970398217
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: application/x-tar

---response end---
200 OK
Registered socket 3 for persistent reuse.
Length: -1,970,398,217 [application/x-tar]

[ =]  0  --.--K/s


Aborted

On 16-09-2008, at 15:32, Micah Cowan wrote:



Cristián Serpell wrote:

It is the latest Ubuntu distribution, which still comes with the old
version.

Thanks anyway, that was the problem.

I know that's untrue. Ubuntu comes with 1.10.2 at least, and has for
quite some time. If you're using that, then it's probably a different
bug than Doruk and Tony were thinking of (perhaps one of the cases of
content-length mishandling that were recently fixed in the 1.11.x
series).

IIRC Intrepid Ibex (Ubuntu 8.10) will have 1.11.4.

- --
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer.
GNU Maintainer: wget, screen, teseq
http://micah.cowan.name/





RE: Big files

2008-09-16 Thread Tony Lewis
Cristián Serpell wrote:

 Maybe I should have started with this (I had to change the name of the
 file shown):
[snip]
 ---response begin---
 HTTP/1.1 200 OK
 Date: Tue, 16 Sep 2008 19:37:46 GMT
 Server: Apache
 Last-Modified: Tue, 08 Apr 2008 20:17:51 GMT
 ETag: 7f710a-8a8e1bf7-47fbd2ef
 Accept-Ranges: bytes
 Content-Length: -1970398217

The problem is not with wget. It's with the Apache server, which told wget
that the file had a negative length.

Tony



wget always downloads big files even if already present

2006-06-20 Thread Tony Schreiner

Hi

I use wget to mirror some sites, including:
ftp://ftp.ncbi.nih.gov/blast/db/FASTA


I'm using the CentOS 4/RHEL 4 wget version 1.10.2-0.40E

I am finding that big files get downloaded each time, even if they are
already present, older, and of the same size. I think I can trace the
problem to the parsing in ftp-ls.c.


When it looks for the file size, at line 968

char *t = ptok;
...


in the case where the file name is big, the backing-up procedure backs
up all the way to the previous NUL in the token, because the FTP
listing has only a single space between the owner and the file size.
So then the statement


size = str_to_wgint(t, NULL, 10)

is trying to convert an empty string. All that needs to be done, I
think, is to add

t++;

before
size = str_to_wgint(t, NULL, 10);

See if you agree.
Thanks

Tony Schreiner
Biology Department
Boston College
[EMAIL PROTECTED]


--timestamping and big files?

2005-05-28 Thread Dan Bolser

I think --timestamping fails for files > 2 GB

wget tries to download the file again with the .1 extension (as if you
were not using --timestamping).

This only happens to a big file in a list of files I am wgetting.



Re: --timestamping and big files?

2005-05-28 Thread Hrvoje Niksic
Dan Bolser [EMAIL PROTECTED] writes:

 I think --timestamping fails for files > 2 GB

Thanks for the report.  Wget 1.9.x doesn't support 2+GB files, not
only for timestamping.  You can try Wget 1.10-beta from
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2


Re: --timestamping and big files?

2005-05-28 Thread Dan Bolser
On Sat, 28 May 2005, Hrvoje Niksic wrote:

Dan Bolser [EMAIL PROTECTED] writes:

 I think --timestamping fails for files > 2 GB

Thanks for the report.  Wget 1.9.x doesn't support 2+GB files, not
only for timestamping.  You can try Wget 1.10-beta from
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2


Sorry for the lack of details, I sent the email before I realized I didn't
even give you the version...

Thanks for the reply,
Dan.



bug while handling big files

2004-12-24 Thread Leonid
Hi, Simone,
  Santa put a patch for you in http://software.lpetrov.net/wget-LFS/
Unwrap carefully and enjoy. Merry Christmas,
Leonid
24-DEC-2004 21:02:03


bug while handling big files

2004-12-23 Thread Simone Bastianello
Hello.
I was retrieving this iso:
ftp://ftp.slackware.no/pub/linux/ISO-images/Slackware/Current-ISO-build/slackware-10.0-DVD.iso
I killed wget and then resumed it with wget -c (the file had been
downloaded to 2285260288 bytes);
here's the output:

--19:31:47--  
ftp://ftp.slackware.no/pub/linux/ISO-images/Slackware/Current-ISO-build/slackware-10.0-DVD.iso
  => `slackware-10.0-DVD.iso'
Resolving ftp.slackware.no... 158.36.2.10
Connecting to ftp.slackware.no|158.36.2.10|:21... connected.
Logging in as anonymous ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> CWD /pub/linux/ISO-images/Slackware/Current-ISO-build ... done.
==> SIZE slackware-10.0-DVD.iso ... done.
==> PORT ... done.    ==> REST -2009707008 ...
REST failed, starting from scratch.

== RETR slackware-10.0-DVD.iso ... done.
Length: -1,027,217,408 [982,489,600 to go] (unauthoritative)
   0K                          0%    0.00 B/s

slackware-10.0-DVD.iso: Bad address, closing control connection.
Note also that while downloading I get a negative percentage and a
negative length.
Hope to hear from you soon,
Simone Bastianello