On Tue, 5 Sep 2000, it was written:
> Although strictly NOT a Red Hat "problem", hopefully fellow Red Hat users
> might be able to assist (since they may be affected). At two different
> Internet locations I am having difficulty getting FTP downloads of a RH
> ISO image to complete (zoot-i386.iso: 671,881,216 bytes). Even using
> applications such as ncftp/ws_ftp (which allow transfers to resume), data
> stops transferring at the same point in the file (Location #1: 538,883,080
> bytes; Location #2: 456,509,440 bytes). I can NOT duplicate the problem
> in my local LAN tests (they all complete), hence I am wondering whether
> there is something in the way data moves across the Internet which could
> affect FTP transfers (preventing completion)?
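Following up on my own post: since my local LAN copies completed, I can at
least check whether a stalled Internet download is a clean prefix of a good
image, or corrupt before the stall. A quick comparison sketch in Python
(the file names are just examples from my setup):

    def first_difference(partial_path, good_path, bufsize=1048576):
        """Return the first offset where the two files differ, or None
        if the partial file is a clean prefix of the good one."""
        with open(partial_path, "rb") as partial, open(good_path, "rb") as good:
            offset = 0
            while True:
                a = partial.read(bufsize)
                if not a:
                    return None    # end of partial file: clean prefix
                b = good.read(len(a))
                if a != b:
                    # Narrow the mismatch down to the exact byte.
                    for i in range(len(a)):
                        if i >= len(b) or a[i] != b[i]:
                            return offset + i
                offset += len(a)

    print(first_difference("zoot-partial.iso", "zoot-lan.iso"))

If it prints None, the stalled file is byte-for-byte identical to the start
of the good copy, so the problem really is the stall and not corruption.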
With other large files I am experiencing similar problems with downloads
failing to complete, although the "block" occurs at a different offset for
each file. Additionally, after truncating the file so that it contains just
the "missing" data, the download fails even to start (having already seen
NCFTP's "get -C filename" fail on the original file).
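For what it is worth, ncftp's "get -C" is just the FTP REST command issued
before RETR, so the stall can be probed from any offset by hand. A minimal
sketch using Python's ftplib, assuming an anonymous mirror; the host, path,
and offset below are placeholders:

    import ftplib

    HOST = "ftp.example.com"    # placeholder mirror
    PATH = "zoot-i386.iso"
    OFFSET = 538883080          # byte where the transfer stalled

    ftp = ftplib.FTP(HOST)
    ftp.login()                 # anonymous login

    with open("zoot-i386.iso", "ab") as out:
        # rest=OFFSET sends "REST <offset>" so the server starts the
        # RETR at that byte, the same thing "get -C" does internally.
        ftp.retrbinary("RETR " + PATH, out.write, blocksize=8192,
                       rest=OFFSET)
    ftp.quit()

If the transfer dies at the same absolute offset no matter where it resumes
from, that might point at the data itself rather than the connection.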
A kludge which is working successfully is FTPing UUencoded versions of the
same binary files (UUdecoding at the other end).
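For anyone who would rather script the workaround than shell out to
uuencode(1)/uudecode(1), here is a rough Python equivalent of the encoding
side, using only the standard binascii module; file names are placeholders:

    import binascii

    def uuencode_file(src_path, dst_path, name, mode="644"):
        # Emit the same framing as uuencode(1): a begin line, body
        # lines of up to 45 raw bytes each, then backtick/end.
        with open(src_path, "rb") as src, open(dst_path, "w") as dst:
            dst.write("begin %s %s\n" % (mode, name))
            while True:
                chunk = src.read(45)
                if not chunk:
                    break
                dst.write(binascii.b2a_uu(chunk).decode("ascii"))
            dst.write("`\nend\n")

    uuencode_file("zoot-i386.iso", "zoot-i386.uue", "zoot-i386.iso")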
Especially curious: the text version downloaded at almost twice the rate of
the binary one, leading me to believe there is data compression occurring
somewhere along the path. I am also fearful that it might be a fault within
that data compression which is preventing the binary downloads from
completing, i.e. specific patterns within the data stream being the
"problem". The connection is ADSL (Flashcom); does anyone know where data
compression might be occurring?
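One way to sanity-check the compression theory without knowing anything
about Flashcom's equipment: measure how compressible each representation of
the same data is. A rough sketch, with zlib standing in for whatever
modem-level compression (V.42bis or similar) the link might be doing; that
substitution is an assumption, not a known fact about the hardware:

    import zlib, binascii

    def compressed_fraction(data):
        return len(zlib.compress(data)) / float(len(data))

    with open("zoot-i386.iso", "rb") as f:
        raw = f.read(1048576)    # sample the first 1 MB of the image

    # The same sample in uuencoded form: each 45 raw bytes become a
    # 62-character printable text line (6 payload bits per byte).
    text = b"".join(binascii.b2a_uu(raw[i:i + 45])
                    for i in range(0, len(raw), 45))

    print("binary sample: %.2f of original size" % compressed_fraction(raw))
    print("uuencoded sample: %.2f of original size" % compressed_fraction(text))

If the binary sample is essentially incompressible (a ratio near 1.0, which
is what I would expect of an ISO full of compressed RPMs) while the
uuencoded text compresses well, a compressing link would explain the text
file's higher apparent throughput.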
Lawrence Houston - ([EMAIL PROTECTED])