Linux version crashes when reaching the max size limit
I was downloading a file too large for a FAT32 file system, and wget crashed. The file was 4294967295 bytes after the crash. Later I tried to resume it with -c and it didn't crash; it said that the maximum file size limit was reached and exited. This is Wget 1.10 for Linux.
Re: Question!
At 2006-11-07 02:57, Yan Qing Chen wrote: Hi wget, I found a problem when trying to mirror an FTP site with wget, using the -m -b parameters: some files are copied again on every mirror run. How should I configure mirroring of a site? Thanks, Best Regards

Hi, when the modification date reported by the web server's headers is newer than the timestamp of your local file, wget will retrieve the page again (this is correct: modified files have to be retrieved again); maybe this is the cause of your problem. If the web server reports a new modification date on every load of a page, even when nothing has changed, that is either a web server misconfiguration, an intentional configuration, or badly written dynamic pages (ASP, PHP, etc.) that don't take care of the issue. HTH, Andrea
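Andrea's explanation can be sketched with a local comparison. This is not wget's actual code, just an illustration of the timestamp check its -N/-m timestamping performs; the file names are invented, and the "remote" mtime would normally come from the server's Last-Modified header:

```shell
# Sketch of wget's timestamping decision (not wget's real code).
# A dummy file stands in for the timestamp the server reports.
touch -d '2006-01-01' local.html      # local mirror copy
touch -d '2006-06-01' remote.stamp    # pretend Last-Modified from the server
if [ remote.stamp -nt local.html ]; then
    echo "server copy is newer: re-download"
else
    echo "local copy is up to date"
fi
```

If the server always reports a fresh Last-Modified, the first branch is taken on every run, which matches the "files recopied every mirror" symptom.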
Re: linux version crashes when reaching the max size limit
From: Toni Casueps

1. "it crashed" is not a helpful description of what happened. What actually happened?

2. If the file is too large for a FAT32 file system, what would you like to happen? 4294967295 looks like 2^32-1, which (from what I've read) is the maximum size of a file on a FAT32 file system.

3. Wget 1.10.2 is the latest released version. Complaints about older versions normally lead to a suggestion to try the latest version.

Steven M. Schweda [EMAIL PROTECTED]
382 South Warwick Street  (+1) 651-699-9818
Saint Paul MN 55105-2547
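As a quick sanity check on the 2^32-1 figure: FAT32 stores a file's size in an unsigned 32-bit field, so the largest representable size works out to exactly the byte count reported above:

```shell
# Largest file size representable in an unsigned 32-bit field,
# i.e. the FAT32 per-file limit: 2^32 - 1 bytes.
echo $(( (1 << 32) - 1 ))   # prints 4294967295
```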
Re: .listing files and ftp_proxy
I realise that I may not have provided enough information to get an answer to this... I've tried this using the latest version (1.10.2) on Debian Linux 3.1. However, I've also tried a variety of earlier versions on other platforms, and it looks as though it has never worked on any platform. If anybody knows whether this is a bug or something that just can't/won't be fixed, I'd be very, very grateful for an answer. We use wget a lot, and it's just perfect for our needs. However, some of our customers are stuck behind a proxy and can't use the scripts we've developed that use wget because of this problem. Thanks, David

David Creasy wrote: Hi, I've looked, but been unable to find the answer to this rather simple question. (It's been asked before, but I can't see an answer.)

wget --passive-ftp --dont-remove-listing -d ftp://ftp.ebi.ac.uk/

gives me a .listing file, but:

wget -e ftp_proxy=http://proxy:1234 --passive-ftp --dont-remove-listing -d ftp://ftp.ebi.ac.uk/

just gives me the index.html file and no .listing file. Using alternate ways of specifying the proxy server doesn't make any difference. Is there any easy fix for this, or is it the same as: http://www.mail-archive.com/wget@sunsite.dk/msg08572.html

Thanks in advance for any advice, David -- David Creasy
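For reference, the "alternate ways of specifying the proxy" mentioned above are the standard proxy environment variables that wget honors, equivalent to the -e ftp_proxy=... switch. The host and port here are placeholders copied from the command in the message, not a real proxy:

```shell
# Equivalent ways to point wget at a proxy (placeholder host/port).
# These can also go in ~/.wgetrc as "ftp_proxy = http://proxy:1234/".
export ftp_proxy=http://proxy:1234/
export http_proxy=http://proxy:1234/
echo "ftp proxy set to: $ftp_proxy"
```

Whichever form is used, wget fetches FTP URLs through the HTTP proxy as an HTTP request, which is why the proxied case returns an index.html listing page rather than a raw FTP .listing file.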