Re: 2 Gb limitation

2002-01-11 Thread Ian Abbott
On 10 Jan 2002 at 17:09, Matt Butt wrote: I've just tried to download a 3Gb+ file (over a network using HTTP) with WGet and it died at exactly 2Gb. Can this limitation be removed? In principle, changes could be made to allow wget to be configured for large file support, by using the
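The suggestion is cut off above, but the symptom itself is easy to reproduce in miniature: "died at exactly 2 GB" is what happens when a byte count is tracked in a signed 32-bit integer. A rough illustration (not wget source code; the wrap-around is simulated with ctypes):

```python
# Illustration only: 2^31 - 1 bytes is the largest value a signed 32-bit
# counter can hold, which is just under 2 GiB -- hence a hard stop "at
# exactly 2Gb" on builds without large-file support.
import ctypes

INT32_MAX = 2**31 - 1          # 2147483647 bytes
print(INT32_MAX / 2**30)       # ~2.0 (GiB)

# One more byte wraps a 32-bit signed counter to a negative value:
wrapped = ctypes.c_int32(INT32_MAX + 1).value
print(wrapped)                 # -2147483648
```

Fixing this in a C program typically means compiling with 64-bit file offsets (e.g. `_FILE_OFFSET_BITS=64`), which is presumably where the truncated sentence was heading.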

Re: Using -pk, getting wrong behavior for frameset pages...Suggestions?

2002-01-11 Thread Picot Chappell
Thanks for your response. I tried the same command, using your URL, and it worked fine. So I took a look at the site I was retrieving for the failed test. It's an SSL site (didn't think about it before) and I noticed 2 things. The Frame source pages were not downloaded (they were for

wget does not parse .netrc properly

2002-01-11 Thread Alexey Aphanasyev
Hello everyone, I'm using wget compiled from the latest CVS sources (GNU Wget 1.8.1+cvs). I use it to mirror several ftp sites. I keep ftp accounts in a .netrc file which looks like this:

# My ftp accounts
machine host1 login user1 password pwd1
machine host2
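For reference, the format shown above can be checked against a known-good parser. This sketch uses Python's stdlib `netrc` module (not wget's own parser) on a temporary file; the `host2` credentials are invented here, since the original message is truncated after `machine host2`:

```python
# Parse a .netrc like the poster's with Python's stdlib netrc module, to show
# what a conforming parser extracts. The host2 login/password are made up.
import netrc
import os
import tempfile

content = """\
# My ftp accounts
machine host1 login user1 password pwd1
machine host2 login user2 password pwd2
"""

with tempfile.NamedTemporaryFile("w", suffix=".netrc", delete=False) as f:
    f.write(content)
    path = f.name

auth = netrc.netrc(path)          # explicit path: no ~/.netrc permission check
login, account, password = auth.authenticators("host1")
print(login, password)            # user1 pwd1
os.remove(path)
```

If wget and a reference parser disagree on the same file, that points at a wget parsing bug rather than a malformed .netrc.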

Re: Using -pk, getting wrong behavior for frameset pages...Suggestions?

2002-01-11 Thread Thomas Reinke
Do you think this might be an issue with framesets and SSL sites? Or an issue with framesets and CGI source files? This is not a problem with frames - it IS a problem with SSL. wget, while it appears to have SSL support, didn't quite get it right. The internal schemes being used don't treat

Re: Using -pk, getting wrong behavior for frameset pages...Suggestions?

2002-01-11 Thread Ian Abbott
On 11 Jan 2002 at 10:51, Picot Chappell wrote: Thanks for your response. I tried the same command, using your URL, and it worked fine. So I took a look at the site I was retrieving for the failed test. It's an SSL site (didn't think about it before) and I noticed 2 things. The Frame

-H suggestion

2002-01-11 Thread Fred Holmes
WGET suggestion The -H switch/option sets host-spanning. Please provide a way to specify a different limit on recursion levels for files retrieved from foreign hosts. -r -l0 -H2 for example would allow unlimited recursion levels on the target host, but only 2 [additional] levels when a file
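The proposed `-H2` semantics (no such option exists in wget; this is Fred's suggestion) can be sketched as a crawl with a separate depth budget that only starts counting once a link crosses off the start host. The link graph and URLs below are invented for illustration:

```python
# Hypothetical sketch of the suggested behaviour: unlimited recursion on the
# start host, at most `foreign_depth` further levels after crossing to a
# foreign host. Toy link graph, made-up URLs.
from collections import deque
from urllib.parse import urlparse

LINKS = {  # page -> pages it links to
    "http://a.example/":  ["http://a.example/1", "http://b.example/x"],
    "http://a.example/1": ["http://a.example/2"],
    "http://a.example/2": [],
    "http://b.example/x": ["http://b.example/y"],
    "http://b.example/y": ["http://b.example/z"],
    "http://b.example/z": [],
}

def crawl(start, foreign_depth=2):
    start_host = urlparse(start).hostname
    seen = {start}
    queue = deque([(start, None)])      # budget None = still on start host
    while queue:
        url, budget = queue.popleft()
        for link in LINKS.get(url, []):
            if link in seen:
                continue
            foreign = urlparse(link).hostname != start_host
            if not foreign:
                nxt = None              # back home: unlimited again
            elif budget is None:
                nxt = foreign_depth - 1 # first hop onto a foreign host
            else:
                nxt = budget - 1        # deeper into foreign territory
            if foreign and nxt < 0:
                continue                # would exceed the foreign limit
            seen.add(link)
            queue.append((link, nxt))
    return seen

print(sorted(crawl("http://a.example/", foreign_depth=2)))
```

With `foreign_depth=2` the crawl reaches `b.example/x` and `b.example/y` but never starts `b.example/z`, while recursion on `a.example` stays unlimited.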

Suggestion on job size

2002-01-11 Thread Fred Holmes
It would be nice to have some way to limit the total size of any job and have it exit gracefully upon reaching that size, completing the -k/-K process on termination so that what one has downloaded is useful. A switch that would set the total size of all downloads: --total-size=600MB

Re: Suggestion on job size

2002-01-11 Thread Jens Rösner
Hi Fred! First, I think this would rather belong on the normal wget list, as I cannot see a bug here. Sorry to the bug tracers, I am posting to the normal wget list and cc-ing Fred, hope that is OK. To your first request: -Q (Quota) should do precisely what you want. I used it with -k and it
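wget's quota option (`-Q`/`--quota`, e.g. `wget -r -k -Q600m URL`) behaves roughly as Fred asked: the file in progress is finished, but no new download starts once the running total crosses the quota. A minimal sketch of that logic, with invented file names and sizes:

```python
# Sketch of -Q/--quota-style behaviour: complete the current file, start
# nothing new once the byte total exceeds the quota. Jobs are made up.

def parse_quota(spec):
    """Parse '600m'/'2g'-style specs into bytes (simplified)."""
    units = {"k": 1024, "m": 1024**2, "g": 1024**3}
    spec = spec.lower()
    if spec[-1] in units:
        return int(spec[:-1]) * units[spec[-1]]
    return int(spec)

def download_with_quota(files, quota_spec):
    quota = parse_quota(quota_spec)
    total, fetched = 0, []
    for name, size in files:
        if total >= quota:   # quota exceeded: exit gracefully, leaving
            break            # time for -k/-K post-processing
        total += size        # a file once started is always completed
        fetched.append(name)
    return fetched, total

jobs = [("a.iso", 500 * 1024**2), ("b.iso", 300 * 1024**2), ("c.iso", 100 * 1024**2)]
got, used = download_with_quota(jobs, "600m")
print(got)  # quota is crossed during b.iso, so c.iso is never started
```

Note the quota can therefore be overshot by up to one file, which is exactly the "exit gracefully so what you have is useful" behaviour requested.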