wget -N url -O file won't check timestamp

2006-05-25 Thread David Graham
relating to: GNU Wget 1.10.2 on Debian testing/unstable, Linux kernel 2.6.4. "wget -N http://domain.tld/downloadfile -O outputfile" downloads outputfile; doing it again downloads it again regardless of timestamp. It does not check outputfile's timestamp against downloadfile's, as prescribed by -N. …
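The check the reporter expects from -N compares the remote file's Last-Modified time against the local file's mtime, and wget 1.10.2 skips that comparison when -O is given. A minimal local sketch of the comparison -N is supposed to perform (file names and dates are illustrative; remote.html stands in for the server-side copy):

```shell
# Emulate the -N timestamp test: fetch only when the remote copy is newer
# than the local output file. GNU touch -d sets an arbitrary mtime.
touch -d '2006-01-01' outputfile
touch -d '2006-05-25' remote.html
if [ remote.html -nt outputfile ]; then
    echo "remote is newer: download"
else
    echo "local copy is up to date: skip"
fi
```

With -O in play, wget 1.10.2 never reaches this branch and always downloads.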

Re: Out of Memory Error

2006-05-25 Thread oscaruser
~/wget/bin/wget -d --recursive --cookies=on --wait=2 --delete-after --no-directories --reject css,js,jpg,gif "http://192.168.1.101" 2>&1 | gzip > log.0.gz

The system is a dual Xeon. Thanks.

cat /proc/cpuinfo
processor  : 0
vendor_id  : GenuineIntel
cpu family : 15
model      : 4
…
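One detail worth noting in that pipeline: placing 2>&1 before the pipe sends wget's debug output (which goes to stderr) into gzip along with stdout. Redirection order matters here; a small local demonstration without wget:

```shell
# stderr is joined to stdout before the pipe, so both streams reach gzip
{ echo "to stdout"; echo "to stderr" >&2; } 2>&1 | gzip > log.0.gz
zcat log.0.gz    # both lines come back out of the compressed log
```

Had the redirection been written after the pipe stage instead, the debug output would have gone to the terminal and only stdout would have been compressed.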

Re: Out of Memory Error

2006-05-25 Thread oscaruser
Folks, I compiled and ran wget 1.10.2 on a machine with more memory (as shown below). I meant to type that it ran out of "swap" space, not "cache" -- brain lapsed in my old age. For very big runs, I wouldn't want to convert large amounts of free disk space to swap space because that reduces the usability…

Re: WGET -O Help

2006-05-25 Thread Steven M. Schweda
From: Mauro Tortonesi <[EMAIL PROTECTED]>
> perhaps we should make this clear in the manpage
Always a good idea.
> and provide an additional option which just renames saved files after
> download and postprocessing according to a given pattern. IIRC, hrvoje
> was just suggesting to do t…

Re: WGET -O Help

2006-05-25 Thread Mauro Tortonesi
Steven M. Schweda wrote:
> But the real question is: If a Web page has links to other files, how is
> Wget supposed to package all that stuff into _one_ file (which _is_ what
> -O will do), and still make any sense out of it?
Even more, how is Wget supposed to properly postprocess the saved data, …
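The rename-after-download option discussed in this thread does not exist in wget 1.10.2; its effect can be approximated outside wget with a shell rename pass over the saved files. A sketch under assumed names (the btu/ directory and the query-string file name are illustrative stand-ins for what "wget -p -Pbtu …" would leave behind):

```shell
# Stand-in for a completed `wget -p -P btu ...` run: one saved page with an
# awkward query-string name, renamed to a stable name after the fact.
mkdir -p btu
touch 'btu/ta?s=btu&t=6m'
for f in btu/ta*; do
    mv -- "$f" btu/ta.html    # post-processing rename, done outside wget
done
ls btu
```

Unlike -O, this leaves wget's own download and link-conversion steps untouched and only renames the finished file.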

Re: Out of Memory Error

2006-05-25 Thread Mauro Tortonesi
[EMAIL PROTECTED] wrote:
> I ran wget (1.9.1) on Debian GNU/Linux to find out how many links my
> site had, and after "Queue count 66246, maxcount 66247" links, the wget
> process ran out of memory. Is there a way to set the persistent state to
> disk instead of memory so that all the system memory and …

Re: WGET Out of Memory Error

2006-05-25 Thread Steven M. Schweda
From: oscaruser
> [...] wget (1.9.1) [...]
Wget version 1.10.2 is the current release.
> [...] Is there a way to set the persistent state to disk instead of
> memory [...]
I believe that there's a new computing concept called "virtual memory" which would handle this sort of thing automatically…
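Since the report is effectively about exhausting virtual memory, the usual remedy on Linux is to grow swap rather than change wget. A sketch of adding a swap file (the size is illustrative; mkswap/swapon require root, so they are left commented out):

```shell
# Create a 64 MiB file of zeroes to dedicate to swap
dd if=/dev/zero of=swapfile bs=1M count=64 status=none
# sudo mkswap swapfile && sudo swapon swapfile   # activation needs root
stat -c %s swapfile                              # 67108864 bytes
```

This is the "virtual memory" answer made concrete: the kernel pages wget's queue out to disk transparently, with no wget option needed.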

Re: WGET -O Help

2006-05-25 Thread Steven M. Schweda
From: David David
> 3. Outputs the graph to ta.html (replacing original ta.html)... BAD.
On VMS, where (by default) it's harder to write to an open file, the symptom is different:
  ta.html: file currently locked by another user
But the real question is: If a Web page has links to other …
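The VMS lock error and the Unix "BAD" overwrite both stem from -O opening (and truncating) the output file up front, before anything is read from it. The same hazard is easy to reproduce with plain shell redirection:

```shell
echo '<html>original</html>' > ta.html
# The shell truncates ta.html before cat can read it; GNU cat then refuses
# ("input file is output file"), but the damage is already done.
cat ta.html > ta.html 2>/dev/null || true
wc -c < ta.html    # the file is now empty: 0 bytes
```

Writing the result to a fresh name and renaming afterwards avoids clobbering the input.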

Out of Memory Error

2006-05-25 Thread oscaruser
Folks, I ran wget (1.9.1) on Debian GNU/Linux to find out how many links my site had, and after "Queue count 66246, maxcount 66247" links, the wget process ran out of memory. Is there a way to set the persistent state to disk instead of memory, so that all the system memory and cache is not slow…
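wget 1.9.x keeps its entire recursion queue in RAM and has no option to spool it to disk. A crude workaround is to keep the URL frontier in a file yourself and feed wget in batches with -i. A sketch with placeholder URLs (the file names and batch size are illustrative; the actual fetch step is omitted):

```shell
# Keep the pending-URL queue on disk instead of in wget's memory
printf '%s\n' http://example.com/a http://example.com/b http://example.com/c > queue.txt
head -n 5 queue.txt > batch.txt    # take the next batch of URLs
sed -i '1,5d' queue.txt            # and drop them from the on-disk queue
# wget -i batch.txt ...            # fetch the batch (network step omitted)
wc -l < queue.txt                  # URLs remaining after one batch
```

Each wget invocation then holds at most one batch in memory, at the cost of losing cross-batch duplicate detection.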

WGET -O Help

2006-05-25 Thread David David
Hi, don't know if this will be answered, but I had to ask (since I DID read the man page! :P ). Symptom: automating my stock research, I type a command such as:
  wget -p -H -k -nd -nH -x -Ota.html -Dichart.finance.yahoo.com -Pbtu "http://finance.yahoo.com/q/ta?s=btu&t=6m&l=on&z=l&q=b&p=b,p,s,v&a…
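One thing that command does get right is quoting the URL. Unquoted, the shell would treat each & as a background operator and split the query string into separate commands. A small demonstration (URL shortened for illustration):

```shell
# Single quotes keep ? and & out of the shell's hands; unquoted, everything
# after the first & would be launched as separate background commands.
url='http://finance.yahoo.com/q/ta?s=btu&t=6m'
echo "$url"
```

The trouble in this thread therefore isn't the quoting but the -Ota.html part, which funnels every fetched file into one output.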