Hi,
Don't know if this will be answered, but I had to
ask (since I DID read the man page! :-P)
Symptom: automating my stock research, I type a command such as:
wget -p -H -k -nd -nH -x -Ota.html -Dichart.finance.yahoo.com -Pbtu
Folks,
I ran wget (1.9.1) on Debian GNU/Linux to find out how many links my site had,
and after Queue count 66246, maxcount 66247 links, the wget process ran out
of memory. Is there a way to set the persistent state to disk instead of memory,
so that all the system memory and cache is not [...]
From: David David
3. Outputs the graph to ta.html (replacing original
ta.html)... BAD.
On VMS, where (by default) it's harder to write to an open file, the
symptom is different:
ta.html: file currently locked by another user
But the real question is: If a Web page has links to other [...]
From: oscaruser
[...] wget (1.9.1) [...]
Wget version 1.10.2 is the current release.
[...] Is there a way to set the persistent state to disk instead of
memory [...]
I believe that there's a new computing concept called "virtual
memory" which would handle this sort of thing.
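Taken literally, the "virtual memory" quip amounts to giving the kernel more swap so wget's in-memory queue can be paged out to disk. A minimal sketch, with an illustrative path and size; mkswap/swapon require root, so they are left commented out:

```shell
# Create a 16 MiB file of zeros to serve as a swap file (size illustrative)
dd if=/dev/zero of=./swapfile.demo bs=1M count=16 2>/dev/null
chmod 600 ./swapfile.demo
# mkswap ./swapfile.demo   # (root) format it as swap
# swapon ./swapfile.demo   # (root) enable it
ls -l ./swapfile.demo
```

Whether that is an acceptable trade-off is exactly what the follow-up message questions: it converts free disk space into swap.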
[EMAIL PROTECTED] wrote:
I ran wget (1.9.1) on Debian GNU/Linux to find out how many links my site had,
and after Queue count 66246, maxcount 66247 links, the wget process ran out of
memory. Is there a way to set the persistent state to disk instead of memory so
that all the system memory and [...]
Steven M. Schweda wrote:
But the real question is: If a Web page has links to other files, how
is Wget supposed to package all that stuff into _one_ file (which _is_
what -O will do), and still make any sense out of it?
even more, how is Wget supposed to properly postprocess the saved [...]
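A network-free illustration of the point above: -O writes every retrieved document into the one named file, so the result is a plain concatenation that -k cannot sensibly postprocess. (Simulated here with cat; with wget, the second page would be a '-p'/'-H' prerequisite of the first.)

```shell
# Two stand-in "downloaded" documents
printf '<html>page one</html>\n' > one.html
printf '<html>page two</html>\n' > two.html
# Roughly what fetching both with '-O combined.html' leaves on disk:
cat one.html two.html > combined.html
```

The combined file is not valid HTML for either page, which is why -k has nothing sensible to convert.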
From: Mauro Tortonesi [EMAIL PROTECTED]
perhaps we should make this clear in the manpage
Always a good idea.
and provide an
additional option which just renames saved files after download and
postprocessing according to a given pattern. IIRC, Hrvoje suggested
doing this.
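The rename-after-download idea above can be sketched in the shell today: drop -O, let wget choose the saved name, then rename once download and link conversion have finished. The directory and saved filename below are hypothetical stand-ins.

```shell
# Let wget pick the on-disk name (no -O), then rename afterwards.
mkdir -p btu
echo '<html></html>' > 'btu/table.csv?s=BTU'   # stand-in for wget's saved file
mv 'btu/table.csv?s=BTU' btu/ta.html           # rename after postprocessing
```

Because the rename happens after -k has run, the converted links stay consistent, which is what -Ota.html could never give.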
Folks,
I compiled and ran wget 1.10.2 on a machine with more memory (as shown below).
I meant to type that it ran out of swap space, not cache -- brain lapsed in my
old age. For very big runs, I wouldn't want to convert large amounts of free
disk space to swap space, because that reduces the [...]
~/wget/bin/wget -d --recursive --cookies=on --wait=2 --delete-after
--no-directories --reject css,js,jpg,gif http://192.168.1.101 2>&1 | gzip
> log.0.gz
system is dual xeon
thanks
cat /proc/cpuinfo
processor : 0
vendor_id : GenuineIntel
cpu family : 15
model : 4
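The command above pipes wget's -d debug trace through gzip. Since -d writes to stderr, stderr has to be merged into stdout (2>&1) ahead of the pipe, or gzip sees nothing. A network-free demonstration of the same redirection:

```shell
# stderr is merged into stdout before the pipe, so both streams reach gzip
( echo stdout-line; echo stderr-line 1>&2 ) 2>&1 | gzip > log.demo.gz
gunzip -c log.demo.gz   # both lines come back out
```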
relating to:
GNU Wget 1.10.2 on Debian testing/unstable using Linux kernel 2.6.4
wget -N http://domain.tld/downloadfile -O outputfile
downloads outputfile
Doing it again downloads it again, regardless of timestamp. It does not check
outputfile's timestamp against downloadfile's, as prescribed by -N.
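Because -O recreates outputfile on every run, -N never has a stable timestamp to compare. The check can be emulated by hand; the filenames below are hypothetical and the "re-download" step is simulated with cp.

```shell
# Fake an old saved copy and a newer remote file (mtimes are illustrative)
touch -d '2006-01-01' outputfile     # the previously saved copy
touch -d '2006-06-01' downloadfile   # pretend this mtime came from Last-Modified
# Only "re-fetch" when the remote copy is newer, as -N would do:
if [ downloadfile -nt outputfile ]; then
    cp -p downloadfile outputfile    # -p keeps the source timestamp
fi
```

After the copy, the two timestamps match, so a second run would skip the fetch -- the behavior the poster expected from -N.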