begin quoting Ralph Shumaker as of Tue, Aug 12, 2008 at 02:57:16PM -0700:
> Way back when I was running whendoze (around 10 years ago), I had a 
> shareware program called File Hound.  One of the many features I liked 
> about it was that if I had it running while browsing the internet and I 
> right-clicked on something and chose "Copy Link Location", File Hound 
> would take it from there and fetch the object of that link in the 
> background while I continued surfing.  It even queued up subsequent 
> links and got to them as soon as it could.
> 
> I've been trying to think of how I might do something like that in 
> Linux.  wget is just as tenacious as File Hound about files whose 
> downloads keep breaking.  I think there's a way to tell wget not to 
> bother fetching duplicates.  What I'm not sure about is how to get wget 
> to run like a daemon and listen for copied links.  Any ideas?
> 

while true ; do
   # ask for a URL pasted from the browser
   echo -n "url: "
   read -r url
   # -b: go to the background   -c: resume partial downloads
   # -nc: don't re-fetch files that already exist
   # --random-wait: vary the pause between requests
   wget -b -c -nc --random-wait "$url"
done
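
If you want something that actually listens for copied links instead of
prompting, a rough sketch like the following might do it.  It just polls the
X clipboard once a second with xclip (assuming you have xclip installed;
xsel -ob would work the same way), so it's polling rather than a true
daemon:

last=""
while true ; do
   # grab whatever "Copy Link Location" last put on the X clipboard
   url="$(xclip -o -selection clipboard 2>/dev/null)"
   # only act on something that looks like a link, and only once
   case "$url" in
      http://*|https://*|ftp://*)
         if [ "$url" != "$last" ] ; then
            wget -b -c -nc --random-wait "$url"
            last="$url"
         fi
         ;;
   esac
   sleep 1
done

Copying the same link twice in a row won't refetch it, and -nc skips
anything already on disk.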

-- 
It's an idea.
Stewart Stremler


-- 
KPLUG-List@kernel-panic.org
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
