On Tue, Aug 12, 2008 at 16:57, Ralph Shumaker <[EMAIL PROTECTED]> wrote:

> Way back when I was running whendoze (around 10 years ago), I had a
> shareware program called File Hound.  One of the many features I liked
> about that program was that if I had it running while browsing the
> internet, and I right-clicked on something and chose "copy link
> location", File Hound would take it from there and fetch the object of
> that link in the background while I continued surfing.  It even queued
> subsequent links and got to them as soon as it could.
>
> I've been trying to think of how I might do something like that in
> Linux.  wget is just as tenacious as File Hound with files whose
> download keeps breaking, and I think there's a way to tell wget not to
> bother fetching duplicates.  What I'm not sure about is how to get wget
> to run like a daemon and listen for copied links.  Any ideas?
>
>
>
FlashGot might fill in the blank. I don't know much about it, but I just
saw it and thought of this thread.
https://addons.mozilla.org/en-US/firefox/addon/220
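You could also approximate the File Hound behavior with a small shell script: poll the X clipboard with xclip and hand any new link to wget. This is just a rough sketch under some assumptions — it assumes xclip and wget are installed, saves to ~/Downloads, and polls once a second; the is_url and watch_clipboard names are mine, not from any package.

```shell
#!/bin/sh
# is_url: succeed if the argument looks like a fetchable link.
is_url() {
    case "$1" in
        http://*|https://*|ftp://*) return 0 ;;
        *) return 1 ;;
    esac
}

# watch_clipboard: poll the clipboard and fetch each new URL once.
# A temp file remembers URLs already seen, so duplicates are skipped
# (grep -qxF: quiet, whole-line, fixed-string match).
watch_clipboard() {
    seen="$(mktemp)"
    while true; do
        sel="$(xclip -o -selection clipboard 2>/dev/null)"
        if is_url "$sel" && ! grep -qxF "$sel" "$seen"; then
            echo "$sel" >> "$seen"
            # -c resumes broken downloads, -b backgrounds the fetch,
            # -P sets the download directory.
            wget -c -b -P "$HOME/Downloads" "$sel"
        fi
        sleep 1
    done
}

# Uncomment to start watching (runs until killed):
# watch_clipboard
```

It isn't a real daemon, but run under nohup or from a terminal it gives the same "copy a link, it gets fetched" workflow, and wget -c handles the retry tenacity.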


-- 
JD Runyan


Bob Hope  - "You know you are getting old when the candles cost more than
the cake."

-- 
KPLUG-List@kernel-panic.org
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
