On Sat, 28 Oct 2006 17:12:21 +0800, "Jed R. Mallen" <[EMAIL PROTECTED]> wrote:
<snip>
>> i just use wget for downloading large files (if the server permits it)
>> since i get away from depending on a gui layer to have my downloads. it is
>> better to have less points of failure if you are getting huge files. :)
>>
>> now if only wget supports multi-server swarming... too bad the prozilla
>> development is virtually dead. :(
>>
>> <snip>
> 
> mozilla downThemAll! extension is good enough imho.
> 

will try that extension, but that still adds the gui dependency which wget 
already does away with. i may use it if it also has a throttling feature; our 
network admin is going to kill me if i hog all the bandwidth. :)
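for reference, wget itself can already throttle with --limit-rate, and you can 
make that the default via ~/.wgetrc. a sketch (the rate and retry values below 
are just example placeholders, tune them for your own link):

```shell
# ~/.wgetrc -- example settings; numbers are placeholders, not recommendations
limit_rate = 100k    # cap each download at ~100 KB/s
continue = on        # always resume partial downloads (same as wget -c)
tries = 20           # retry flaky servers up to 20 times
```

with that in place, plain `wget <url>` stays polite to the rest of the network 
without remembering the flag every time.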

ciao!

-- 
"Programming, an artform that fights back"

Anuerin G. Diaz
Registered Linux User #246176
Friendly Linux Board @ http://mandrivausers.org/index.php
http://capsule.ramfree17.org , when you absolutely have nothing else better to 
do

_________________________________________________
Philippine Linux Users' Group (PLUG) Mailing List
[email protected] (#PLUG @ irc.free.net.ph)
Read the Guidelines: http://linux.org.ph/lists
Searchable Archives: http://archives.free.net.ph
