I don't think puf is functionally equivalent to aget.  ;-)

puf, as its name suggests, fetches a bunch of URLs in parallel, so it is 
essentially another wget with parallelism. However, it cannot download a 
single large file (e.g. a kernel archive, an .iso image, etc.) in 
parallel, especially over a slow network. Aget fills this gap by 
splitting the large file into multiple parts, each handled by its own 
pthread, and it can recover from download failures.
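For anyone curious how the segmented approach works, here is a minimal 
sketch of the idea (my own illustration using libcurl and pthreads; it is 
not aget's actual code, and the names and parameters in it are mine): 
split the byte range into N pieces, have each thread issue an HTTP Range 
request for its piece, and pwrite() the received data at the matching 
offset in the output file. To keep it short, the total file size is 
passed on the command line rather than discovered with a HEAD request, 
and retry/resume logic is only hinted at in a comment.

/*
 * Sketch of segmented HTTP download (NOT aget's implementation).
 * Each pthread fetches one byte range via an HTTP Range request and
 * writes it at the matching offset with pwrite(2).
 * Build: cc seg.c -lcurl -lpthread
 */
#include <curl/curl.h>
#include <fcntl.h>
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <unistd.h>

#define NTHREADS 4

struct segment {
    const char *url;
    int         fd;        /* shared output file */
    off_t       offset;    /* where this segment's next byte goes */
    off_t       end;       /* last byte of the segment (inclusive) */
};

/* libcurl write callback: place received bytes at the segment's offset */
static size_t on_data(char *buf, size_t size, size_t nmemb, void *arg)
{
    struct segment *seg = arg;
    size_t n = size * nmemb;

    if (pwrite(seg->fd, buf, n, seg->offset) != (ssize_t)n)
        return 0;              /* short write: abort this transfer */
    seg->offset += n;
    return n;
}

static void *fetch_segment(void *arg)
{
    struct segment *seg = arg;
    char range[64];
    CURL *curl = curl_easy_init();

    if (curl == NULL)
        return NULL;
    snprintf(range, sizeof(range), "%lld-%lld",
             (long long)seg->offset, (long long)seg->end);
    curl_easy_setopt(curl, CURLOPT_URL, seg->url);
    curl_easy_setopt(curl, CURLOPT_RANGE, range);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, on_data);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, seg);
    curl_easy_perform(curl);   /* a real tool would retry/resume here */
    curl_easy_cleanup(curl);
    return NULL;
}

int main(int argc, char **argv)
{
    /* usage: ./seg <url> <total-size-in-bytes> <output-file> */
    if (argc != 4)
        return 1;
    const char *url = argv[1];
    off_t total = (off_t)strtoll(argv[2], NULL, 10);
    int fd = open(argv[3], O_CREAT | O_WRONLY, 0644);
    pthread_t tid[NTHREADS];
    struct segment seg[NTHREADS];
    off_t chunk = total / NTHREADS;

    if (fd < 0)
        return 1;
    curl_global_init(CURL_GLOBAL_DEFAULT);
    for (int i = 0; i < NTHREADS; i++) {
        seg[i].url = url;
        seg[i].fd = fd;
        seg[i].offset = i * chunk;
        seg[i].end = (i == NTHREADS - 1) ? total - 1 : (i + 1) * chunk - 1;
        pthread_create(&tid[i], NULL, fetch_segment, &seg[i]);
    }
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
    close(fd);
    curl_global_cleanup();
    return 0;
}

The per-segment offsets kept in struct segment are also exactly what you 
would record in order to resume a failed segment from where it stopped.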

Moreover, aget is under a BSD(-like) license, and it was ported to 
OpenBSD, NetBSD, FreeBSD, Linux, etc. long ago. This is another reason 
for selecting aget as the multi-threaded HTTP file downloader on Solaris.

Thanks,
-Louis

On 2009/03/07, 04:50, James Carlson wrote:
> James Walker writes:
>   
>>       Aget is a program for multi-threaded HTTP downloading in the 
>>       text mode. It fetches HTTP URLs in a manner similar to wget, but 
>>       segments the retrieval into multiple parts to increase download
>>       speed, while each part can be resumed automatically if a 
>>       download failure occurs. Aget can be many times as fast as wget
>>       in some circumstances.
>>     
>
> Why aget and not puf?  Have you compared them?
>
>   

