jolan <[EMAIL PROTECTED]> writes:
> I'm currently using wget to mirror a rather extensive website. So
> far I've mirrored 10 gigs and counting. wget has been running for roughly
> 24 hours. My problem is the huge amount of memory wget has allotted to
> it. Since I have it converting links, I was wondering if maybe all the
> webpages I have been mirroring are being stored in memory for later
> conversion to relative links.
No, this is not the case.
Yes, Wget can allocate a lot of memory when mirroring huge sites, but
I have not been able to detect a memory leak, and I have tried with
many memory debugging tools.
So if Wget has allocated 44M so far, it will probably not allocate
much more than that.
I still haven't investigated where all the memory goes and if it is a
bug, but I plan to do so.
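For reference, a minimal sketch of the kind of mirroring invocation
described above (the URL and the --wait delay are placeholders; the
original poster's exact flags are not stated in the message):

```shell
# Mirror a site recursively and rewrite links for local browsing.
# --mirror is shorthand for -r -N -l inf --no-remove-listing.
# --convert-links rewrites links in the downloaded pages on disk
# after the run finishes, consistent with the answer above that
# page contents are not held in memory for later conversion.
# --wait=1 adds a courtesy delay between requests (an assumption,
# not something the poster mentioned).
wget --mirror --convert-links --wait=1 https://example.com/
```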