Jim Paris wrote on Tue, 25 Oct, 14:26 (-0400):
> Noël Köthe wrote:
> > On Friday, 23.09.2011, at 23:23 +0200, Jörg Sommer wrote:
> > 
> > > Downloading an https site with -r or -p makes wget grow to 500 MB and
> > > more. With version 1.12 this wasn't the case.
> > > 
> > > % time wget -p -nv https://www.fsf.org
> > ...
> > > wget -p -nv https://www.fsf.org  55,63s usr 3,64s sys 2:44,49 tot 254MB 0 
> > > 77726 pf 345 27781 cs
> > ...
> > > Versions of packages wget depends on:
> > ...
> > > ii  libgnutls26    2.12.10-2     
> > 
> > The difference between 1.12 and 1.13 is that upstream switched from
> > openssl to gnutls. With wget 1.13 and 1.13.4 and libgnutls26 2.12.12 I
> > get:
> > # LC_ALL=C /usr/bin/time wget --debug -O /dev/null https://www.google.com/
> > ...
> > 0.54user 0.05system 0:01.53elapsed 38%CPU (0avgtext+0avgdata 
> > 77392maxresident)k
> > 0inputs+0outputs (0major+5474minor)pagefaults 0swaps
> > 
> > With gnutls 2.12.12 and wget 1.13.4, do you still have the same high memory
> > consumption for https downloads?
> 
> With wget 1.13.4-1 and libgnutls26 2.12.12-1:
> 
>   # LC_ALL=C /usr/bin/time wget --debug -O /dev/null https://www.google.com/
>   ...
>   11.29user 0.39system 0:12.48elapsed 93%CPU (0avgtext+0avgdata 
> 2086656maxresident)k
>   0inputs+0outputs (0major+131068minor)pagefaults 0swaps
> 
> How many files do you have in /etc/ssl/certs?  That seems to be the
> cause here.

The problem seems to be that the certificates get loaded for every
connection:

% strace -o /tmp/wget.st -e trace=file =wget -q --spider -r -l 1 
https://fsfe.org/
^C
LC_ALL=C strace -fvttT -o /tmp/wget.st -e trace=file =wget -q --spider -r -l   
107,09s usr 20,87s sys 4:33,34 tot 258MB 157 85855 pf 761641 41557 cs

% grep -o 'open../etc/ssl.*)' /tmp/wget.st |sort |uniq -c |awk '{print $1}' 
|sort -u
37
38
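
If those counts mean that every file under /etc/ssl is read again for each
of the 37-38 connections, it might be worth checking whether pointing wget
directly at the bundle avoids the directory scan. A rough, untested sketch;
the bundle path assumes Debian's ca-certificates layout, and I don't know
whether the gnutls build of wget honours the option the same way the openssl
build did:

% LC_ALL=C /usr/bin/time wget --ca-certificate=/etc/ssl/certs/ca-certificates.crt \
      -q --spider -r -l 1 https://fsfe.org/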

> If I remove all of the individual certificates and keep only the
> bundle:
> 
>   # cd /etc/ssl/certs
>   # ls | wc -l
>   474

Me, too.

> Then it's fast again:
> 
>   # LC_ALL=C /usr/bin/time wget --debug -O /dev/null https://www.google.com/
>   ...
>   0.11user 0.00system 0:00.36elapsed 32%CPU (0avgtext+0avgdata 
> 20480maxresident)k
>   0inputs+0outputs (0major+1458minor)pagefaults 0swaps

And what happens if you download multiple files, e.g. when running wget
recursively? Sorry, I can't try it myself.
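
In case you want to repeat your test for the recursive case, something along
these lines should do; the backup directory is arbitrary and the paths assume
Debian's ca-certificates layout (untested sketch):

  # mkdir /root/certs-backup
  # cd /etc/ssl/certs
  # for f in *; do [ "$f" = ca-certificates.crt ] || mv "$f" /root/certs-backup/; done
  # ls | wc -l
  # LC_ALL=C /usr/bin/time wget -q --spider -r -l 1 https://fsfe.org/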

I've run valgrind, but had to kill it because it used too much memory.
Still, the summary looks like there's a memory leak.

% valgrind =wget -q --spider https://fsfe.org/
==4100== HEAP SUMMARY:
==4100==     in use at exit: 46,366,821 bytes in 1,604,336 blocks
==4100==   total heap usage: 16,559,135 allocs, 14,954,799 frees, 1,365,739,117 
bytes allocated
==4100== 
==4100== LEAK SUMMARY:
==4100==    definitely lost: 94,924 bytes in 2,151 blocks
==4100==    indirectly lost: 37,810,070 bytes in 1,342,897 blocks
                             ^^^^^^^^^^^^^^^^
==4100==      possibly lost: 2,955,933 bytes in 69,663 blocks
==4100==    still reachable: 5,505,894 bytes in 189,625 blocks
==4100==         suppressed: 0 bytes in 0 blocks
==4100== Rerun with --leak-check=full to see details of leaked memory
==4100== 
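
A follow-up run with full leak checking, as the summary suggests, should show
where the indirectly lost blocks come from; a single-page run keeps valgrind's
memory use manageable. Roughly (untested):

% valgrind --leak-check=full --num-callers=20 --log-file=/tmp/wget.vg \
      =wget -q --spider https://fsfe.org/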

Bye, Jörg.
-- 
UNIX is user friendly, it's just picky about who its friends are
