zhuyifei1999 added a comment.

I captured two guppy heaps (same process, the second captured later than the first):

>>> hpy().heap()
Partition of a set of 1071160 objects. Total size = 112056904 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0 338785  32 35738656  32  35738656  32 unicode
     1 194260  18 19430488  17  55169144  49 str
     2 121408  11 17201088  15  72370232  65 list
     3 194812  18 16247616  14  88617848  79 tuple
     4   2416   0  9283840   8  97901688  87 dict (no owner)
     5 152771  14  3666504   3 101568192  91 float
     6  11481   1  1469568   1 103037760  92 types.CodeType
     7  11186   1  1342320   1 104380080  93 function
     8    413   0  1279160   1 105659240  94 dict of module
     9   1141   0  1029432   1 106688672  95 type
<422 more rows. Type e.g. '_.more' to view.>
>>> _.more
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
    10   1141   0   880120   1 107568792  96 dict of type
    11  24661   2   591864   1 108160656  97 int
    12    337   0   357016   0 108517672  97 dict of class
    13     92   0   256160   0 108773832  97 dict of pkg_resources._vendor.pyparsing.Literal
    14    240   0   251520   0 109025352  97 dict of pywikibot.page.Link
    15    828   0   231840   0 109257192  97 dict of function
    16    758   0   212240   0 109469432  98 dict of pywikibot.site._IWEntry
    17    109   0   173368   0 109642800  98 dict of pkg_resources._vendor.pyparsing.And
    18   1858   0   163504   0 109806304  98 __builtin__.weakref
    19     55   0   130600   0 109936904  98 dict of pkg_resources._vendor.pyparsing.Regex
<412 more rows. Type e.g. '_.more' to view.>
>>>

>>> hpy().heap()
Partition of a set of 1075151 objects. Total size = 112025536 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0 338839  32 35483120  32  35483120  32 unicode
     1 198688  18 19789624  18  55272744  49 str
     2 121420  11 17245856  15  72518600  65 list
     3 194972  18 16259536  15  88778136  79 tuple
     4   2208   0  9204864   8  97983000  87 dict (no owner)
     5 152775  14  3666600   3 101649600  91 float
     6  11481   1  1469568   1 103119168  92 types.CodeType
     7  11179   1  1341480   1 104460648  93 function
     8    415   0  1281256   1 105741904  94 dict of module
     9   1129   0  1018584   1 106760488  95 type
<398 more rows. Type e.g. '_.more' to view.>
>>> _.more
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
    10   1129   0   876760   1 107646464  96 dict of type
    11  24665   2   591960   1 108238424  97 int
    12    331   0   355336   0 108593760  97 dict of class
    13     92   0   256160   0 108849920  97 dict of pkg_resources._vendor.pyparsing.Literal
    14    240   0   251520   0 109101440  97 dict of pywikibot.page.Link
    15    828   0   231840   0 109333280  98 dict of function
    16    758   0   212240   0 109545520  98 dict of pywikibot.site._IWEntry
    17    109   0   173368   0 109718888  98 dict of pkg_resources._vendor.pyparsing.And
    18   1837   0   161656   0 109880544  98 __builtin__.weakref
    19     55   0   130600   0 110011144  98 dict of pkg_resources._vendor.pyparsing.Regex
<388 more rows. Type e.g. '_.more' to view.>

The OS still reports that the resident memory size is increasing, but guppy says the heap total is decreasing... I guess core dumps are the way to go :(
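A discrepancy like this usually means the growth lives in memory the Python object graph cannot see (C-extension allocations, allocator fragmentation, etc.). As a minimal stdlib-only sketch of the same comparison, one can contrast the OS-reported RSS with the summed shallow sizes of gc-tracked objects (a rough analogue of guppy's total; assumes Linux, where `ru_maxrss` is reported in kibibytes):

```python
import gc
import resource
import sys

def rss_bytes():
    # Peak resident set size as reported by the OS.
    # Note: ru_maxrss is in KiB on Linux, but in bytes on macOS.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024

def tracked_bytes():
    # Rough analogue of guppy's "Total size": sum the shallow sizes of
    # all gc-tracked objects. The 0 default skips objects that refuse
    # to report a size.
    return sum(sys.getsizeof(o, 0) for o in gc.get_objects())

print("RSS:        %d bytes" % rss_bytes())
print("gc-tracked: %d bytes" % tracked_bytes())
# If RSS keeps growing between samples while the tracked total stays
# flat or shrinks (as in the two guppy dumps above), the leak is
# outside what Python-level heap profilers can account for.
```

This is only a coarse cross-check (shallow sizes, gc-tracked objects only), but it reproduces the same symptom cheaply without guppy installed.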


TASK DETAIL
https://phabricator.wikimedia.org/T185561


_______________________________________________
pywikibot-bugs mailing list
pywikibot-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/pywikibot-bugs