http://bugzilla.spamassassin.org/show_bug.cgi?id=3875

           Summary: Bayes tokenizer method uses a lot of memory when hashing
                    raw tokens
           Product: Spamassassin
           Version: 3.0.0
          Platform: Other
        OS/Version: other
            Status: NEW
          Severity: normal
          Priority: P5
         Component: Learner
        AssignedTo: [email protected]
        ReportedBy: [EMAIL PROTECTED]


It looks like a bad interaction between map/grep and the sha1 call.  The fix is to
get rid of the map and rewrite it as a foreach loop.
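A minimal sketch of the kind of rewrite described, assuming the tokenizer hashes
each raw token with SHA1 and keeps a truncated portion of the digest. The token
list, variable names, and the 5-byte truncation below are illustrative, not
SpamAssassin's actual code:

```perl
use strict;
use warnings;
use Digest::SHA qw(sha1);

# Illustrative token list; not SpamAssassin's real tokenizer output.
my @tokens = qw(viagra FREE click here);

# Before: map builds its entire result list in one expression; chained with
# grep, this can create large intermediate lists while hashing every token.
my @hashed_map = map { substr(sha1($_), -5) } @tokens;

# After: a foreach loop hashes one token at a time and pushes each result,
# avoiding the intermediate list that the map expression can construct.
my @hashed_for;
foreach my $token (@tokens) {
    push @hashed_for, substr(sha1($token), -5);
}
```

Both versions produce the same hashed tokens; the difference is in peak memory
while the list is being built, which is the behavior this bug describes.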



------- You are receiving this mail because: -------
You are the assignee for the bug, or are watching the assignee.
