DO NOT REPLY TO THIS EMAIL, BUT PLEASE POST YOUR BUG RELATED COMMENTS THROUGH THE WEB INTERFACE AVAILABLE AT <http://nagoya.apache.org/bugzilla/show_bug.cgi?id=13540>. ANY REPLY MADE TO THIS MESSAGE WILL NOT BE COLLECTED AND INSERTED IN THE BUG DATABASE.
http://nagoya.apache.org/bugzilla/show_bug.cgi?id=13540

           Summary: LogFactor5 max record memory overflow possible
           Product: Log4j
           Version: 1.0
          Platform: Other
        OS/Version: Other
            Status: NEW
          Severity: Normal
          Priority: Other
         Component: Other
        AssignedTo: [EMAIL PROTECTED]
        ReportedBy: [EMAIL PROTECTED]

The problem: if no log records are being displayed (or only a few, rarely
seen ones) while records keep arriving for other categories, the system can
run out of memory. LF5 only checks the maximum number of records when it
displays a record, so non-displayed records are appended to a Vector
indefinitely and that Vector is never checked against the configured
maximum number of records.

To fix, change org.apache.log4j.lf5.viewer.FilteredLogTableModel.java:

    public synchronized boolean addLogRecord(LogRecord record) {
        _allRecords.add(record);
        // Moved here so the backing store is trimmed on every add,
        // not only when the record passes the filter.
        trimRecords();
        if (_filter.passes(record) == false) {
            return false;
        }
        getFilteredRecords().add(record);
        fireTableRowsInserted(getRowCount(), getRowCount());
        return true;
    }

    public synchronized void fastRefresh() {
        // Added so the filtered list is also checked against the limit.
        if (_filteredRecords.size() >= _maxNumberOfLogRecords) {
            _filteredRecords.remove(0);
            fireTableRowsDeleted(0, 0);
        }
    }
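To illustrate the idea behind the fix outside of LF5, here is a minimal,
self-contained sketch. The class name (BoundedRecordBuffer) and its methods
are hypothetical, not part of Log4j; the point is only that the backing
list is trimmed on every add, whether or not the record passes the display
filter, so hidden records can no longer accumulate without bound.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the fix's idea: trim the backing store on
// every add, even when the record is filtered out of the display.
public class BoundedRecordBuffer {
    private final int maxRecords;
    private final List<String> allRecords = new ArrayList<>();
    private final List<String> filteredRecords = new ArrayList<>();

    public BoundedRecordBuffer(int maxRecords) {
        this.maxRecords = maxRecords;
    }

    // Mirrors addLogRecord: trim happens before the filter check,
    // so non-displayed records are still bounded.
    public boolean add(String record, boolean passesFilter) {
        allRecords.add(record);
        trim(); // always trim, even for hidden records
        if (!passesFilter) {
            return false;
        }
        filteredRecords.add(record);
        return true;
    }

    // Drops the oldest entries once either list exceeds the limit.
    private void trim() {
        while (allRecords.size() > maxRecords) {
            allRecords.remove(0);
        }
        while (filteredRecords.size() > maxRecords) {
            filteredRecords.remove(0);
        }
    }

    public int allSize() {
        return allRecords.size();
    }

    public static void main(String[] args) {
        BoundedRecordBuffer buf = new BoundedRecordBuffer(3);
        // Simulate the bug scenario: many records arrive, none displayed.
        for (int i = 0; i < 10; i++) {
            buf.add("rec" + i, false);
        }
        System.out.println(buf.allSize()); // bounded at 3, not 10
    }
}
```

Without the unconditional trim (i.e. trimming only when passesFilter is
true, as in the original code), allRecords would grow to 10 here and keep
growing for as long as records arrive.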