Issue #18114 has been updated by Josh Cooper.

True, though we probably make more than a few copies. For example, 
`Puppet::FileBucketFile::File#verify_identical_file!` results in both the old 
and new files being read into memory, followed by a string comparison. We 
should first check the file lengths; if they differ, we can skip all of that 
IO entirely. We should also stream the file reads and compare each chunk.
----------------------------------------
Bug #18114: Recursive filebucket backup consumes way too much memory
https://projects.puppetlabs.com/issues/18114#change-78820

Author: Erik Dalén
Status: Accepted
Priority: Normal
Assignee: 
Category: filebucket
Target version: 
Affected Puppet version: 3.0.1
Keywords: 
Branch: 


On a host where puppet decided to do a recursive filebucket backup of ~12,000 
files totaling ~400MB, this caused the puppetmaster to consistently allocate 
more than 30GB of memory and then get killed by the OOM killer.

