I have a long-running task that generates a CSV file for exporting.
Here's how it works:
1. An Export object is created and the job is queued in Delayed Job (DJ).
2. The job iterates over the dataset with ActiveRecord's find_each and
writes each row to a tempfile.
3. The tempfile is uploaded to S3 with Paperclip.
4. The export record is saved.
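To make the flow above concrete, here's a minimal, runnable sketch of the job. The Contact struct and the in-memory DATASET are stand-ins (the real code loads ActiveRecord models in batches via find_each), and the Paperclip/S3 step is only indicated in a comment:

```ruby
require "csv"
require "tempfile"

# Stand-in record type; the real job iterates over ActiveRecord models.
Contact = Struct.new(:id, :name, :email)

# Stubbed dataset so the sketch runs without a database.
DATASET = [
  Contact.new(1, "Alice", "[email protected]"),
  Contact.new(2, "Bob", "[email protected]")
]

# Stand-in for ActiveRecord's find_each, which loads rows in fixed-size
# batches instead of instantiating the whole result set at once.
def each_in_batches(records, batch_size: 1000, &block)
  records.each_slice(batch_size) { |batch| batch.each(&block) }
end

def generate_export_csv
  tempfile = Tempfile.new(["export", ".csv"])
  CSV.open(tempfile.path, "w") do |csv|
    csv << %w[id name email]
    each_in_batches(DATASET) { |c| csv << [c.id, c.name, c.email] }
  end
  # In the real job the tempfile would be handed to Paperclip here, e.g.:
  #   export.csv_file = File.open(tempfile.path)
  #   export.save!
  tempfile
end
```

The point of find_each (and of writing to a tempfile rather than building the CSV in a string) is to keep only one batch of rows in memory at a time.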

This works for small-ish datasets, say less than 8,000 rows. However,
I have a memory leak somewhere that causes the process's memory usage
to quickly balloon past 400 MB, at which point the process is killed
off. The problem is that I just can't find where the memory is
leaking.

I've created a little debugging routine that updates the Export record
every 100 rows with the progress and the memory usage. Here's a link
to the Export model code: http://gist.github.com/277544
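For reference, the probe looks roughly like the following sketch. The real routine lives in the linked gist and updates the Export record via ActiveRecord; here the record is stubbed with a plain hash, and the RSS lookup via ps is an assumption about how memory is measured:

```ruby
# Resident set size of this process in MB, read via ps (Linux/macOS).
def current_rss_mb
  `ps -o rss= -p #{Process.pid}`.to_i / 1024.0
end

# Every `interval` rows, record progress and current memory usage on the
# export record so it can be watched from outside the worker.
def track_progress(export, row_index, interval: 100)
  return unless (row_index % interval).zero?
  export[:progress] = row_index
  export[:memory_mb] = current_rss_mb
  # With the real model this would persist the record, e.g. export.save
end
```

Watching the memory_mb column climb row by row is what shows the balloon happening on Heroku but not locally.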

This all works fine on my local machine - there's no memory leak, and
the memory stays at a constant 120 MB throughout the export process.
So I can't figure out if it's a problem with my code, or a gem that's
being used on Heroku, or what...

So basically I was wondering: has anyone else run into memory leak
issues like this? If any of it looks familiar, how did you solve it?
-- 
You received this message because you are subscribed to the Google Groups 
"Heroku" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/heroku?hl=en.
