[
https://issues.apache.org/jira/browse/COUCHDB-325?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12698850#action_12698850
]
Rohit Amarnath commented on COUCHDB-325:
----------------------------------------
I added the following code to the Ruby program:

    memory_usage = `ps -o rss= -p #{Process.pid}`.to_i  # physical memory in kilobytes
    memory_usage += `ps -o vsz= -p #{Process.pid}`.to_i # virtual memory in kilobytes
    if memory_usage > memory_last
      puts "memory increased by #{memory_usage - memory_last} KB to #{memory_usage / 1024} MB"
    elsif memory_usage < memory_last
      puts "memory decreased by #{memory_last - memory_usage} KB to #{memory_usage / 1024} MB"
    end
    memory_last = memory_usage
The memory for the Ruby process climbs to about 260 MB and then hovers between
230 MB and 260 MB, but memory usage on the system as a whole keeps increasing
until it is exhausted. Every time a bulk load runs, Ruby's memory goes up by
about 30 MB, but then the garbage collector kicks in and reduces it back to
around 230 MB.
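One way to tell Ruby-heap growth apart from process-level growth is to force a
collection and re-read RSS. This is a minimal sketch, not part of the original
program; it assumes a POSIX `ps` is available:

```ruby
# Read this process's resident set size via `ps` (kilobytes).
def rss_kb
  `ps -o rss= -p #{Process.pid}`.to_i
end

before = rss_kb
GC.start # force a full garbage-collection cycle
after = rss_kb
puts "RSS before GC: #{before / 1024} MB, after GC: #{after / 1024} MB"
# If RSS keeps climbing even right after GC.start, the growth is happening
# outside Ruby's heap (e.g. in the CouchDB server process), which would
# match the behaviour described above.
```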
> Memory Leak in 9.0?
> -------------------
>
> Key: COUCHDB-325
> URL: https://issues.apache.org/jira/browse/COUCHDB-325
> Project: CouchDB
> Issue Type: Bug
> Affects Versions: 0.9
> Environment: gentoo linux on c1.medium aws instance
> Reporter: Rohit Amarnath
> Priority: Minor
> Fix For: 0.9
>
>
> I am using couchrest to transform documents from one database to another
> (millions of records). Memory usage continues to increase until all the
> memory is used. The process occasionally fails, but usually completes. I don't
> know enough about ruby/couchrest/couchdb to tell you where the memory growth
> comes from - but if you give me some direction, I will be happy to take a look.
> If I comment out the save, it seems OK.
> Here is the code:
> --------------------------------
> require 'rubygems'
> require 'couchrest'
>
> @db = CouchRest.database!("http://127.0.0.1:5984/xfpds_2008")
> @db2 = CouchRest.database!("http://127.0.0.1:5984/fpds_2008")
> @streamer = CouchRest::Streamer.new(@db)
>
> @streamer.view("_all_docs") do |row|
>   begin
>     doc = @db.get(row["id"])
>     # remove id and rev so the new database gets a fresh document
>     doc.delete("_id")
>     doc.delete("_rev")
>     # change badgerfish notation to assign $ key value to parent
>     doc["amounts"]["obligatedAmount"] = doc["amounts"]["obligatedAmount"]["$"]
>     doc["amounts"]["baseAndAllOptionsValue"] = doc["amounts"]["baseAndAllOptionsValue"]["$"]
>     doc["amounts"]["baseAndExercisedOptionsValue"] = doc["amounts"]["baseAndExercisedOptionsValue"]["$"]
>     ...... A whole bunch of fields
>     # save the document using bulk save
>     response = @db2.save_doc(doc, true)
>   rescue
>     # if streamer ends, save last few documents
>     @db2.bulk_save
>   end
> end
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.