[ https://issues.apache.org/jira/browse/COUCHDB-2058?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13898336#comment-13898336 ]

Rohit Sharma commented on COUCHDB-2058:
---------------------------------------

The previous stats I posted were from when CouchDB had consumed 60% of memory; it 
had been like that since the morning, when I started the process.

I restarted CouchDB and it has released the memory now. Below are the stats from 
a fresh CouchDB.

erlang:memory([atom, atom_used, processes, processes_used, binary, code, ets]).
[{atom,453713},
 {atom_used,425597},
 {processes,589248},
 {processes_used,576328},
 {binary,184},
 {code,3425481},
 {ets,237464}]
5> [erlang:garbage_collect(P) || P <- processes()].
[true,true,true,true,true,true,true,true,true,true,true,
 true,true,true,true,true,true,true,true,true,true,true,true,
 true,true]
6> erlang:memory([atom, atom_used, processes, processes_used, binary, code, ets]).
[{atom,453713},
 {atom_used,425597},
 {processes,584336},
 {processes_used,571416},
 {binary,184},
 {code,3425481},
 {ets,237464}]

> CouchDB Memory Leak - Beam.smp
> ------------------------------
>
>                 Key: COUCHDB-2058
>                 URL: https://issues.apache.org/jira/browse/COUCHDB-2058
>             Project: CouchDB
>          Issue Type: Bug
>      Security Level: public(Regular issues) 
>          Components: Database Core
>            Reporter: Rohit Sharma
>
> Hello,
> I am experiencing a performance issue with CouchDB.
> Use Case: I am working on a process that retrieves data from an RDBMS, 
> transforms it into JSON documents, and POSTs them to CouchDB.
> I am trying to POST around half a million documents, most of them in batches 
> (_bulk_docs) of 10,000; I have also tried batches of 5,000, 15,000, and 20,000.
> The whole process takes around 90-100 minutes.
> During the life of the process, memory consumption by CouchDB keeps growing, 
> and the memory is not released when CouchDB has finished working.
> So if CouchDB's memory consumption was 60% when the process finished, it 
> remains at 60% and does not go down.
> Subsequently, when the process starts running again, memory consumption maxes 
> out and CouchDB restarts itself. This restart fails the process that I am 
> running. Looking at the syslogs, I see an Out Of Memory error for the CouchDB 
> process and the kill statement.
> The CouchDB process that has the issue is Erlang's "beam.smp".
> At this point, I have tried upgrading the memory of the server to see if this 
> resolves the issue; unfortunately, the issue persists. The memory leak is there 
> and usage keeps growing until CouchDB restarts or crashes.
> I have also tried running garbage collection from the Erlang command line 
> (erlang:garbage_collect().), but it didn't do anything.
> At this point, I am out of ideas and not sure what is going on here. Any 
> input/suggestions are highly appreciated!
> Env:
> Platform: Linux (Red Hat release 6.4 (Santiago))
> CouchDB: 1.3 and have tried with 1.5 as well
> RAM: Tried with 2G, 4G, and 8G
> CPU: 2 cores
> Process:/usr/lib64/erlang/erts-5.8.5/bin/beam.smp -Bd -K true -A 4 -- -root 
> /usr/lib64/erlang
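The batched _bulk_docs upload described in the issue can be sketched as follows. This is a hypothetical illustration, not the reporter's actual client code (which is not included in the report): a generic chunking helper plus a builder for the {"docs": [...]} body that CouchDB's _bulk_docs endpoint expects. The document contents and batch size are placeholders.

```python
import json

def chunk(docs, batch_size):
    """Yield successive batches of at most batch_size documents."""
    for i in range(0, len(docs), batch_size):
        yield docs[i:i + batch_size]

def bulk_docs_payload(batch):
    """Build the JSON body for a POST to CouchDB's /db/_bulk_docs endpoint."""
    return json.dumps({"docs": batch})

# Example shaped like the report: half a million minimal documents,
# uploaded in batches of 10,000 (the actual POST is omitted here).
docs = [{"_id": str(n), "value": n} for n in range(500_000)]
batches = list(chunk(docs, 10_000))
# 500,000 docs / 10,000 per batch -> 50 batches
```

Each payload would then be POSTed to the database's _bulk_docs URL with a Content-Type of application/json; smaller batches trade throughput for lower peak memory on both the client and the server.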



--
This message was sent by Atlassian JIRA
(v6.1.5#6160)
