These are the last few lines from /var/log/couchdb/couch.log. Are there
any other log files to check?
[Wed, 27 Aug 2014 21:46:19 GMT] [info] [<0.308.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:46:20 GMT] [info] [<0.1993.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:46:24 GMT] [info] [<0.1062.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:46:30 GMT] [info] [<0.1069.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:46:51 GMT] [info] [<0.1979.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:00 GMT] [info] [<0.1975.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:01 GMT] [info] [<0.1908.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:02 GMT] [info] [<0.1992.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:02 GMT] [info] [<0.1911.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:02 GMT] [info] [<0.1068.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:09 GMT] [info] [<0.1874.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:10 GMT] [info] [<0.1955.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:10 GMT] [info] [<0.1965.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
[Wed, 27 Aug 2014 21:47:14 GMT] [info] [<0.1946.0>] 172.17.6.85 - - 'POST' /ironcushion/_bulk_docs 201
Scrolling back, I don't notice anything else except some 200 statuses
from when I checked Futon for updates.
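Since couch.log shows nothing before the process dies, it may be worth checking the system logs too; when a process vanishes silently it is sometimes the kernel OOM killer rather than the application. A sketch of where to look, assuming a typical Linux install (the process name beam.smp and the log paths are assumptions; adjust for your distro):

```shell
# Look for signs of the kernel killing the CouchDB (Erlang) VM.
# beam.smp is the usual Erlang VM process name; it may differ.
dmesg | grep -i -e oom -e 'killed process'
grep -i -e beam.smp -e couchdb /var/log/syslog /var/log/messages 2>/dev/null | tail
```

If an OOM kill shows up there around the crash time, that would also explain why CouchDB's own log ends without any error.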
On 14-08-28 11:42 PM, Robert Samuel Newson wrote:
Do you have the logs for couchdb?
This sounds more like you’ve run out of file descriptors or sockets, that sort
of thing, but I can’t tell without logs.
B.
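The file-descriptor theory is easy to check while the test is running. A minimal sketch, assuming the Erlang VM appears as beam.smp (adjust the process name if your packaging differs):

```shell
# Soft limit on open files for the current shell.
ulimit -n

# Limits and current descriptor usage of the running CouchDB VM.
# 'beam.smp' is an assumption; it may be 'beam' or 'couchdb'.
PID=$(pgrep -o beam.smp)
if [ -n "$PID" ]; then
  grep 'open files' "/proc/$PID/limits"
  ls "/proc/$PID/fd" | wc -l    # descriptors currently open
fi
```

If the open-descriptor count climbs toward the limit during the bulk insert phase, raising the limit for the couchdb user would be the next thing to try.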
On 28 Aug 2014, at 23:10, Ilion Blaze <[email protected]> wrote:
I'm working on an app that makes use of CouchDB to store orchestration
configurations for OpenStack. Before we started putting things in production we
decided to run some stress testing with Iron Cushion. I ran this in three
environments - my own desktop (which ended up being the most powerful), a
virtual server that I have for development (weakest), and a virtual server that
is part of one of our staging environments (mid).
I basically used the settings from the examples in the Iron Cushion
documentation, the important ones being that I was using 100 connections to
insert 1000 documents each, 20 times, during the bulk insert phase. This ran
without problems on my home system, but the other two would cause the CouchDB
server to crash every time. I toyed with the settings, and the only one that
seemed to affect this was the number of documents per bulk insert. At
10 (100 connections * 10 documents per insert * 20 rounds) the tests passed in
all environments. At 100 documents per insert, the tests occasionally passed
on my personal dev server (weakest) and always on my home system, but would
consistently crash on the staging server.
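For context, each Iron Cushion connection is just POSTing JSON batches to _bulk_docs. A minimal sketch of one such request (the document fields here are made up, and the URL assumes a local CouchDB with the ironcushion database already created):

```shell
# Build a _bulk_docs payload with N small documents, roughly what each
# of the 100 connections sends per round (real docs/fields differ).
N=10
payload='{"docs":['
for i in $(seq 1 "$N"); do
  payload="$payload{\"seq\":$i}"
  if [ "$i" -lt "$N" ]; then payload="$payload,"; fi
done
payload="$payload]}"
echo "$payload"

# To actually send it:
# curl -s -X POST http://127.0.0.1:5984/ironcushion/_bulk_docs \
#      -H 'Content-Type: application/json' -d "$payload"
```

At 100 connections * 1000 documents per request, that is 100 concurrent POSTs, each carrying a fairly large JSON body, repeated 20 times, which is why the documents-per-insert setting is the one that moves the needle.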
After trying this several times I deleted the document store I'd been using on
the staging server and created a new empty one. I ran the original (100x100x20)
test. It managed to insert ~1.5 million documents before dying. (It should have
completed at 2 million).
The CouchDB log only shows the successful 201 requests for the inserts. There's
no indication of an error or any message about it dying. I have the output on
pastebin at: http://pastebin.com/pF8AXceY . In summary, I'm getting IOExceptions
(Connection reset by peer) and ClosedChannelExceptions. To me this sounds like
timeout issues. Is there anything that would cause CouchDB to shut down after
a number of timeouts?
Here's some info on the systems:
My home system:
Processor: AMD Phenom(tm) II X4 955 Processor 3.2GHz. Memory: 16GB. Disk space:
195GB available.
Development server:
Processor: Intel(R) Xeon(R) CPU E5310 @ 1.60GHz. Memory: 4GB. Disk space: 2.2GB
available.
Staging server:
Processor: Intel(R) Xeon(R) CPU E5310 @ 1.60GHz. Memory: 8GB. Disk space: 10GB
available.