Hi Luke,

Yes, I am creating new client objects for each of my tasks.
Please see this GitHub issue against the Python client for some background as to why:
https://github.com/basho/riak-python-client/issues/497

Basically, I ran into issues with concurrency when processes are forked. I might experiment with using process IDs as keys to access a process-specific Riak client in each forked child (rough sketch at the bottom of this message).

Regards

Steven

Luke Bakken <[email protected]> writes:

> Hi Steven,
>
> At this point I suspect you're using the Python client in such a way
> that too many connections are being created. Are you re-using the
> RiakClient object or repeatedly creating new ones? Can you provide any
> code that reproduces your issue?
>
> --
> Luke Bakken
> Engineer
> [email protected]
>
>
> On Tue, Jan 31, 2017 at 7:47 PM, Steven Joseph <[email protected]> wrote:
>> Hi Luke,
>>
>> Here's the output of
>>
>> $ sysctl fs.file-max
>>
>> fs.file-max = 20000500
>>
>> Regards
>>
>> Steven
>>
>> On Wed, Feb 1, 2017 at 9:30 AM Luke Bakken <[email protected]> wrote:
>>>
>>> Hi Steven,
>>>
>>> What is the output of this command on your systems?
>>>
>>> $ sysctl fs.file-max
>>>
>>> Mine is:
>>>
>>> fs.file-max = 1620211
>>>
>>> --
>>> Luke Bakken
>>> Engineer
>>> [email protected]
>>>
>>>
>>> On Tue, Jan 31, 2017 at 12:22 PM, Steven Joseph <[email protected]>
>>> wrote:
>>> > Hi Shaun,
>>> >
>>> > I'm having this issue again, this time I have captured the system limits,
>>> > while riak is still crashing.
>>> >
>>> > Please note lsof and prlimit outputs at bottom.
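P.S. Roughly what I have in mind for the per-process client, as an untested sketch: the host and port are placeholders for my cluster, and it assumes the usual riak.RiakClient constructor from the Python client.

import os
import riak

# One client per process id, so a forked child never reuses the
# connections its parent opened.
_clients = {}

def get_client():
    pid = os.getpid()
    client = _clients.get(pid)
    if client is None:
        # Placeholder connection settings, for illustration only.
        client = riak.RiakClient(protocol='pbc', host='127.0.0.1', pb_port=8087)
        _clients[pid] = client
    return client

Each task would call get_client() instead of constructing a RiakClient directly, so after a fork the child's first call builds a fresh client under its own pid rather than inheriting the parent's sockets.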
