On 1/19/2018 8:54 AM, Pouliot, Scott wrote:
I do have a ticket in with our systems team to up the file handles since I am seeing the "Too
many open files" error on occasion on our prod servers.  Is this the setting you're referring
to?  Found we were set to 1024 using the "ulimit" command.

No, but that often needs increasing too.  I think you need to increase the process limit even if that's not the cause of this particular problem.

Sounds like you're running on Linux, though ulimit is probably available on other platforms too.

If it's Linux, generally you must increase both the process limit and the open file limit in /etc/security/limits.conf.  Trying to raise them with the ulimit command generally doesn't work, because ulimit can't push a limit past the hard limit already configured for the account.  If it's not Linux, then you'll need to consult with an expert in the OS you're running.
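
As a quick check (a sketch of the usual bash commands -- exact output will vary), you can compare the soft and hard limits for the current shell:

  ulimit -Sn ; ulimit -Hn    # soft and hard open file limits
  ulimit -Su ; ulimit -Hu    # soft and hard process limits

An unprivileged user can raise the soft value up to the hard value, but raising the hard value requires either root privileges or an entry in limits.conf that gets applied at the next login (via pam_limits).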

Again, assuming Linux, in the output of "ulimit -a" the value I'm talking about is the "-u" value -- "max user processes".  The following are the additions that I typically make to /etc/security/limits.conf, to increase both the open file limit and the process limit for the solr user:

solr            hard    nproc   61440
solr            soft    nproc   40960

solr            hard    nofile  65535
solr            soft    nofile  49151
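
Once those lines are in place (a sketch of the usual procedure -- adjust the username to whatever account runs Solr), the new limits only apply to fresh logins, so restart Solr from a new session and verify with something like:

  su - solr -c "ulimit -u -n"

If the numbers don't match limits.conf, check that pam_limits is enabled for the login path you're using, and note that services started directly by systemd don't read limits.conf at all -- those need LimitNOFILE/LimitNPROC set in the unit file instead.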

Are you running into problems where Solr just disappears?  I would expect hitting the process limit to generate OutOfMemoryError exceptions -- typically "unable to create new native thread".  When Solr is started with the included shell script, unless it's running with the foreground option, an OOME will kill the Solr process.  There are open issues to bring that OOME death handling to foreground mode and to running on Windows.
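
For context (an illustration from memory, not an exact quote of the script), bin/solr arms the JVM with a kill hook along the lines of:

  -XX:OnOutOfMemoryError="/path/to/solr/bin/oom_solr.sh <port> <logs dir>"

so the first OOME stops the node outright instead of leaving it running in an unpredictable half-broken state.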

Thanks,
Shawn
