Hi all,

We have a problem where our ES nodes fail with an out-of-memory error from Linux (note: from the OS, not from Java). Our ES processes are configured with a fixed heap size (60% of total RAM, just as in the elasticsearch Chef cookbook).
So something is consuming all of the memory available to Linux. Is there any other memory ES can use beyond the heap? Does it lock OS cache or buffer memory so that it can't be released? Does opening lots of files use up a significant amount of RAM? Is it doing off-heap allocation? (I'm fairly sure the answer to the last one is no.) We're struggling to pin down the exact memory resource being exhausted.

For the record, this is ES 1.1.0 on CentOS 6.4 running in VMware.

Thanks!
Edward
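To make the question concrete, the comparison I'm trying to make is the JVM's own heap/non-heap numbers versus the resident size the OS sees for the process. Below is a minimal sketch of that check (assumptions: the HTTP API is reachable on localhost:9200, and the node stats layout is the usual 1.x one with jvm.mem and process.mem sections; adjust the host and fields for your setup):

    import json

    try:
        from urllib.request import urlopen  # Python 3
    except ImportError:
        from urllib2 import urlopen  # Python 2 (stock on CentOS 6)

    # Assumption: ES HTTP API on localhost:9200.
    STATS_URL = "http://localhost:9200/_nodes/stats"

    stats = json.load(urlopen(STATS_URL))

    for node_id, node in stats["nodes"].items():
        jvm_mem = node.get("jvm", {}).get("mem", {})
        proc_mem = node.get("process", {}).get("mem", {})
        heap = jvm_mem.get("heap_used_in_bytes", 0)
        non_heap = jvm_mem.get("non_heap_used_in_bytes", 0)
        rss = proc_mem.get("resident_in_bytes", 0)
        # If RSS is much larger than heap + non-heap, the memory is being held by
        # something outside the JVM's own accounting (e.g. direct buffers or
        # mmapped files) rather than by the configured heap.
        print("%s heap=%dMB non_heap=%dMB rss=%dMB" % (
            node.get("name", node_id),
            heap / 1024 / 1024,
            non_heap / 1024 / 1024,
            rss / 1024 / 1024,
        ))

In our case the gap between RSS and heap keeps growing until the kernel kills the process, which is why I'm asking what else could be allocating.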
