: there are not many OOM stack details printed in the Solr log file, it's
: just saying "Not enough memory", and it's killed by oom.sh (Solr's script).
not many isn't the same as none ... can you tell us *ANYTHING* about what
the logs look like? ... as i said: it's not just the details of the
Thanks Hoss and Shawn for helping.
There are not many OOM stack details printed in the Solr log file; it's
just saying "Not enough memory", and it's killed by oom.sh (Solr's script).
My question (issue) is not whether it's OOM or not; the issue is why JVM
memory usage keeps growing but never goes down.
: Does it matter which config I use? I am using a custom config instead
: of _default; my config is from Solr 8.6.2 with a custom solrconfig.xml.
Well, it depends on what's *IN* the custom config ... maybe you are using
some built-in functionality that has a bug but didn't get triggered by my
On 1/27/2021 9:00 PM, Luke wrote:
It's killed by an OOME exception. The problem is that I just created empty
collections and the Solr JVM keeps growing and never goes down. There is no
data at all. At the beginning I set Xxm=6G, then 10G, now 15G; Solr 8.7
always uses all of it and it will be killed by oom.sh once JVM usage
Thanks Chris,
Does it matter which config I use? I am using a custom config instead of
_default; my config is from Solr 8.6.2 with a custom solrconfig.xml.
Derrick
Sent from my iPhone
> On Jan 28, 2021, at 2:48 PM, Chris Hostetter wrote:
>
>
> FWIW, I just tried using 8.7.0 to run:
>
FWIW, I just tried using 8.7.0 to run:
bin/solr -m 200m -e cloud -noprompt
And then set up the following bash one-liner to poll the heap metrics...
while : ; do date; echo "node 8983" && (curl -sS
http://localhost:8983/solr/admin/metrics | grep memory.heap); echo "node 7574"
&& (curl
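Chris's one-liner is cut off above; a sketch of the full loop, on the assumption that the second curl mirrors the first against the 7574 node (the `heap_used` helper, the `poll_heap` function name, and the 5-second interval are mine, not from the thread):

```shell
#!/bin/sh
# Helper (name is mine, not Solr's): pull the byte count out of a
# pretty-printed metrics line such as   "memory.heap.used":123456,
heap_used() {
  sed -n 's/.*"memory\.heap\.used":\([0-9]*\).*/\1/p'
}

# Reconstructed polling loop for both nodes of the cloud example;
# runs until interrupted.
poll_heap() {
  while : ; do
    date
    echo "node 8983" && curl -sS http://localhost:8983/solr/admin/metrics | grep memory.heap
    echo "node 7574" && curl -sS http://localhost:7574/solr/admin/metrics | grep memory.heap
    sleep 5   # assumed interval
  done
}
```

Pipe a saved metrics response through `heap_used` to get just the used-heap number for graphing.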
: Hi, I am using Solr 8.7.0, CentOS 7, Java 8.
:
: I just created a few collections with no data; memory keeps growing but
: never goes down, until I get an OOM and Solr is killed.
Are you using a custom config set, or just the _default configs?
if you start up this single node with something like
and here is the GC log when I create a collection (just creating the
collection, nothing else):
{Heap before GC invocations=1530 (full 412):
garbage-first heap total 10485760K, used 10483431K [0x00054000,
0x000540405000, 0x0007c000)
region size 4096K, 0 young (0K), 0 survivors (0K)
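For reference, "Heap before GC" blocks like the one above come from HotSpot's Java 8 GC-logging flags, which Solr's solr.in.sh passes through `GC_LOG_OPTS`; a fragment along these lines would produce them (the exact flag set here is an assumption, not taken from the thread):

```shell
# solr.in.sh fragment -- Java 8 HotSpot flag names; Java 9+ replaced
# these with the unified -Xlog:gc* syntax.
GC_LOG_OPTS="-verbose:gc -XX:+PrintHeapAtGC -XX:+PrintGCDetails \
-XX:+PrintGCDateStamps -XX:+PrintGCTimeStamps"
```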
Mike,
No, it's not Docker. It is just one Solr node (service) which connects to
an external ZooKeeper; below is the JVM setting and memory usage.
There are 25 collections with only about 2000 documents in total. I am
wondering why Solr uses so much memory.
Are you running these in docker containers?
Also, I’m assuming this is a typo but just in case the setting is Xmx :)
Can you share the OOM stack trace? It’s not always running out of memory,
sometimes Java throws OOM for file handles or threads.
Mike
On Wed, Jan 27, 2021 at 10:00 PM Luke
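Mike's point about OOM causes can be checked mechanically, because the `java.lang.OutOfMemoryError` message text distinguishes the cases. A small helper for triaging a suspicious line from solr.log (the function name is mine; the quoted messages are the standard HotSpot/libc ones):

```shell
#!/bin/sh
# Classify a suspicious solr.log line by its error message.
classify_oom() {
  case "$1" in
    *"unable to create new native thread"*) echo "threads" ;;      # check ulimit -u
    *"Too many open files"*)                echo "file handles" ;; # IOException; check ulimit -n
    *"Java heap space"*|*"GC overhead limit exceeded"*) echo "heap" ;;
    *"Metaspace"*)                          echo "metaspace" ;;
    *)                                      echo "unknown" ;;
  esac
}
```

Note that the thread and file-handle cases are fixed by raising process limits, not by a bigger -Xmx.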
Shawn,
It's killed by an OOME exception. The problem is that I just created empty
collections and the Solr JVM keeps growing and never goes down. There is no
data at all. At the beginning I set Xxm=6G, then 10G, now 15G; Solr 8.7
always uses all of it and it will be killed by oom.sh once JVM usage
On 1/27/2021 5:08 PM, Luke Oak wrote:
I just created a few collections with no data; memory keeps growing but
never goes down, until I get an OOM and Solr is killed.
Any reason?
Was Solr killed by the operating system's oom killer or did the death
start with a Java OutOfMemoryError exception?
If
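Shawn's two cases leave traces in different places: the OS oom-killer logs to the kernel log, while a Java-level death leaves `OutOfMemoryError` in solr.log before Solr's OnOutOfMemoryError script kills the process. A sketch of the log-side check (the function name is mine):

```shell
#!/bin/sh
# Did the JVM die from a Java-level OutOfMemoryError? Checks the given log file.
java_oome_in_log() {
  grep -q "java.lang.OutOfMemoryError" "$1"
}

# Kernel-side check for the OS oom-killer (run on the host; may need root):
#   dmesg | grep -i 'killed process'
#   journalctl -k | grep -i 'out of memory'
```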
Hi, I am using Solr 8.7.0, CentOS 7, Java 8.
I just created a few collections with no data; memory keeps growing but
never goes down, until I get an OOM and Solr is killed.
Any reason?
Thanks
Sent from my iPhone