Re: Solr objects consuming more GC (Garbage collector) on our application
Well, this is a user's list, not a paid support channel. People here volunteer their time and expertise.

First of all, Solr 4.2 is very old. From what you're showing, you've simply grown too big for the server and are running into memory issues. Your choices are:

1> get a bigger machine and allocate more space to the JVM
2> use SolrCloud and split the index into 2 shards
3> use docValues for all fields that are used for sorting, grouping or faceting (although I frankly don't remember if docValues were available in 4.2)

Second, your stack traces reference SolrNet, which is a separate project. Not many people on this list can help with that.

Best,
Erick

On Mon, Jun 25, 2018 at 3:49 AM, Jagdeeshwar S wrote:
> Can you please update on this?
> [snip]
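For option 3 above, enabling docValues is a schema.xml change on the Solr side. A minimal sketch, assuming a hypothetical string facet field (the field and type names are illustrative, not taken from the original report); note that re-indexing is required after changing a field to docValues:

```xml
<!-- Hypothetical field used for faceting. docValues="true" keeps the
     column-oriented structure on disk instead of building the field
     cache on the JVM heap, which reduces GC pressure on the Solr
     server for sort/group/facet fields. -->
<field name="category" type="string" indexed="true" stored="true"
       docValues="true"/>
```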
RE: Solr objects consuming more GC (Garbage collector) on our application
Can you please update on this?

From: Jagdeeshwar S [mailto:jagdeeshw...@revalsys.com]
Sent: 22 June 2018 10:41
To: 'solr-user@lucene.apache.org'
Cc: 'Raj Samala'
Subject: Solr objects consuming more GC (Garbage collector) on our application

Hi Support,

We are using Solr 4.2.0 for one of our ecommerce applications, where we store our entire catalogue of products.

The application is developed in ASP.NET 4.5.2 and hosted in IIS 8.5.

The ecommerce application flow is:

1. Home page
2. Product listing page
3. Product detail page
4. Search products listing page
5. Cart flow (add to cart, login and payment)

In the above flow, we use Solr in the first 4 steps, where users come and browse for products; the DB is only called from the 5th step.

We have been using Solr for the last 4 years, but recently we encountered high CPU on the servers.

We captured logs at the same time and found that Solr objects are consuming more memory (GC).

Below are the log details. Can you please help us identify the issue?

From the analysis below it is pretty clear that the high CPU issue is due to garbage collection being triggered very frequently by the high allocation rate from within your application.

The objects whose allocations are on the higher side are listed below. They are all rooted to the SolrNet component highlighted above in green.

0:105> lmvm SolrNet
start             end                 module name
004c`57d5         004c`57db           SolrNet    (deferred)
    Image path: SolrNet.dll
    Image name: SolrNet.dll
    Using CLR debugging support for all symbols
    Has CLR image header, track-debug-data flag not set
    Timestamp: Tue Apr 16 03:52:53 2013 (516C7DBD)
    CheckSum:
    ImageSize: 0006
    File version: 0.4.0.2002
    Product version: 0.4.0.2002
    File flags: 0 (Mask 3F)
    File OS: 4 Unknown Win32
    File type: 2.0 Dll
    File date: .
    Translations: .04b0
    ProductName: SolrNet
    InternalName: SolrNet.dll
    OriginalFilename: SolrNet.dll
    ProductVersion: 0.4.0.2002
    FileVersion: 0.4.0.2002
    FileDescription: SolrNet
    LegalCopyright: Copyright Mauricio Scheffer 2007-2013
    Comments: SolrNet

CPU utilization: 100%
Worker Thread: Total: 53 Running: 18 Idle: 35 MaxLimit: 800 MinLimit: 8
Work Request in Queue: 0
Number of Timers: 2
Completion Port Thread: Total: 4 Free: 4 MaxFree: 16 CurrentLimit: 4 MaxLimit: 800 MinLimit: 200

The top 10 threads consuming the most CPU cycles are below:

Thread ID | User Time
==
58 | 0 days 0:00:26.812
64 | 0 days 0:00:23.750
55 | 0 days 0:00:23.718
75 | 0 days 0:00:22.546
47 | 0 days 0:00:21.875
46 | 0 days 0:00:21.625
63 | 0 days 0:00:18.953
22 | 0 days 0:00:18.921
24 | 0 days 0:00:18.453
28 | 0 days 0:00:18.359
==

Taking one of these threads at random, I could see the below callstack:

0:064> kL
 # Child-SP          RetAddr           Call Site
00 004c`5ea1ab38 7ffa`057e1118 ntdll!ZwWaitForSingleObject+0xa
01 004c`5ea1ab40 7ff9`fdc07a1f KERNELBASE!WaitForSingleObjectEx+0x94
02 004c`5ea1abe0 7ff9`fdc079d7 clr!CLREventWaitHelper2+0x3c
03 004c`5ea1ac20 7ff9`fdc07958 clr!CLREventWaitHelper+0x1f
04 004c`5ea1ac80 7ff9`fdc14c2d clr!CLREventBase::WaitEx+0x7c
05 (Inline Function) clr!CLREventBase::Wait+0x`fffa63f1
06 004c`5ea1ad10 7ff9`fdc14ef4 clr!SVR::gc_heap::wait_for_gc_done+0x66
07 004c`5ea1ad40 7ff9`fdc06709 clr!SVR::GCHeap::GarbageCollectGeneration+0x108
08 (Inline Function) clr!SVR::gc_heap::try_allocate_more_space+0x535
09 (Inline Function) clr!SVR::gc_heap::allocate_more_space+0x54a
0a (Inline Function) clr!SVR::gc_heap::allocate+0x5a1
0b (Inline Function) clr!SVR::GCHeap::Alloc+0x601
0c (Inline Function) clr!Alloc+0x961
0d (Inline Function) clr!AllocateObject+0x9e3
0e 004c`5ea1ada0 7ff9`a0190d0a clr!JIT_New+0xac9
0f 004c`5ea1b1e0 7ff9`a018fb43 SolrNet!SolrNet.Impl.FieldParsers.AggregateFieldParser.CanHandleType(System.Type)+0x3a
10 004c`5ea1b220 7ff9`a018f9c6 SolrNet!SolrNet.Impl.DocumentPropertyVisitors.RegularDocumentVisitor.Visit(System.Object, System.String, System.Xml.Linq.XElement)+0xe3
11 004c`5ea1b290 7ff9`a018f7d1
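The callstack above shows the worker thread blocked in GC triggered by allocations inside SolrNet's response parsing. Independent of the client library, one common way to reduce this kind of allocation pressure is to return less data per query: Solr's fl parameter restricts which fields come back and rows caps the page size, so the client parses (and allocates) fewer objects. A minimal sketch of building such a request URL, with hypothetical host, core, and field names:

```python
from urllib.parse import urlencode

# Hypothetical Solr 4.x core URL; not taken from the original report.
base = "http://localhost:8983/solr/collection1/select"

# fl restricts the returned fields and rows caps the page size; both
# shrink the response the client library must parse per request.
params = {"q": "categoryid:123", "fl": "id,name,price", "rows": 20, "wt": "json"}
url = base + "?" + urlencode(params)
print(url)
```

The same parameters apply whether the request is issued by SolrNet, curl, or any other HTTP client.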
Re: Solr objects consuming more GC (Garbage collector) on our application
SolrNet is not part of the Apache Solr project and isn't supported by it. I thought the SolrNet project was abandoned, but it seems work has been done on it recently. You might have better luck asking on the SolrNet GitHub page: https://github.com/SolrNet/SolrNet.

Best,
Chris

On Fri, Jun 22, 2018 at 11:51 AM Jagdeeshwar S wrote:
> Hi Support,
>
> We are using Solr 4.2.0 for one of our ecommerce applications, where we
> store our entire catalogue of products.
> [snip]