I think we ended up setting ours to 10000 rows as well, and I don't think we've 
had any issues since.

From: Action Request System discussion list(ARSList) 
[mailto:arslist@ARSLIST.ORG] On Behalf Of Ortega, Jesus A
Sent: Wednesday, August 10, 2011 12:28 PM
To: arslist@ARSLIST.ORG
Subject: Re: java.lang.OutOfMemoryError: Java heap space.

I have been having this problem for two months and have a critical ticket 
open with BMC. We were on Mid-Tier 7.5.3. It took almost two months to talk to a 
US-based engineer and get the proper support. They had me do a number of things 
(rough examples of these settings follow below):
- Patch the mid-tier to 7.5.7
- Decrease the perm size to 128m and the max to 256m (I had it at 256m-512m)
- Turn on persistent caching; this caused some instability, so I turned it back off
- Set the following in Tomcat's configuration page under the Java options tab: 
-XX:+HeapDumpOnOutOfMemoryError
- Fix a setting in the mid-tier config.properties file that is set incorrectly 
out of the box and has to do with ehcache: 
arsystem.ehcache.referenceMaxElementsInMemoryWeight.formFields ships as 28.577 
and should be set to 8.577
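
For reference, on my install those changes look roughly like the lines below. 
Treat this as a sketch of my environment, not something BMC dictated word for 
word: the heap dump path is just an example location I picked, and your 
config.properties may live in a different spot.

    Tomcat Java options (tomcat5w.exe, Java tab):
        -XX:PermSize=128m
        -XX:MaxPermSize=256m
        -XX:+HeapDumpOnOutOfMemoryError
        -XX:HeapDumpPath=C:\temp\midtier_dumps

    config.properties (under WEB-INF\classes in the mid-tier install):
        # ships as 28.577 out of the box; support said to use 8.577
        arsystem.ehcache.referenceMaxElementsInMemoryWeight.formFields=8.577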

After all of this, we still had crashes every Monday and Wednesday morning. In 
the end, drum roll please, it was the AR Server administration setting "Max 
Entries Returned by GetList" on the Server Information > Configuration tab. It 
was set to 0 (unlimited), which let a user run a search from the mid-tier that 
returned thousands of rows. It turned out that some users would run searches in 
Asset Management that requested 20000 - 60000 rows and consumed about 450 MB of 
Java heap. Since our heap is set to 1024 MB and our server runs at an average of 
around 700 MB, that 450 MB query would cause the Java heap errors. I first set 
the GetList maximum to 1000 rows, but this caused AR System to start crashing, 
so I bumped it up to 10000 rows; it seems to be working okay and Tomcat has not 
crashed in a week. Check the Max Entries Returned by GetList setting and see if 
that helps. A question for the other admins: what is the happy medium for 
GetList that will not cause AR System to crash? I am still going through trial 
and error, but 10000 rows seems to be okay for now.
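
If anyone wants to check this outside the GUI: as far as I can tell, the "Max 
Entries Returned by GetList" field corresponds to the Max-Entries-Per-Query line 
in the server's ar.cfg (ar.conf on Unix). The snippet below is just a sketch of 
what we ended up with, not an official recommendation from BMC:

    # ar.cfg on the AR System server (ar.conf on Unix)
    # 0 means unlimited; 1000 was too low for us, 10000 has held up so far
    Max-Entries-Per-Query: 10000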


________________________________
From: Action Request System discussion list(ARSList) 
[mailto:arslist@ARSLIST.ORG] On Behalf Of Michele Mizell
Sent: Monday, July 25, 2011 10:22 AM
To: arslist@ARSLIST.ORG
Subject: java.lang.OutOfMemoryError: Java heap space.

All,
We are intermittently receiving "java.lang.OutOfMemoryError: Java heap space" 
in our mid-tier logs.
We have two load-balanced servers running Apache Tomcat 5.5.28 and IIS 6.
Our min and max memory pool for Java are set to 1024 per BMC's recommendation.
These are dedicated servers:
Windows 2003 Standard x64 Edition SP2

Our heap memory is constantly maxed out. Has anyone experienced this error in 
your environment? BMC support has not been helpful in determining the cause 
and solution.
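
For reference, the memory pool is set through the Tomcat Windows service 
(tomcat5w.exe, Java tab); as I understand it, an initial/maximum pool of 1024 on 
that tab amounts to the JVM arguments below, so this is just a sketch of what 
the setting translates to:

    -Xms1024m
    -Xmx1024m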




Sincerely,



Michele Mizell

Application Maintenance

Software Engineer 1

JCPenney Co., Inc.

Ext.11088

mmi...@jcpenney.com






