I ask because I know appending to a 1 G file takes a lot longer (in computer 
time) than appending to a 1 M file.  I was wondering if anyone was aware of a 
practical limit?
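If it helps, this is easy to measure directly. A minimal sketch (Python, with made-up file names; assumes a local filesystem like the Linux box mentioned below):

import time

def make_file(path, size_bytes):
    # Pre-create a file of the given size (a sparse file is fine for this test).
    with open(path, "wb") as f:
        f.truncate(size_bytes)

def time_appends(path, n=10000, chunk=b"x" * 200):
    # Append n roughly-200-byte lines, mimicking log writes, and time the total.
    start = time.perf_counter()
    with open(path, "ab") as f:
        for _ in range(n):
            f.write(chunk)
    return time.perf_counter() - start

make_file("small.log", 1024 * 1024)          # ~1 M
make_file("big.log", 1024 * 1024 * 1024)     # ~1 G
print("1M file:", time_appends("small.log"), "seconds")
print("1G file:", time_appends("big.log"), "seconds")

On most filesystems the two timings should come out nearly identical, since an append only touches the end of the file, not its existing contents.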

Anne Ramey

From: Action Request System discussion list (ARSList)
[mailto:arsl...@arslist.org] On Behalf Of Lyle Taylor
Sent: Wednesday, April 21, 2010 12:09 PM
To: arslist@ARSLIST.ORG
Subject: Re: Log size and server performance

Well, this isn't a definitive answer by any means, but my suspicion is that the log file size should be pretty much irrelevant from a performance perspective, since the server is just appending to the end of the existing file, which is a quick operation regardless of how large that file already is. The more important point is that if you're getting that much logging output, just having logging on at all is probably impacting performance on the server. So, if the performance of the system seems acceptable with logging turned on, you should be able to let it run as long as you want, at least until you either reach your maximum file size or fill up the file system you're logging to, without any additional performance impact from the size of the log files. Now, how to do something useful with such large files is another question...
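For digging through a file that size, streaming it line by line rather than opening it in an editor keeps memory use flat no matter how big the log gets; a rough sketch (the file name and pattern are placeholders, not ARS conventions):

import re

# Stream a multi-hundred-MB log line by line, keeping only lines of interest.
pattern = re.compile(r"ERROR|Escalation")  # placeholder pattern; adjust to taste

with open("arfilter.log", "r", errors="replace") as f:
    for line in f:
        if pattern.search(line):
            print(line, end="")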

Lyle

From: Action Request System discussion list (ARSList)
[mailto:arsl...@arslist.org] On Behalf Of Ramey, Anne
Sent: Wednesday, April 21, 2010 9:49 AM
To: arslist@ARSLIST.ORG
Subject: Log size and server performance

We are looking at capturing more useful logging to try to catch some intermittent problems in production that we can't seem to reproduce in test. The problem is that the arfilter log on our server that runs escalations is currently 50M and holds only about 2 minutes' worth of information. This is, obviously, because of the notifications, but I'm curious how large I can let my log files grow before I start to see a performance hit. Any ideas/experiences?

ITSM 7.0.03 P9
ARS 7.1 P6
Linux
Oracle

It looks like 100M would catch a 1/2 hour of information or longer in all logs except the arfilter (but we have to set all of the log files to the same size). 500M might get us a 1/2 hour in the filter log, but the other logs would be unnecessarily big, and I'm wondering whether having all of the logs that size could cause server response time to slow.
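A quick back-of-the-envelope, assuming the 50M-per-2-minutes filter-log rate described above holds steady (in which case 500M would cover closer to 20 minutes than a 1/2 hour):

# Retention window per file size at ~25M/min (50M per 2 minutes).
rate_mb_per_min = 50 / 2
for size_mb in (100, 500, 750):
    print(f"{size_mb}M holds about {size_mb / rate_mb_per_min:.0f} minutes")
# 100M -> ~4 min, 500M -> ~20 min, 750M -> ~30 min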

Anne Ramey
