Hi,
If you run the archiving scripts often, each run has fewer records to
process, so the impact will be smaller.
In any event, the impact should be light, as RRR|Chive normally does not
trigger any table scans.
If a lot of records are archived ("deleted") at the same time, this might
have a small impact.
Records are merged and deleted one at a time, which minimizes the impact.
The records are retrieved in chunks of 100, which is not much.
You can even set a parameter to slow things down further; this example
would insert a 2-second pause between each record:
nicepausetime=2
You can also decrease the chunk size from 100 to a lower value, for
example 1 record at a time:
multientrychunksize=1
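Assuming both settings go in the same RRR|Chive configuration file, a
combined example might look like this (the comment lines are my own
annotation, not required syntax):

```
# Hypothetical combined throttling example:
# fetch 1 record per chunk, pause 2 seconds between records
nicepausetime=2
multientrychunksize=1
```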
I usually do not use these settings, as the impact is minimal anyway. In
any event, nicepausetime=1 would limit the rate to 3600 archived records
per hour...
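The throttling mechanism described above can be sketched as follows. This
is a hypothetical illustration of the concept, not RRR|Chive code; the
function name and structure are my own assumptions:

```python
import time

def archive_throttled(records, chunk_size=100, pause=1.0):
    """Illustrative sketch: retrieve records in chunks, then merge and
    delete them one at a time, pausing between each record."""
    archived = 0
    for start in range(0, len(records), chunk_size):
        # Retrieve one chunk of records (multientrychunksize)
        for record in records[start:start + chunk_size]:
            # ... merge record to the archive form, then delete it ...
            archived += 1
            time.sleep(pause)  # per-record pause (nicepausetime)
    return archived

# With a 1-second pause per record, throughput is capped at
# 3600 seconds/hour / 1 second/record = 3600 records per hour.
max_per_hour = int(3600 / 1.0)
```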
Best Regards - Misi, RRR AB, http://www.rrr.se (ARSList MVP 2011)
Products from RRR Scandinavia (Best R.O.I. Award at WWRUG10/11):
* RRR|License - Not enough Remedy licenses? Save money by optimizing.
* RRR|Log - Performance issues or elusive bugs? Analyze your Remedy logs.
Find these products, and many free tools and utilities, at http://rrr.se.
> Thanks Misi and Jose for your reply.
>
> Misi,
>
> Changing the Filter qualification on the source form is a good idea.
>
> If I propose RRRchive as a solution for archiving, the first concern
> that comes to everyone's mind will be the performance impact on the
> application server, as it uses the API.
>
> Can you please share more of your suggestions for overcoming
> application slowness?
>
> Regards
> Robin
>
> _______________________________________________________________________________
> UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
> attend wwrug12 www.wwrug12.com ARSList: "Where the Answers Are"
>