We hit the same problem moving ~100,000 records.

If the DSO process were 64-bit we'd be OK.

We now have a Windows scheduled task that checks the memory used by the DSO
process and kills it if it exceeds 2 GB.

This works for us. 
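Roughly, the task boils down to the sketch below (Python for illustration; the polling interval, the way the working-set size is obtained, and the taskkill invocation are all assumptions - our real task is just a scheduled script, not necessarily this code):

```python
import subprocess

# 2 GB - a 32-bit DSO process falls over around this point.
LIMIT_BYTES = 2 * 1024**3

def over_limit(rss_bytes, limit=LIMIT_BYTES):
    """Return True when the process working set exceeds the threshold."""
    return rss_bytes > limit

def check_and_kill(pid, rss_bytes):
    """Kill the DSO process if its memory use has crossed the limit.

    The taskkill call is the Windows way to force-terminate a process by
    PID; how you obtain pid and rss_bytes (e.g. from tasklist output or a
    process-monitoring library) is left out of this sketch.
    """
    if over_limit(rss_bytes):
        subprocess.run(["taskkill", "/PID", str(pid), "/F"], check=False)
        return True
    return False
```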

-----Original Message-----
From: Action Request System discussion list(ARSList)
[mailto:[email protected]] On Behalf Of pritch
Sent: 06 February 2013 19:31
To: [email protected]
Subject: Re: DSO and large number of records

The way I've done that in the past (with a large amount of data) is:

-  Set up the filter with the DSO action for the going-forward updates.
-  Disable DSO (allowing those pending records to queue up).
-  Export the table from the source system and import it into the
target system.
-  Restart DSO on the source system to allow the additional updates to go
across.

There may be a bit of redundancy, in that you may export/import records
that are also slated for transfer in the queue, but it doesn't hurt
anything.

----- Original Message -----
From: "Frederick W Grooms" <[email protected]>
To: [email protected]
Sent: Wednesday, February 6, 2013 2:17:40 PM
Subject: Re: DSO and large number of records

For a one-time process (such as setting up archiving) you can export the data
needed and use the Import tool to add records to the DSO Pending form.  It
is easy enough to see exactly what the data in the DSO Pending form needs to
look like.

After the archiving is set up you can use an escalation to continually copy
the data over.

Fred

-----Original Message-----
From: Action Request System discussion list(ARSList)
[mailto:[email protected]] On Behalf Of Kulkarni, Vikrant
Sent: Wednesday, February 06, 2013 12:56 PM
To: [email protected]
Subject: Re: DSO and large number of records

Hi Abhijeet,

I am using a DSO action in an escalation because I want to copy data to an
archive server. And no, we are not using a private queue for DSO.
If the escalation has more than 100K records to process via the DSO action,
I see the malloc error.

And by one go I mean without splitting the data into smaller chunks using
Run If qualifications.
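For what it's worth, the chunking I mean would look roughly like this (a sketch only; the 15-character zero-padded Entry ID format and the per-chunk boundaries are assumptions for illustration, not our actual qualifications):

```python
def chunk_qualifications(total, chunk_size, width=15):
    """Build Run If qualification strings that cover Entry IDs 1..total
    in chunks, so each escalation pass touches a bounded record count.

    Field 1 is the Entry ID; AR System stores it as a zero-padded string,
    hence the zfill. Both width and the ID range are illustrative.
    """
    quals = []
    for start in range(1, total + 1, chunk_size):
        end = min(start + chunk_size - 1, total)
        lo = str(start).zfill(width)
        hi = str(end).zfill(width)
        quals.append(f"'1' >= \"{lo}\" AND '1' <= \"{hi}\"")
    return quals
```

Each returned string would go into the Run If of a separate escalation (or successive runs of one escalation), keeping any single DSO pass well under the size that triggers the malloc error.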

Thanks,
Vikrant

-----Original Message-----
From: Action Request System discussion list(ARSList)
[mailto:[email protected]] On Behalf Of Gadgil, Abhijeet
Sent: Thursday, February 07, 2013 12:23 AM
To: [email protected]
Subject: Re: DSO and large number of records

Are you moving these after an update (modify) is done on each record?
Also, what do you mean by "one go"?
For filter-based actions (like the ones on which DSO would trigger), events
would really be transactions being executed on a queue.
Are you using a private queue for DSO?

Abhijeet


On 07-Feb-2013, at 12:16 AM, "Vikrant" <[email protected]> wrote:

> Hi List,
> 
> I would like to know the maximum number of records any of you have
tried to move via DSO in one go. I have about 3 million records in one form
and I need to move them using DSO. Even with 100,000 records I see
the malloc error on the server while the escalation tries to search for the
records that need to be moved to the pending form.
> 
> The few config values that I can think of and have set as below:
> 
> 1) Max retrieve in get list call = 5000
> 2) Max number of records for DSO to process = 5000
> 
> Is there anything else I need to check or do better?
> 
> How can I get rid of the "malloc failed" error, which I am guessing is due
to the large number of records in the form?
> 
> We are on ARS 7.5 P4, ITSM 7.6, SRM 7.6, SLM 7.6, all on Windows with SQL
as a remote DB.
> 
> Any help is appreciated. 
> 
> Thanks,
> Vikrant

_______________________________________________________________________________
UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
"Where the Answers Are, and have been for 20 years"


