Why not deal with this at the database level, which is so much more efficient?
Rick
On 11/15/06, Terry Bootsma <[EMAIL PROTECTED]> wrote:
I have seen this issue many times, whether it was invoked
via an escalation, a filter, a filter-side table loop, or
some other mechanism.
The only scalable and reliable method I could come up with
was to write an external ARSPerl script, called via a
scheduling tool (which one depends on whether you are on
Unix or Windows), that would query all entries needing
modification and modify each of them individually, keeping
track of the total number to be modified and handling any
exceptions that occurred along the way.
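A minimal sketch of Terry's approach, in Python rather than ARSPerl (the query step is omitted, and `modify_entry` is a hypothetical stand-in for the real API call, e.g. ars_SetEntry):

```python
def process_entries(entry_ids, modify_entry):
    """Modify each entry individually, tracking the total modified
    and collecting exceptions instead of aborting the whole run."""
    modified = 0
    failures = []
    for eid in entry_ids:
        try:
            modify_entry(eid)
            modified += 1
        except Exception as exc:
            # Record the failure and keep going with the rest.
            failures.append((eid, str(exc)))
    return modified, failures

# Demo with a stub modify function that fails on one entry.
def fake_modify(eid):
    if eid == 3:
        raise RuntimeError("lock timeout")

ok, failures = process_entries(range(10), fake_modify)
print(ok, len(failures))  # 9 1
```

Because each entry is a separate call, one bad record doesn't roll back or stall the other 39,999.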
HTH
Terry
On Wed, 15 Nov 2006 08:00:07 -0700
Chris Woyton <[EMAIL PROTECTED]> wrote:
> I've hit this wall many times, and how you handle it is largely
> dependent on where the bottleneck lies.
>
> Have you logged the transaction to see if your DB is hanging on the
> query for the View Form, or your workflow during Filter processing,
> or...?
>
> Chris Woyton
>
> -----Original Message-----
> From: Action Request System discussion list (ARSList)
> [mailto:[email protected]] On Behalf Of Kelly Logan
> Sent: Wednesday, November 15, 2006 7:43 AM
> To: [email protected]
> Subject: Large transactions
>
>
> Hello All,
>
> I am looking at alternatives to a current integration
>process that
> uses external .CSV files and Remedy Import called by
>runmacro. (We
> have had some issues with file permissions and
>runmacro.)
>
> I have set up a View form to a table with the data and
>added some
> display only fields to it for translation and workflow,
>the principal
> one being a zTmpProcess field (Yes/No). I have one
>filter that pushes
> a value of "Yes" to this field on the form for all
>entries (~40,000),
> and a set of filters that fire off of Modify to
>translate some of the
> information (set other temp fields to mapped values)
>before pushing it
> to SHR:People.
>
> The process works fine when I run it manually for 1,000
>or so entries,
> but when I let the workflow kick off a change to all
>40K, I get a
> database timeout. I'm considering some chunking
>strategies like using
> an SQL statement to set another flag and loop through,
>processing
> 1,000 at a time, but I thought I'd check to see if
>anyone else has
> done something similar, or if there's a simpler strategy
>that may be
> eluding me at the moment.
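[The flag-and-loop chunking idea above can be sketched as follows, using Python with sqlite3 purely for illustration; the `people_view` table and `pending` flag column are hypothetical names, and the real work per batch would be the push to the target form:]

```python
import sqlite3

BATCH = 1000

# Stand-in for the view-form table: 2,500 rows, all flagged pending.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE people_view (entry_id INTEGER PRIMARY KEY, pending INTEGER)")
for _ in range(2500):
    conn.execute("INSERT INTO people_view (pending) VALUES (1)")

processed = 0
while True:
    # Grab the next batch of flagged entries.
    rows = conn.execute(
        "SELECT entry_id FROM people_view WHERE pending = 1 LIMIT ?",
        (BATCH,)).fetchall()
    if not rows:
        break
    ids = [r[0] for r in rows]
    # ... translate and push each entry to the target form here ...
    # Clear the flag so the next pass only picks up remaining work.
    conn.executemany(
        "UPDATE people_view SET pending = 0 WHERE entry_id = ?",
        [(i,) for i in ids])
    conn.commit()  # committing per batch keeps each transaction small
    processed += len(ids)

print(processed)  # 2500
```

Each pass touches at most 1,000 rows, so no single transaction grows large enough to hit the database timeout.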
>
> Thanks in advance for your time,
>
> Kelly Logan
> Cybernetic Solutions, Inc.
>
> ___________________________________________________________________________
> UNSUBSCRIBE or access ARSlist Archives at www.arslist.org
> ARSlist: "Where the Answers Are"

