On Oracle I would create an external table referring to the file, and
then copy records from this external table using standard SQL.

In a quick test with a file of 25,592 records (20 columns), the data was
"imported" into an ordinary Oracle table in just 3.5 seconds.

--
Jarl


On 11/16/06, Webster, Basil <[EMAIL PROTECTED]> wrote:



Hi Kelly,



I'm not sure if I'm completely off track here, but it sounds like you need
to produce a CSV file for an integration with some other app. If so, my
story below will make sense, and I agree with Rick.



We had to do an integration to SAP where the customer did not want to
purchase the approved connector. So we created a staging db, with view forms
pointing to its tables, which we push to and get data from. They pass files
to us, which are collected by a middleware app and placed in a specific
folder on the server. The middleware then triggers a SQL DTS package, which
reads the file and imports it into the relevant table on the staging db. We
then use AR System to collect the data and bring it into the ARSystem db for
use by the customer service desk. The push works the same way, just in
reverse. We use text files, but DTS can also handle CSV, and DTS packages
can be scheduled.
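
DTS packages are normally built in the designer rather than written by hand,
but the core load step amounts to a T-SQL bulk load. A rough sketch of that
step, with hypothetical table and file names:

    -- Hypothetical staging table and file path; the real DTS package
    -- also handles the column mappings and error rows.
    BULK INSERT Staging.dbo.SAP_Inbound
    FROM '\\appserver\integration\inbound\sap_feed.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);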



Bit of a story, but the point I wanted to make is that you can do it all at
the db level, and it's far quicker. We too have flags that the DTS package
uses to identify the records it must take; once done, it flags those
records as completed. I'm not sure of your setup, but from the above you can
see we are using SQL Server.
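
That flag handshake is just two statements; a sketch with hypothetical
table, column, and flag names:

    -- Hypothetical names. The pickup step selects only rows it has
    -- not yet taken...
    SELECT Request_ID, Login_Name, Full_Name
    FROM   Staging.dbo.SAP_Inbound
    WHERE  Process_Flag = 'New';

    -- ...and marks them off once they have been consumed.
    UPDATE Staging.dbo.SAP_Inbound
    SET    Process_Flag = 'Completed'
    WHERE  Process_Flag = 'New';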



Hope this helps.






Regards,



Basil Webster



 ________________________________


From: Action Request System discussion list(ARSList)
[mailto:[EMAIL PROTECTED] On Behalf Of Rick Cook
 Sent: 15 November 2006 17:25
 To: arslist@ARSLIST.ORG
 Subject: Re: Large transactions






Why not deal with this at the database level, which is so much more
efficient?





Rick



On 11/15/06, Terry Bootsma <[EMAIL PROTECTED]> wrote:

I have seen this issue many times, whether it was invoked
 via an escalation, filter, filter-side table loop, or
 whatever.

 The only "scalable", and "reliable" method that I could
 come up with was to write an external ARSPerl script that
 was called via a scheduling tool (Unix or windows depends
 upon which tool you would use) and would query all entries
 that needed to be modified and modify each of them
 individually, keeping track of the total number to be
 modified and handling exceptions if they occurred along
 the way.
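
 The selection step of such a script, expressed as the SQL it
 effectively performs (hypothetical schema and field IDs; AR
 System stores each form as a table T<schema-id> with one
 C<field-id> column per database field):

     -- Hypothetical IDs; C1 is always the Request ID.
     SELECT C1
     FROM   T456
     WHERE  C536870950 = 0   -- a persisted "processed" flag field
     ORDER  BY C1;
     -- The script then modifies each returned entry through the
     -- AR API, one at a time, so filters still fire and failures
     -- can be counted and handled.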

 HTH

 Terry

 On Wed, 15 Nov 2006 08:00:07 -0700
 Chris Woyton <[EMAIL PROTECTED]> wrote:
 > I've hit this wall many times, and how you handle it is
 > largely dependent on
 > where the bottleneck lies.
 >
 > Have you logged the transaction to see if your DB is
 >hanging on the query
 > for the View Form, or your workflow during Filter
 >processing, or...?
 >
 > Chris Woyton
 >
 > -----Original Message-----
 >From: Action Request System discussion list(ARSList)
 > [mailto:arslist@ARSLIST.ORG ]On Behalf Of Kelly Logan
 > Sent: Wednesday, November 15, 2006 7:43 AM
 > To: arslist@ARSLIST.ORG
 > Subject: Large transactions
 >
 >
 > Hello All,
 >
 > I am looking at alternatives to a current integration
 >process that
 > uses external .CSV files and Remedy Import called by
 >runmacro.  (We
 > have had some issues with file permissions and
 >runmacro.)
 >
 > I have set up a View form to a table with the data and
 >added some
 > display only fields to it for translation and workflow,
 >the principal
 > one being a zTmpProcess field (Yes/No).  I have one
 >filter that pushes
 > a value of "Yes" to this field on the form for all
 >entries (~40,000),
 > and a set of filters that fire off of Modify to
 >translate some of the
 > information (set other temp fields to mapped values)
 >before pushing it
 > to SHR:People.
 >
 > The process works fine when I run it manually for 1,000
 >or so entries,
 > but when I let the workflow kick off a change to all
 >40K, I get a
 > database timeout.  I'm considering some chunking
 >strategies like using
 > an SQL statement to set another flag and loop through,
 >processing
 > 1,000 at a time, but I thought I'd check to see if
 >anyone else has
 > done something similar, or if there's a simpler strategy
 >that may be
 > eluding me at the moment.
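
A sketch of that chunking idea, assuming SQL Server (used elsewhere
in this thread) and a hypothetical persisted batch-flag column on
the staging table:

    -- Mark the next 1,000 unprocessed rows for pickup;
    -- hypothetical names, SQL Server 2000-era SET ROWCOUNT syntax.
    SET ROWCOUNT 1000;
    UPDATE Staging_People
    SET    Batch_Flag = 'Queued'
    WHERE  Batch_Flag IS NULL;
    SET ROWCOUNT 0;
    -- Re-run (e.g. from a scheduled job) once the current batch has
    -- been processed and its rows flagged complete.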
 >
 > Thanks in advance for your time,
 >
 > Kelly Logan
 > Cybernetic Solutions, Inc.
 >
 >

