If you do what you are talking about, you are going to have a different
problem - no way to search for requests put in for someone last week
based on who they are this week, because the PPL ID will be different.
And don't tell me your users can search on name accurately - 12,000 of
the 128,000 names in my CTM:People are not unique, and middle name only
helps with 2,000 of those.  You will no doubt have over 90,000
non-unique names in a million.  I also doubt that the Customer's
Incidents table field on the Incident form will reflect any previous
incidents for a given individual, and you certainly won't be able to
connect them to CIs in the CMDB.  You have to have a common, stable,
unique key.
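
To make the name-collision point concrete, here is a throwaway Python
sketch (entirely made-up records and field names) of why name-only
matching is ambiguous while a stable key is not:

    # Two distinct people who share a name, from last week's load (made-up data).
    last_week = [
        {"workforce_id": "100234", "full_name": "John Smith", "dept": "Physics"},
        {"workforce_id": "104871", "full_name": "John Smith", "dept": "Library"},
    ]
    # This week the first person comes through again with a changed department.
    this_week = {"workforce_id": "100234", "full_name": "John Smith", "dept": "Chemistry"}

    # Matching on name: two candidates, no way to tell which one to update.
    hits = [p for p in last_week if p["full_name"] == this_week["full_name"]]
    print(len(hits))                                    # 2

    # Matching on the stable key: exactly one hit, even though an attribute changed.
    by_key = {p["workforce_id"]: p for p in last_week}
    print(by_key[this_week["workforce_id"]]["dept"])    # Physics -> safe to update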
 
My CTM:People is being updated from PeopleSoft via LDAP using AIE 7.1
and a staging form like you say you have.  The key field is the
workforce id from PeopleSoft, but it could just as easily have been our
Login ID (also unique on both systems, like workforce id), so maybe you
can have them populate the source data with login, or another unique
value.  Is the email address in the source data unique, for example?
Personally, I would do anything to avoid having to reconstitute the
entire CTM:People table weekly, because the filter processing time you
are seeing is unavoidable if you want all of the appropriate customer
data set in the ITSM 7 tables, with the people related to the correct
customer company(ies) and carrying whatever access permissions they
need.  I would even settle for the sequential record id from the source
system - in a previous LDAP-to-Remedy implementation we used that as a
linking value for lack of anything more reliable.
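
To show the shape of what a keyed, incremental update buys you, here is
a minimal Python sketch - hypothetical field names, and only an
illustration, since the real work is done by AIE mappings and the ITSM
filters, not a script:

    def plan_people_sync(source_rows, ctm_people, key="workforce_id"):
        """Split a weekly feed into creates/updates/unchanged, keyed on a
        stable id, so only new or changed rows ever hit the expensive
        CTM:People push.  Field names are hypothetical; in practice AIE
        mappings and the staging-form filters do this inside AR System."""
        existing = {p[key]: p for p in ctm_people}
        creates, updates, unchanged = [], [], []
        for row in source_rows:
            current = existing.get(row[key])
            if current is None:
                creates.append(row)        # new person: full validation fires once
            elif any(current.get(f) != v for f, v in row.items()):
                updates.append(row)        # changed person: push only the delta
            else:
                unchanged.append(row)      # identical: skip the per-record filter cost
        return creates, updates, unchanged

With a stable key, the weekly run only touches the creates and updates,
and the bulk of a million mostly unchanged people never goes through
the filter path at all.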

Christopher Strauss, Ph.D.
Remedy Database Administrator
University of North Texas Computing Center
http://remedy.unt.edu/ 

  _____  

From: Action Request System discussion list (ARSList)
[mailto:[EMAIL PROTECTED] On Behalf Of William Rentfrow
Sent: Monday, October 15, 2007 12:35 PM
To: arslist@ARSLIST.ORG
Subject: CTM:People large data imports


On my current project we will have to import about 1,000,000 CTM:People
records every week (give or take).  These will be "new" records every
time because the only unique identifier we have from the external
systems is an identifier like SSN that we can't put into our system
(that's a policy decision).  We're checking on ways to do updates but
it's still messy due to the item below.
 
The actual import is going to be automatic and should follow this
process (a rough sketch of the flow follows the list):
 
1.) Escalation deletes all relevant CTM:People records
2.) File is automatically put on the Remedy server with people
information
3.) Escalation fires a Run Process that imports the data into a staging
form.
4.) Escalation fires and sets a flag field to "Process Now" on all
records in the staging form
5.) Filter on "modify" on the staging form pushes to CTM:People
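
Read as a script, the flow is roughly the following - a loose Python
sketch with hypothetical names, since the real steps are escalations, a
Run Process, and filters inside AR System:

    # Stub steps so the sketch runs on its own; in reality each of these is an
    # escalation, a Run Process action, or a filter inside AR System.
    def purge_relevant_people():
        print("1. escalation deletes the relevant CTM:People records")

    def wait_for_feed(path):
        print("2. weekly feed file arrives on the Remedy server at " + path)

    def load_into_staging(path):
        print("3. Run Process imports " + path + " into the staging form")

    def flag_staging_records(value):
        print("4. escalation sets the flag field to " + value + " on all staging records")

    def weekly_people_load(feed_path="/data/people_feed.dat"):
        purge_relevant_people()
        wait_for_feed(feed_path)
        load_into_staging(feed_path)
        flag_staging_records("Process Now")
        # 5. the filter on modify then pushes each flagged staging record into
        #    CTM:People, which is where the per-record validation cost lands.

    weekly_people_load()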
 
Here's the problem - we're using decent hardware running on Solaris 9,
etc. - yet the push to CTM:People takes 1.5 seconds per record due to
all of the validation that takes place.
 
I have to do 1,000,000 records a week - and there are only 604,800
seconds in a week.  Also, it appears that direct SQL loads won't work
because of all the back-end set fields (I can get around all of those
except the GUID - which could be set by another escalation, and then we
are back at this same problem).
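
Spelling out that arithmetic (a throwaway Python check, nothing more):

    records_per_week = 1_000_000
    seconds_per_week = 7 * 24 * 60 * 60        # 604,800
    seconds_per_record = 1.5                   # observed time for the push to CTM:People

    serial_time = records_per_week * seconds_per_record
    print(serial_time)                          # 1,500,000 seconds for one week's load
    print(serial_time / seconds_per_week)       # ~2.48 weeks of wall-clock time

    # To finish inside one week on a single serial stream, each record would
    # have to complete in no more than:
    print(seconds_per_week / records_per_week)  # 0.6048 seconds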
 
If you've done large data imports to CTM:People, I'd really like to
know how you went about loading the data efficiently.
 
William Rentfrow, Principal Consultant
[EMAIL PROTECTED]
C 701-306-6157
O 952-432-0227
 