I do a lot of importing of data from hospitals.

If your indexes are pretty simple, importing data into the tables while other 
people are using the system shouldn't be much of a performance hit. All I do 
is convert to CSV and append it to the table. You may need an interim 
database; if performance is really bad, a SQL INSERT is more efficient.

If your indexes are complex, then I would wait until close of business to append.
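
A minimal sketch of that flow (the file, table, and field names here are made up for illustration, not anything from Ken's system): bulk-load the CSV into a throwaway staging cursor, do the per-row validation and logging there, then hit the live shared table with a single write so it is only being touched briefly:

```
*-- Sketch only: hospital.csv, STAGING, ORDERS and the fields are hypothetical.
SET DELETED ON

*-- 1) Bulk-load the spreadsheet export into a throwaway cursor.
CREATE CURSOR staging (cOrderNo C(10), dShipped D, nAmount N(10,2))
APPEND FROM hospital.csv TYPE CSV

*-- 2) Run the row-by-row validation and logging against STAGING,
*--    e.g. flag rows with a missing key so they never reach the live table.
DELETE FOR EMPTY(cOrderNo)

*-- 3) One brief write to the shared production table (INSERT...SELECT
*--    needs VFP 9; APPEND FROM DBF("staging") works on older versions).
USE orders IN 0 SHARED
INSERT INTO orders SELECT * FROM staging
```

The point of the interim step is that all the slow, error-prone work happens against data nobody else can see; the shared table only sees one quick append.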
   

From: Ken Dibble <[email protected]>
To: [email protected]
Sent: Thursday, June 22, 2017 10:48 AM
Subject: Bulk Imports While Multi-User System is in Use

Hi folks,

Having never attempted this before, I am seeking any guidance you can offer.

I'm using VFP tables across a LAN, with VFP 9 SP 1. The tables are on 
a virtualized Windows 2012 file server that has gargantuan memory and 
HDD resources available. The workstations are Windows 7 Ultimate 
32-bit with processors in the 3.5 GHz range and 4 GB of RAM.

My framework does not use the RAD features of VFP; no data 
environments, no control sources, no buffering (I actually wrote my 
own "buffering"/conflict resolution system), no auto-incrementing 
columns. The only time I manually lock anything is when my integer 
surrogate key-creation program is generating a new key--and I only 
lock the table that stores the last-used key value.
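
[For readers following along, that last-used-key pattern is typically just a record lock around a one-row counter table. A minimal illustrative sketch, with invented table and field names -- this is not Ken's actual code:]

```
*-- Illustrative sketch of the locking pattern described above;
*-- LASTKEY and iLastKey are hypothetical names.
FUNCTION NextKey
    LOCAL lnKey
    IF NOT USED("lastkey")
        USE lastkey IN 0 SHARED
    ENDIF
    SELECT lastkey
    SET REPROCESS TO 2 SECONDS     && retry the lock instead of failing at once
    IF RLOCK()                     && lock only the single counter record
        REPLACE iLastKey WITH iLastKey + 1
        lnKey = iLastKey
        UNLOCK                     && hold the lock as briefly as possible
    ELSE
        lnKey = -1                 && caller decides how to handle a timeout
    ENDIF
    RETURN lnKey
ENDFUNC
```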

I do use a database container so that the default value methods of 
the PK fields of my tables can call the key generation program. When 
users create new records, I use APPEND, not INSERT. The system was 
deliberately designed to divorce database access from business rules 
and GUI (because initially, long ago, I thought I might actually 
embark on writing my own database server).

I understand that there is brief automatic locking, however, whenever 
VFP writes to a table (or, rather, when the server OS, in its 
infinite wisdom, deigns to carry out VFP's request to write to the table).

I've been given the task of importing data from a spreadsheet into my 
multi-user system. I have not done any testing on this but it is 
possible that the import process, run every week or every two weeks, 
will create thousands of records that will be "children" of a few 
hundred "parent" records.

The system is in near constant use by somewhere around a half-dozen 
people, and much larger numbers use it occasionally.

Some of the half-dozen intensive users are quite likely to be 
accessing the parent records at the time the bulk import process is 
creating the child records.

I am concerned about creating massive access conflicts leading to 
race conditions, freezing of the system, and/or data corruption.

Should I just block everybody from saving any data while the import 
process runs? (I don't know how long it will take, but I do envision 
a lot of error-checking and logging on each row of the spreadsheet to 
prevent bad data from getting into the system.) Or are there simple 
things I can do to prevent issues ("simple" means, no major changes 
to the framework or the database)? Or is this something I don't 
really need to worry about?

Thanks very much for any suggestions.

Ken Dibble
www.stic-cil.org



_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://mail.leafe.com/mailman/listinfo/profox
OT-free version of this list: http://mail.leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: 
http://leafe.com/archives/byMID/profox/[email protected]
** All postings, unless explicitly stated otherwise, are the opinions of the 
author, and do not constitute legal or medical advice. This statement is added 
to the messages for those lawyers who are too stupid to see the obvious.
