All:

SETUP:

Volunteers perform twice-daily property surveys on a large Association
property, recording observations in 40+ possible fields on hand
recorders, then transcribing them to an Excel spreadsheet with over 95K
rows spanning five years. A subset of columns reliably receives data on
each row, but input to the majority of columns is highly variable.

This spreadsheet logically represents a 'view': a concatenation of eight
normalized tables in the R:Base schema we've cooked up.

A defined set of post-processing steps is performed on the Excel data
within Excel before ....

...the Excel data is 'gateway'd' into a single R:Base (v8.0, latest
release) table without difficulty. Rudimentary '.rmd' file inquiries at
the R:Prompt satisfy immediate needs, but R:Base has created a thirst
for more. It's application time: I need to eliminate the Excel step and
give the property manager an interface to mash on.

QUESTION:

Which would serve me best?

a) Use a variable form to enter a single row of observations, performing
post-processing of variable values into normalized tables on a per-row
basis in an On-Exit EEP; or

b) Use dbedit controls to enter daily observations into an 'intermediate
holding' table for cursor-managed, batched post-processing into the
database, after which the holding table is cleared for re-use?

All appreciation,

bruce chitiea
safesectors, inc.
