I'm looking towards Valentina (http://www.paradigmasoft.com) for a
relational database solution for MC. However, you need to get the data
from the format it's in to a format suitable for Valentina import.

A solution I've used in the past is to index the records into a file
structure based on the key, or keys.

A converter would be written to split the data into directories and
subdirectories so that smaller files exist in defined locations.

e.g. to search on the surname "Davis", MC could look in C:/d/a/v/info.dat,
which should be a relatively small file containing names such as Davis,
Davies, Davison, Davidson, etc., and therefore much easier and quicker to
handle.
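The splitting scheme above can be sketched roughly as follows. This is a
minimal illustration, not code from the original post: the record format
(tab-separated surname and data), the three-letter depth, and all function
names are assumptions.

```python
import os

DEPTH = 3  # index on the first three letters, e.g. "Davis" -> d/a/v

def bucket_path(root, surname):
    """Return the directory for a surname, e.g. root/d/a/v."""
    letters = surname.lower()[:DEPTH]
    return os.path.join(root, *letters)

def add_record(root, surname, data):
    """Append one record to the small info.dat file for its bucket."""
    path = bucket_path(root, surname)
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "info.dat"), "a") as f:
        f.write(f"{surname}\t{data}\n")

def lookup(root, surname):
    """Scan only the small bucket file instead of the whole data set."""
    datfile = os.path.join(bucket_path(root, surname), "info.dat")
    if not os.path.exists(datfile):
        return []
    with open(datfile) as f:
        return [line.rstrip("\n").split("\t", 1)[1]
                for line in f
                if line.split("\t", 1)[0].lower() == surname.lower()]
```

Note that "Davis" and "Davies" land in the same d/a/v bucket, so one small
file holds all the near-matches, which is exactly what makes the per-name
scan cheap.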

Other issues include writing to this file-structure 'database' and how often
you need to convert the original/updated data.

Gary Rathbone


on 9/21/00 8:48 AM, David Bovill at [EMAIL PROTECTED] wrote:

> If I have large amounts of XML data (say 30-100MB) which I need to parse,
> my first thought was to load it into an array at start-up, then get the
> keys and loop through each key (using repeat for each line), testing to
> see if there is a match and returning the value if there is.
> 
> However, I figure that this would nearly double the amount of memory I
> would need, and the full index of keys would be almost as large as the
> entire index. As there is no syntax for referring to elements in an
> associative array by numerical index (i.e. get the first, second, etc.),
> what would be the fastest and most memory-efficient technique? Should I
> use an external database like "Pandora", or whatever the name beginning
> with P is that I'm searching for? Maybe I need my own relational
> database :-)
> 
> 
> Archives: http://www.mail-archive.com/metacard%40lists.best.com/
> Info: http://www.xworlds.com/metacard/mailinglist.htm
> Please send bug reports to <[EMAIL PROTECTED]>, not this list.
> 
> 

