Tony Firshman writes:
> Actually I suspect I would still be using QL archive had it been more 
> robust and supported large *separate* indexes.
I played extensively with Archive in the early years. Still use it 
almost daily for my address database.

Reliability:

I always copy my databases to ramdisk before working on them, for 
increased speed and to offer some protection against corrupting the 
original file. On making additions or changes I follow with an export / 
close / kill ram copy / import new copy / close / copy new original 
database back to disk / and finally open the new ram copy again. This 
leaves me with a clean copy of the database plus an export version on disk.
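The pattern above amounts to never touching the master until a complete new copy exists. A minimal Python sketch of the same idea (the file names are hypothetical, and the "ramdisk" is stood in for by a temp path):

```python
import os
import shutil

# Hypothetical names for illustration only.
MASTER = "address_dbf"       # original database on disk
WORK = "/tmp/address_dbf"    # fast working copy (the "ramdisk" copy)

def begin_edit():
    """Copy the master aside so edits never touch the original."""
    shutil.copy2(MASTER, WORK)

def commit_edit():
    """Overwrite the master only after the new version is fully on disk."""
    tmp = MASTER + ".new"
    shutil.copy2(WORK, tmp)   # write the whole new version first...
    os.replace(tmp, MASTER)   # ...then swap it in atomically
```

If anything goes wrong mid-edit, the master on disk is still the last good version.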

Current systems are fast enough to allow this to be done without 
irritation or too much waiting, depending on the size of the databases, of 
course. Mine's about 400 records. I've not had a corrupted database since 
implementing this method years ago, although prior to that I had 
frequent problems with corruption and "open" databases that had to be 
patched before they could be used again.

Indexing:

You CAN use "large *separate* indexes" in Archive. Just make the index a 
separate file containing a single numeric field, with as many records, 
N, as the number of records in the target database. The numbers 0..N-1 
represent absolute pointers to the corresponding records (as given by 
recnum()) in the database. You don't have to alter the original database 
in any way; it should work with any existing database.
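The scheme can be sketched in a few lines of Python, with a list standing in for the database (so the list position plays the role of recnum()) and a second list as the separate index file. Sorting the index by the records' keys leaves the database itself untouched:

```python
# Toy stand-in for the database: list position plays the role of recnum().
records = ["Smith", "Adams", "Jones", "Baker"]

# The separate index: one numeric field per record, values 0..N-1,
# each an absolute pointer into the database.
index = sorted(range(len(records)), key=lambda n: records[n])

# Reading through the index visits the records in key order,
# without the database having been altered in any way.
print([records[n] for n in index])  # -> ['Adams', 'Baker', 'Jones', 'Smith']
```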

I implemented an indirect binary sort/search routine using this method. 
As the hardware at the time was too slow to do anything useful with it, 
I never developed it any further. With Append, Insert and Delete plus 
Quicksort it could be quite usable.
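For what it's worth, the indirect binary search part translates directly: the search walks the sorted index and compares against the records it points at, so the records themselves never move. A hedged Python sketch, reusing the toy records/index from above:

```python
records = ["Smith", "Adams", "Jones", "Baker"]
index = sorted(range(len(records)), key=lambda n: records[n])

def lookup(key):
    """Binary search via the index; returns the absolute record number."""
    lo, hi = 0, len(index)
    while lo < hi:
        mid = (lo + hi) // 2
        if records[index[mid]] < key:   # compare through the pointer
            lo = mid + 1
        else:
            hi = mid
    if lo < len(index) and records[index[lo]] == key:
        return index[lo]                # position as recnum() would give it
    return None

print(lookup("Jones"))  # -> 2
```

Append and Delete would then be edits to the index file alone: a new record's number is inserted at its sorted position, and a deleted record's number is removed.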

Of course it's all really quite pointless, but it can be done! ;o)

Per



_______________________________________________
QL-Users Mailing List
http://www.q-v-d.demon.co.uk/smsqe.htm
