Hi Lisa,

A flat file on a local machine is not necessarily faster than a good SQL server: 
structured data can be accessed by the server with many threads running on many 
processors, while the server itself can decide (automatically, or with some 
help) to keep certain tables in memory. A server is scalable and can run on 
multiple processors or even on separate computers. It can be equipped with a 
huge amount of RAM, and the cost is split over many users. In contrast, a flat 
file is read linearly in a single thread and interferes with other I/O 
operations on the same computer.

Now, keeping the whole data set in memory does not sound like a good idea at 
first, but these days memory is no longer an obstacle - you can add as much RAM 
to your computer as your programs need, provided that the need is real and you 
have no other choice.

Of course, loading all that data takes time, but it can happen when the program 
starts, as an accepted penalty; or the program (like a server) may run for days 
without reloading; or the loading may run on a separate thread (one or many, in 
which case you again need the support of a server) while the user is doing 
something else. The same goes for saving the data back - a watchdog thread can 
wake up every couple of minutes and make sure the data is secured on the 
server, without interfering with the user's activity.

Another issue: the in-memory data may be mutable or immutable, and when changes 
are needed, there are theories and good practices for resolving conflicts - by 
now a de facto standard in today's programming. However, there are plenty of 
situations where the data is purely informative - nobody needs to tamper with 
it on local workstations (from weather forecasts to a car body-parts 
inventory).
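One of those conflict-resolution practices is optimistic concurrency - my 
assumption of what is meant here - where an update must present the version it 
was based on, and a stale update is rejected instead of silently overwriting 
another user's change. A tiny sketch:

```python
import threading

class VersionedStore:
    """Optimistic-concurrency sketch: reads return a version number,
    and updates carrying a stale version are rejected."""

    def __init__(self, value):
        self._lock = threading.Lock()
        self._value = value
        self._version = 0

    def read(self):
        with self._lock:
            return self._value, self._version

    def update(self, new_value, read_version):
        with self._lock:
            if read_version != self._version:
                return False  # someone else changed it first; re-read and retry
            self._value = new_value
            self._version += 1
            return True
```

The caller that gets False re-reads the current value and decides how to merge, 
rather than losing anyone's change.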

Horia



--------------------------------------------------------------------------------
From: [email protected] [mailto:[EMAIL PROTECTED] On Behalf Of 
Lisa Westveld
Sent: Sunday, February 19, 2006 7:26 AM
To: [email protected]
Subject: [delphi-en] Re: Loading complete data set into memory


I don't think that anyone is crazy enough to load a whole, huge 
dataset into memory. People try to avoid this because you'd be 
spending a lot of time just to get the data into memory. And we're 
talking about a delay in the range of seconds or perhaps even minutes.

I do know that by using a memory-mapped flat file, you can use a 
whole file as if it were loaded into memory. It's a somewhat complex 
technique which I'm not really familiar with, but apparently it does 
provide huge speed (since it doesn't really load all the data into 
memory) combined with a huge number of records.
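The idea can be sketched in Python (a hypothetical fixed-width record file; 
the OS pages data in on demand, so random access works without reading the 
whole file):

```python
import mmap
import os
import struct

REC = struct.Struct("<i")  # hypothetical record layout: one 32-bit LE integer

def write_demo_file(path, n):
    # Build a demo file where record i holds the value i * 10.
    with open(path, "wb") as f:
        for i in range(n):
            f.write(REC.pack(i * 10))

def read_record(path, index):
    # Map the file into the address space; only the touched pages
    # are actually fetched from disk.
    with open(path, "rb") as f:
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        try:
            return REC.unpack_from(mm, index * REC.size)[0]
        finally:
            mm.close()

path = "records.bin"
write_demo_file(path, 100_000)
value = read_record(path, 54_321)  # random access, no full load
assert value == 543_210
os.remove(path)
```

Fixed-width records are what make the offset arithmetic trivial; variable-width 
records would need an index of offsets alongside the data.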

But in your case you have dozens of SQL queries running to get all 
the data. So I think it would be better to prevent the user from 
expanding the whole tree with a shortcut or whatever. Or otherwise, 
ask them if they really want to expand all, since it will be time-
consuming. And if they still want to expand all, try to show a 
progress bar at the same time.

And consider this: not even Google immediately returns all 500,000 
results when you search for something. At the most, you get 100 
records and an option to jump to the next page. It would just be too 
slow for the user. And worse, once you have all those thousands of 
records in memory, you will have to deal with changes in the 
database. What if you have two users active at the same time, both 
looking at the same data? One makes a modification, but since the 
other already has all the data in memory, they won't notice the 
change, since you never return to the database to get the 'fresh' data.
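Paging like that is just a LIMIT/OFFSET query per page, which also means every 
page hit goes back to the database and sees fresh data. A small sketch using 
SQLite (table and column names are made up for the example):

```python
import sqlite3

# Hypothetical result table with 500 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO results (name) VALUES (?)",
                 [(f"item{i}",) for i in range(500)])

PAGE_SIZE = 100

def fetch_page(conn, page):
    # Each call re-queries the database, so the user always sees
    # current rows instead of a stale in-memory copy.
    cur = conn.execute(
        "SELECT id, name FROM results ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, page * PAGE_SIZE))
    return cur.fetchall()

first_page = fetch_page(conn, 0)  # 100 rows, not 500
```

(For very large offsets, keyset pagination - "WHERE id > last_seen_id" - scales 
better than OFFSET, but the principle is the same.)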

In general, it's a very bad idea...

Greetings, Lisa.



