On Mar 2, 2006, at 10:44 AM, Trausti Thor Johannsson wrote:

Just out of curiosity, and to learn: I have seen some mails here about databases containing 30 million or more records. I have never worked with a database that large; the largest table I have is about 90,000 records. So my question is, how does one work with such a database without doing things the way I currently do them?

Currently, I have an object that takes an SQL command, runs the query, and then reads everything into a dictionary of individual objects. Even at 90,000 records it takes time to read them in and create the objects. I would think that reading the whole shebang in for millions of records would run out of memory or cause something equally bad, with millions of objects resident at once. At the very least it would take an awfully long time.

Hope someone can help me with this one. I don't care how any particular database does it internally (I have heard very nice things about the Valentina database, for instance); I am asking from a more formal computer-science standpoint.

I can give you one method, similar to what WebObjects does. Use a server-side cursor if your DB server provides one; if not, you can fudge your own. Essentially, you run your SELECT on the table for the objects you want, then create a buffer of only the objects needed immediately, maybe a hundred or a thousand or whatever. In your object loader you build a mechanism that fetches objects from the server's cursor as needed, in batches of the buffer size. Of course this won't work well for every scenario, but for most it works quite well. Who can look at 90,000 objects at once anyway?
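To make the batching idea concrete, here is a rough sketch in Python using the stdlib sqlite3 module as a stand-in for a real DB server; the `BatchedLoader` class and `people` table are made up for illustration. The point is just the shape of the technique: keep the cursor open and pull rows with `fetchmany` only as the caller consumes them, rather than materializing the whole result set.

```python
import sqlite3

class BatchedLoader:
    """Lazily materialize rows from a DB cursor in fixed-size batches,
    instead of reading the whole result set into memory at once."""

    def __init__(self, cursor, batch_size=100):
        self.cursor = cursor
        self.batch_size = batch_size

    def __iter__(self):
        while True:
            # fetchmany pulls at most batch_size rows from the cursor
            rows = self.cursor.fetchmany(self.batch_size)
            if not rows:
                break
            for row in rows:
                # in real code, construct your object from the row here
                yield row

# Demo: an in-memory table standing in for the real server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [(i, "name%d" % i) for i in range(1000)])

cur = conn.execute("SELECT id, name FROM people")
loader = BatchedLoader(cur, batch_size=100)

# Consuming only the first ten objects touches just one batch of 100 rows,
# not all 1000 -- the rest stay on the "server" side until asked for.
first_ten = [row for _, row in zip(range(10), loader)]
print(len(first_ten))  # -> 10
```

With a client/server database you would do the same thing against its native cursor (or emulate one with LIMIT/OFFSET or a keyset condition), and a UI would typically only ever pull the batches backing the rows currently on screen.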

Kevin


_______________________________________________
Unsubscribe or switch delivery mode:
<http://www.realsoftware.com/support/listmanager/>

Search the archives of this list here:
<http://support.realsoftware.com/listarchives/lists.html>
