Just out of curiosity, and to learn: I have seen some mails here
about databases containing 30 million or more records. I have never
worked with a database like that. The largest table I have is about
90,000 records. So my question is, how does one work with such a
database without doing things the way I currently do?
Currently, I have an object that takes an SQL command, runs the
query, and then reads everything into a dictionary of individual
objects. Even with 90,000 records it takes noticeable time to read
the rows and create the objects. I would think that doing the same
with millions of records would run out of memory, or something
equally bad would happen with that many objects held in memory. At
the very least it would take an awfully long time.
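To make it concrete, this is roughly my current pattern, sketched in
Python against sqlite3 (the "people" table, its columns, and the
Person class are made-up placeholders for illustration, not my actual
code):

    # Rough sketch of my current approach: run one query and build an
    # object per row, keyed by id in a dictionary.
    # (sqlite3 and the "people" table/columns are just placeholders.)
    import sqlite3

    class Person:
        def __init__(self, row_id, name, email):
            self.row_id = row_id
            self.name = name
            self.email = email

    def load_all(db_path):
        conn = sqlite3.connect(db_path)
        try:
            records = {}
            # fetchall() pulls every row into memory at once, which is
            # exactly what worries me for millions of records.
            for row_id, name, email in conn.execute(
                    "SELECT id, name, email FROM people").fetchall():
                records[row_id] = Person(row_id, name, email)
            return records
        finally:
            conn.close()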
Hope someone can help me with this one. I am not really asking how
any particular database handles it (I have heard very nice things
about the Valentina database, for instance), but rather about the
more formal computer science approach.
Trausti