Hello,

Is there any known structural performance issue in working with a schema made 
of about 100 tables, about 80 foreign key constraints, and some indexes in 
addition to those implied by the primary and foreign keys? In my book it does 
not qualify as a complex schema: some tables would have 30 to 40 columns, and 
4 or 5 tables are candidates for a moderate number of rows (rarely more than 
1 million), while one of the tables could receive about 10 million rows after 
some years of data collection (so again, nothing really fancy).

Does SQLite have to reparse the schema text often in order to execute queries? 
Or is the schema translated internally into some stored, digested ('compiled') 
format to ease its work?
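
For context, here is a minimal sketch of how we intend to reuse prepared 
statements, under the assumption that any parsing cost (of the SQL text, and 
of whatever schema information it depends on) is paid once at 
sqlite3_prepare_v2() time rather than on every execution. The database name, 
table and column are of course invented for the example:

    #include <stdio.h>
    #include <sqlite3.h>

    int main(void)
    {
        sqlite3 *db = NULL;
        sqlite3_stmt *stmt = NULL;

        if (sqlite3_open("example.db", &db) != SQLITE_OK) {
            fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
            return 1;
        }

        /* Prepared once: the SQL text is compiled here, presumably
         * against the already-loaded schema. */
        const char *sql =
            "SELECT count(*) FROM measurements WHERE station = ?";
        if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) != SQLITE_OK) {
            fprintf(stderr, "prepare failed: %s\n", sqlite3_errmsg(db));
            sqlite3_close(db);
            return 1;
        }

        /* Re-executed many times without re-submitting the SQL text. */
        for (int station = 1; station <= 3; ++station) {
            sqlite3_bind_int(stmt, 1, station);
            if (sqlite3_step(stmt) == SQLITE_ROW)
                printf("station %d: %d rows\n",
                       station, sqlite3_column_int(stmt, 0));
            sqlite3_reset(stmt);
            sqlite3_clear_bindings(stmt);
        }

        sqlite3_finalize(stmt);
        sqlite3_close(db);
        return 0;
    }

If that assumption is right, the per-query cost of a large schema would 
matter mostly at prepare time, which is part of what I am trying to confirm.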

The application that would use this schema is a server-side application (quite 
along the lines described in http://sqlite.org/whentouse.html).  We have 
started experimenting and things look very good, excellent I should say, so 
the question above is more about which details to watch and which traps to 
avoid.  I'm pretty sure there are people here with valuable experience with 
similar datasets.

Thanks,
-- 
Meilleures salutations, Met vriendelijke groeten, Best Regards,
Olivier Mascia, integral.be/om
