-----Original Message-----
From: Bharath Booshan L [mailto:[EMAIL PROTECTED]]
Sent: Friday, July 27, 2007 4:31 AM
To: [email protected]
Subject: [sqlite] Is SQLite Scalable to handle large data?
Hello,

I have been using an SQLite database in one of my applications (stand-alone). It contains 12 tables; one table holds around 60-100 thousand rows on average, and the rest contain around 2000 rows each of information linked to that larger table. Performance has been acceptable so far.

Now we need to turn this application into a client-server one, so scalability and concurrency issues arise that were not of utmost importance until now. Is SQLite feasible for handling this much data while assuring concurrency?

Note: The application performs mostly SELECT operations, with limited INSERT and UPDATE operations, so query retrieval time should be minimal.

Please provide some suggestions. I will provide more information if needed.

Regards,
Bharath Booshan L.

==================================================

Hello, Bharath,

On the one hand, SQLite can indeed handle multi-million-row tables in some cases. On the other hand, I think many would say that your client-server application, with significant requirements for concurrency, is one where SQLite might well be risky. Something like Postgres, MySQL, or one of the conventional commercial databases might fit your requirements much better. I note, however, that I'm far from being an expert in this -- perhaps your application is an exception.

[opinions are mine, not my company's]
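For what it's worth, two things that usually help a SELECT-heavy SQLite database shared by several connections are an index on the column used to join the big table to the small lookup table, and a busy timeout so readers and the occasional writer wait and retry instead of failing immediately with SQLITE_BUSY. Below is a minimal C sketch of both; the file, table, and column names (app.db, big_table, small_id) are hypothetical placeholders, not anything from Bharath's actual schema.

    #include <stdio.h>
    #include <sqlite3.h>

    int main(void)
    {
        sqlite3 *db;
        char *err = NULL;

        if (sqlite3_open("app.db", &db) != SQLITE_OK) {
            fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
            sqlite3_close(db);
            return 1;
        }

        /* Wait up to 2000 ms for a lock held by another connection
           instead of returning SQLITE_BUSY immediately. */
        sqlite3_busy_timeout(db, 2000);

        /* Index the column the ~100k-row table uses to reference the
           small lookup table, so joins and filtered SELECTs stay fast
           as it grows.  (big_table/small_id are hypothetical names.) */
        if (sqlite3_exec(db,
                "CREATE INDEX IF NOT EXISTS idx_big_small "
                "ON big_table(small_id);",
                NULL, NULL, &err) != SQLITE_OK) {
            fprintf(stderr, "create index failed: %s\n", err);
            sqlite3_free(err);
        }

        sqlite3_close(db);
        return 0;
    }

Neither change alters SQLite's one-writer-at-a-time locking model; they only make individual queries cheaper and lock contention less fatal, which is why a true client-server engine may still be the better fit here.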

