Dear all,

What happens if I put all the data in a single table and that table becomes 
very large (for example, millions of rows)?

Will I have the same performance problems?
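To illustrate the single-table layout suggested in the quoted reply below, here is a minimal sketch using Python's built-in sqlite3 module. The `readings` table, its columns, and the sample rows are hypothetical; the point is that an index on the `dataset` column lets SQLite find one dataset's rows quickly even when the table holds millions of rows.

```python
import sqlite3

# One big table with a "dataset" column instead of one table per dataset.
# Table and column names here are illustrative, not from the original post.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        dataset TEXT NOT NULL,   -- the string previously used as the table name
        ts      INTEGER,
        value   REAL
    )
""")

# An index on "dataset" lets SQLite jump straight to one dataset's rows,
# rather than scanning the whole table.
conn.execute("CREATE INDEX idx_readings_dataset ON readings(dataset)")

conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("sensor_a", 1, 1.0), ("sensor_a", 2, 1.5), ("sensor_b", 1, 9.9)],
)

# Query only one dataset's rows via the index.
rows = conn.execute(
    "SELECT ts, value FROM readings WHERE dataset = ? ORDER BY ts",
    ("sensor_a",),
).fetchall()
print(rows)  # prints [(1, 1.0), (2, 1.5)]
conn.close()
```

With this layout there is a fixed, small number of tables, so the slow first-read problem caused by thousands of tables should not recur; whether individual queries stay fast depends on having indexes that match the WHERE clauses.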

Thanks.


Regards.

> 
>     On 28 January 2019 at 17:28, Simon Slavin <slav...@bigfraud.org> 
> wrote:
> 
>     On 28 Jan 2019, at 4:17pm, mzz...@libero.it wrote:
> 
> >         when the number of tables becomes huge (about 15,000-20,000 
> > tables), the first database read query after opening the database is very 
> > slow (about 4 sec.), while subsequent read operations are faster.
> > 
> >         How can I speed it up?
> > 
>     Put all the data in the same table.
> 
>     At the moment, you pick a new table name each time you write another set 
> of data to the database. Instead of that, create just one big table, and add 
> an extra column, called "dataset", to the columns which already exist. In 
> that column you put the string you previously used as the table name.
> 
>     SQL is not designed to have a variable number of tables in a database. 
> All the optimization is done assuming that you will have a low number of 
> tables, and rarely create or drop tables.
> 
>     Simon.
> 
>     _______________________________________________
>     sqlite-users mailing list
>     sqlite-users@mailinglists.sqlite.org
>     http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
> 