Currently adding 5,000 rows at a time; will try increasing to 20K and 50K and
see what happens.

Just one table - but an interesting idea.
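For anyone following along, the batched approach discussed above can be
sketched roughly like this in Python's built-in sqlite3 module. The table
name, schema, and PRAGMA choices are illustrative assumptions, not from the
thread; committing once per batch rather than once per row is usually the
biggest win for a bulk load:

```python
import sqlite3

# Sketch of batched bulk insert (hypothetical two-column table).
BATCH_SIZE = 5000  # try 20000 / 50000 as discussed above

def bulk_insert(db_path, rows):
    conn = sqlite3.connect(db_path)
    # Safe only for a one-off load that restarts from scratch on failure:
    conn.execute("PRAGMA synchronous = OFF")
    conn.execute("PRAGMA journal_mode = MEMORY")
    conn.execute("CREATE TABLE IF NOT EXISTS data (id INTEGER, value TEXT)")
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            # One transaction per batch, not one per row.
            conn.executemany("INSERT INTO data VALUES (?, ?)", batch)
            conn.commit()
            batch.clear()
    if batch:  # flush the final partial batch
        conn.executemany("INSERT INTO data VALUES (?, ?)", batch)
        conn.commit()
    conn.close()
```

Since the data set is write-once, it can also help to create any indexes
after the load rather than before it.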


On 4 February 2013 18:24, Dominique Pellé <dominique.pe...@gmail.com> wrote:

> Paul Sanderson wrote:
>
> > I want to populate a large table (millions of rows) as quickly as
> > possible.
> >
> > The data set will not be operated on until the table is fully populated,
> > and if the operation fails I will be starting again. The operation is a
> > one-off and the table will not be added to at a future date.
> >
> > What optimisations will work best for me?
>
>
> Do you have only 1 table to populate or several tables?
>
> If you have several tables, you could consider this:
>
> * put your tables in different databases;
>
> * perform the INSERT in different processes (1 per
>   database) so tables can be populated in parallel;
>
> * wait for all processes to finish;
>
> * ATTACH all databases, so it behaves as
>   a single database.
>
> Dominique
> _______________________________________________
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>
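For reference, Dominique's multi-database scheme above could be sketched
like this (file and table names are hypothetical; the two loads run
sequentially here for brevity, but each `populate` call touches its own
database file and could run in its own process, as suggested):

```python
import os
import sqlite3
import tempfile

def populate(db_path, table, n):
    # One database file per table, so loads can run in separate processes.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE %s (id INTEGER)" % table)
    conn.executemany("INSERT INTO %s VALUES (?)" % table,
                     ((i,) for i in range(n)))
    conn.commit()
    conn.close()

def combine(workdir):
    # ATTACH the second database so both tables behave as one database.
    conn = sqlite3.connect(os.path.join(workdir, "a.db"))
    conn.execute("ATTACH DATABASE ? AS b", (os.path.join(workdir, "b.db"),))
    return conn

workdir = tempfile.mkdtemp()
populate(os.path.join(workdir, "a.db"), "t_a", 1000)  # would be process 1
populate(os.path.join(workdir, "b.db"), "t_b", 1000)  # would be process 2
conn = combine(workdir)
```

After the ATTACH, queries on `conn` can reference `t_a` and `b.t_b`
together, so the split is invisible to readers of the finished data set.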



-- 
Paul
www.sandersonforensics.com
skype: r3scue193
twitter: @sandersonforens
Tel +44 (0)1326 572786