> -----Original Message-----
> From: sqlite-users [mailto:sqlite-users-boun...@mailinglists.sqlite.org] On
> Behalf Of Gerry Snyder
> Sent: Wednesday, May 17, 2017 9:14 AM
> To: SQLite mailing list <sqlite-users@mailinglists.sqlite.org>
> Subject: Re: [sqlite] Bulk load strategy
> 
> If the updates pertain just to the 150k rows immediately preceding them,
> could you put each 150k chunk into its own table, and then do a join when
> accessing the data? Or even a merge at that point? Could be a lot faster.

Hi Gerry,
The updates refer to past entries, but I have no idea when or how often they will appear. The complicating factor is that later records in the source data may assume earlier changes have already been applied, so I cannot defer the updates.
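To make it concrete, a minimal sketch of the single-pass load I have in mind (table, column names, and record format are invented for illustration, not my actual schema): inserts and updates are applied in stream order inside one connection, with a periodic COMMIT to keep transactions bounded.

```python
import sqlite3

# isolation_level=None puts the Python module in autocommit mode so
# explicit BEGIN/COMMIT below are not fought by implicit transactions.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, value TEXT)")

# Mixed stream: inserts interleaved with updates to earlier rows.
# The update cannot be deferred, because later records may assume it.
stream = [
    ("insert", 1, "a"),
    ("insert", 2, "b"),
    ("update", 1, "a2"),   # refers back to an earlier entry
    ("insert", 3, "c"),
]

BATCH = 2  # commit every BATCH records; a real load might use 150k
conn.execute("BEGIN")
for n, (op, rowid, value) in enumerate(stream, 1):
    if op == "insert":
        conn.execute("INSERT INTO entries VALUES (?, ?)", (rowid, value))
    else:
        conn.execute("UPDATE entries SET value = ? WHERE id = ?",
                     (value, rowid))
    if n % BATCH == 0:     # periodic commit bounds transaction size
        conn.execute("COMMIT")
        conn.execute("BEGIN")
conn.execute("COMMIT")
```

This keeps everything in arrival order at the cost of not being a pure bulk insert; splitting into per-chunk tables would mean routing each update to whichever chunk holds its target row.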

I can certainly alter the strategy; I am just not clear on exactly what you are suggesting.

Thanks!
jlc
_______________________________________________
sqlite-users mailing list
sqlite-users@mailinglists.sqlite.org
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
