On Mar 28, 2018, at 3:28 AM, rene <[email protected]> wrote:
> 
> My main goal is currently to reduce the startup time.

That’s not what I understood from your first post.  Are you moving the 
goalposts, or is indexed read of stored data not actually the primary problem?

> 1. parsing time is nearly the same between "nlohmann::json::parse(..)" and
> "insert into data values(json(..))"

That’s good to know, but not too surprising.  It just means one particular C 
parser and one particular C++ parser happen to be about the same speed.

> insert into arrayname select
> key,json_extract(value,'$.name'),json_extract(value,'$.id') from data,
> json_each(data.json,'$.arrayname')

I’m suggesting that you don’t use SQLite’s JSON features at all.  Use this 
other C++ JSON parser you have, then construct INSERT queries for each row from 
the parsed JSON data.

Also, be sure to use prepared queries and bound parameters for something like 
this.  Don’t rebuild the SQL query each time:

   https://www3.sqlite.org/c3ref/bind_blob.html

Not only will it be faster, it’s also safer.
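
Something along these lines, as a rough sketch only: it assumes the target
table is arrayname(key, name, id) to match your json_extract() calls, skips
error checking, and runs inside the transaction you already have open:

#include <sqlite3.h>
#include <nlohmann/json.hpp>
#include <string>

// Parse the document once with nlohmann::json, then insert one row per
// array element through a single prepared statement with bound parameters.
// Assumes the caller already opened the :memory: database and started the
// transaction, and that the table is arrayname(key, name, id).
void load_array(sqlite3 *db, const std::string &text)
{
    nlohmann::json doc = nlohmann::json::parse(text);

    sqlite3_stmt *stmt = nullptr;
    sqlite3_prepare_v2(db,
        "INSERT INTO arrayname(key, name, id) VALUES (?1, ?2, ?3)",
        -1, &stmt, nullptr);

    const auto &arr = doc.at("arrayname");
    for (std::size_t key = 0; key < arr.size(); ++key) {
        const auto &row = arr[key];
        sqlite3_bind_int64(stmt, 1, static_cast<sqlite3_int64>(key));
        const std::string name = row.at("name").get<std::string>();
        sqlite3_bind_text(stmt, 2, name.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_bind_int64(stmt, 3, row.at("id").get<long long>());
        sqlite3_step(stmt);    // execute the insert for this row
        sqlite3_reset(stmt);   // reuse the same statement for the next row
    }
    sqlite3_finalize(stmt);
}

The SQL is compiled once by sqlite3_prepare_v2() and then only re-bound and
re-stepped for each row, which is where the speedup over rebuilding the query
text every time comes from.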
   
> Necessary indexes are created after the insert statement.

Good.

> The database is ":memory:" and all commands are inside a transaction.

Also good.