Re: [sqlite] Sqlite Concurrent access issue

2011-12-22 Thread Kees Nuyt
On Thu, 22 Dec 2011 13:49:48 +, "Black, Michael (IS)"
 wrote:

> You're probably seeing disk thrashing.
>
> Try increasing your database cache size:
>
> .pragma cache_size XX

Make that 
PRAGMA cache_size=xx;

> It's in kilobytes.

Almost correct. Cache_size is in number of database pages. 
So, if page_size is 1024, you're right.

> Try and make it as big as your database if you can.
> In other words, cache the whole thing.
>
[snip]
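
For example, to cache the whole ~856 MB database (a sketch; the thread doesn't
state the database's page_size, so 1024-byte pages are assumed here):

PRAGMA page_size;             -- check the page size first (commonly 1024 bytes)
PRAGMA cache_size = 900000;   -- in pages: ~900000 x 1 KB pages covers 856 MB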

-- 
Regards,

Kees Nuyt

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Sqlite Concurrent access issue

2011-12-22 Thread Black, Michael (IS)
You're probably seeing disk thrashing.

Try increasing your database cache size:

.pragma cache_size XX

It's in kilobytes.  Try and make it as big as your database if you can.  In 
other words, cache the whole thing.



Secondly, you may want to try using the FTS3/FTS4 search capability if your CSV 
values don't change much.  I think that's going to be quite a bit faster for 
what you're trying to do.

Put your 0.2M values into their own table and search that using FTS3/4.



So you do something like this:



create table mytable(c text);
insert into mytable values('345');
insert into mytable values('999');
create virtual table csv using fts3(t text);
insert into csv values('123,234,345,456,567');
select * from mytable where (select count(*) from csv where csv.t match mytable.c) > 0;



I get the '345' entry from this.

Michael D. Black
Senior Scientist
Advanced Analytics Directorate
Advanced GEOINT Solutions Operating Unit
Northrop Grumman Information Systems


From: sqlite-users-boun...@sqlite.org [sqlite-users-boun...@sqlite.org] on 
behalf of Virparia, Chetan J (GE Energy) [chetan.virpa...@ge.com]
Sent: Wednesday, December 21, 2011 10:35 PM
To: sqlite-users@sqlite.org
Cc: Kurien, Jessy S (GE Energy); Singh, Rituraj (GE Energy); R, Raghuraman (GE 
Energy)
Subject: EXT :[sqlite] Sqlite Concurrent access issue



Hi,



We have been using a SQLite database for a while, accessed through a
single-threaded WCF service, for our condition-based monitoring solution.
Since we are facing scalability issues with our next-generation products
due to the single-threaded architecture, we have been working on using
SQLite's concurrent access support through multiple threads and enabling
WCF's concurrency features.



We built a small prototype with SQLite via ADO.NET, changing the isolation
level to ReadCommitted, and executed a few queries sequentially and in
parallel, and saw a huge performance improvement. However, when we executed
a huge SELECT query, performance degraded drastically, so it would be a
great help if someone could point me to the right usage pattern for this
scenario.



Here is the execution timing of my query



Database size : 856 MB

Query : Select * from mytable where <column> in (around 0.2 million
comma-separated values)

Single query execution time: 1388 ms

10 sequential queries, total execution time: 13380 ms

10 parallel queries, total execution time: 37370 ms



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Sqlite Concurrent access issue

2011-12-21 Thread Simon Slavin

On 22 Dec 2011, at 4:35am, Virparia, Chetan J (GE Energy) wrote:

> Query : Select * from mytable where <column> in (around 0.2 million
> comma-separated values)

Anything that requires handling .2 million things is going to be tricky.  
Optimizing this query would be done differently depending on what governs these 
values: whether they're the same each time, or completely different each time, 
or you have a few different sets of values.

Under some circumstances you could define a table, put the values into it, and 
use a JOIN as part of your SELECT, with an index on the joined column. You could 
even wrap the JOIN in a VIEW for convenience. This would make the query extremely fast.
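
A minimal sketch of the JOIN approach, reusing the hypothetical mytable(c) from
Michael's earlier example (the real schema isn't shown in the thread):

create temp table wanted(v text primary key);  -- the primary key gives the join an index
insert into wanted values('345');
insert into wanted values('999');
-- ...insert the remaining ~0.2 million values, ideally inside one transaction...
select m.* from mytable as m join wanted as w on m.c = w.v;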

Under other circumstances you would just define a very long string that had all 
those values in it (possibly separated by commas), then search this string 
using LIKE.
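
For example (a sketch with the same hypothetical values; the leading and trailing
commas keep every value delimited on both sides):

select * from mytable
 where ',123,234,345,456,567,' like '%,' || c || ',%';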

It really depends on what you're trying to do.

Simon.
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users