Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-05 Thread Udi Karni
Watching Windows 7 Resource Monitor (launched from a button on Windows Task Manager), I see that SQLite - directly or through Windows - generates quite a bit of activity on a temp file located in C:\users\owner\AppData\Local\Temp - especially when running large joins, etc. There are large read
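
For context, SQLite's temp-file location and behavior can be steered from the shell; the directory value below is illustrative, not from the thread:

    PRAGMA temp_store = MEMORY;              -- keep temp tables and indices in RAM
    PRAGMA temp_store_directory = 'D:\tmp';  -- deprecated but available in 2012-era
                                             -- builds; moves temp files off C: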

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-03 Thread Udi Karni
Thanks! I will experiment. I have Windows 7 (64-bit) with 4 GB RAM - so I suspect that 2 GB is the most I can specify anyway. Are there separate 32-bit and 64-bit official releases of the command line shell? On Fri, Feb 3, 2012 at 10:48 AM, Alexey Pechnikov wrote: > You can
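
As a reference point, cache_size is counted in pages, so the RAM it consumes depends on the page size; a sketch of aiming near 2 GB with the era's default 1024-byte page (the figure is illustrative):

    PRAGMA page_size;             -- default was 1024 bytes at the time
    PRAGMA cache_size = 2000000;  -- 2,000,000 pages * 1 KB ~= 2 GB of cache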

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-03 Thread Alexey Pechnikov
You can try a page size of 8192 and a cache size of 128000. Note: vacuuming is required after changing the page size on an existing database. Inserting into or updating big indexed tables may be slow. In recent SQLite versions, index-creation performance has been optimized, so indexing a fully populated table is a good idea
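
Sketching that advice as shell commands (the table and index names are hypothetical):

    PRAGMA page_size = 8192;     -- only takes effect on a new or vacuumed database
    VACUUM;                      -- rebuilds an existing database at the new page size
    PRAGMA cache_size = 128000;  -- 128,000 pages * 8 KB ~= 1 GB of cache
    -- load the data first, then build indexes on the fully populated table:
    CREATE INDEX idx_claims_year ON claims(year);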

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-02 Thread Marcus Grimm
> Marcus - thanks. I will experiment with those 2 PRAGMAs. > Meanwhile - I was trying to update some columns in a table with 130 million rows and got this error - "Error: out of memory" Did you get that after you increased the cache? Strange, I never got that during my huge DB tests.

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-02 Thread Udi Karni
Marcus - thanks. I will experiment with those 2 PRAGMAs. Meanwhile - I was trying to update some columns in a table with 130 million rows and got this error - "Error: out of memory" I am not sure why. I thought the "UPDATE" just writes out new rows and maybe a journal of the old rows for
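
One common workaround (not from the thread) is to slice a huge UPDATE into rowid ranges so the rollback journal, and the memory that tracks it, stays bounded; the table and column names below are hypothetical:

    BEGIN;
    UPDATE claims SET status = 'CLEAN'
     WHERE rowid >= 1 AND rowid < 10000001;  -- one 10-million-row slice
    COMMIT;
    -- repeat with the next rowid window until all 130 million rows are updated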

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-02 Thread Marcus Grimm
> Given how clever and compelling Sqlite is - I am testing how it scales to tables in the 100GB / 200 million row range. This is for a strictly "read only" application - but first the tables must be populated in a one-time process. As is often the case with Big Data - the data is a little

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-02 Thread Udi Karni
Thanks! Good ideas. SSD is still a little exotic price-wise, but closing and defragging is easy to try. Come to think of it - your suggestion to break the process into batches is probably the way to go. By placing each table in its own DB - not only are they smaller and easier to manage -
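
With one table per database file, the pieces can still be queried together via ATTACH; the file, table, and column names here are invented for illustration:

    ATTACH DATABASE 'claims.db'  AS c;
    ATTACH DATABASE 'members.db' AS m;
    SELECT count(*)
      FROM c.claims cl
      JOIN m.members me ON cl.member_id = me.id;  -- up to 10 attached DBs by default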

Re: [sqlite] Speeding up Sqlite reads after DML

2012-02-02 Thread Simon Slavin
On 2 Feb 2012, at 11:31pm, Udi Karni wrote: > Given how clever and compelling Sqlite is - I am testing how it scales to tables in the 100GB / 200 million row range. This is for a strictly "read only" application - but first the tables must be populated in a one-time process. As is often

[sqlite] Speeding up Sqlite reads after DML

2012-02-02 Thread Udi Karni
Given how clever and compelling Sqlite is - I am testing how it scales to tables in the 100GB / 200 million row range. This is for a strictly "read only" application - but first the tables must be populated in a one-time process. As is often the case with Big Data - the data is a little dirty - so
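
For a rebuildable one-time load like this, the usual levers (a sketch, not the poster's settings) are to relax durability and batch the inserts into a single transaction:

    PRAGMA journal_mode = OFF;  -- safe only when the DB can be rebuilt from source
    PRAGMA synchronous = OFF;   -- skip fsync; again, acceptable for a throwaway load
    BEGIN;
    -- bulk INSERT statements or the shell's .import command go here
    COMMIT;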