Watching Windows 7 Resource Monitor (launched from a button on Windows Task
Manager) - I see that SQLite - directly, or through Windows - generates
quite a bit of activity on a temp file located in
C:\users\owner\AppData\Local\Temp - especially when running large joins,
etc. There are large read
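The temp-file activity described above can be steered with the temp_store
pragma. A minimal sketch (the connection and settings here are illustrative,
not from the thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# temp_store controls where SQLite keeps intermediate results
# (sorts, transient indexes built for large joins):
#   0 = compile-time default, 1 = file, 2 = memory.
conn.execute("PRAGMA temp_store = MEMORY")

print(conn.execute("PRAGMA temp_store").fetchone()[0])  # 2
```

Keeping temp storage in memory avoids the Temp-directory traffic, at the
cost of RAM - which matters on a 4 GB machine running 100 GB joins.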
Thanks! I will experiment.
I have Windows 7 (64 bit) with 4GB RAM - so I suspect that 2GB is the most
I can specify anyway.
Are there separate 32-bit and 64-bit official releases of the command-line shell?
On Fri, Feb 3, 2012 at 10:48 AM, Alexey Pechnikov wrote:
> You can
You can try page size 8192 and cache size 128000.
Note: vacuuming is required after changing the page size on an existing
database. Inserting into or updating big indexed tables may be slow. In
recent SQLite versions, index creation performance has been optimized, so
indexing a fully populated table is a good idea.
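The page-size/cache-size advice above can be sketched like this (the database
file and table here are hypothetical; 128000 pages x 8192 bytes is roughly
1 GB of page cache):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "test.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (x)")  # an existing database, default page size
conn.commit()

# Changing page_size on an existing database only takes effect
# once VACUUM rebuilds the file.
conn.execute("PRAGMA page_size = 8192")
conn.isolation_level = None          # VACUUM cannot run inside a transaction
conn.execute("VACUUM")

# cache_size is measured in pages, not bytes.
conn.execute("PRAGMA cache_size = 128000")

print(conn.execute("PRAGMA page_size").fetchone()[0])  # 8192
```

Note that cache_size is a per-connection setting, so it must be reissued on
every new connection.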
> Marcus - thanks. I will experiment with those 2 PRAGMAs.
>
> Meanwhile - I was trying to update some columns in a table with 130
> million
> rows and got this error -
>
> "Error: out of memory"
Did you get that after you increased the cache?
Strange, I never got that during my huge DB tests.
Marcus - thanks. I will experiment with those 2 PRAGMAs.
Meanwhile - I was trying to update some columns in a table with 130 million
rows and got this error -
"Error: out of memory"
I am not sure why. I thought the "UPDATE" just writes out new rows and
maybe a journal of the old rows for
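One way around an out-of-memory UPDATE on a 130-million-row table is to split
it into ranged batches keyed on rowid, so each transaction journals only its
own slice. A small sketch with a hypothetical table (table name, column, and
batch size are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, val INTEGER)")
conn.executemany("INSERT INTO big (val) VALUES (?)",
                 [(i,) for i in range(10_000)])
conn.commit()

BATCH = 1_000  # rows per transaction; tune to available memory

max_id = conn.execute("SELECT max(id) FROM big").fetchone()[0]
for lo in range(1, max_id + 1, BATCH):
    conn.execute(
        "UPDATE big SET val = val * 2 WHERE id BETWEEN ? AND ?",
        (lo, lo + BATCH - 1),
    )
    conn.commit()  # each commit journals only this batch's old rows
```

This assumes ids are reasonably dense; a sparse key just makes some batches
smaller, which is harmless.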
> Given how clever and compelling Sqlite is - I am testing how it scales to
> tables in the 100GB / 200 million row range. This is for a strictly "read
> only" application - but first the tables must be populated in a one time
> process. As is often the case with Big Data - the data is a little
Thanks! Good ideas. SSD is still a little exotic price-wise, but closing
and defragging is easy to try.
Come to think of it - your suggestion to break the process into batches -
is probably the way to go. By placing each table in its own DB - not only
are they smaller and easier to manage -
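The one-table-per-DB idea works at query time too, since ATTACH DATABASE
stitches the files back together. A minimal sketch with hypothetical
file names and schema:

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
orders_db = os.path.join(tmp, "orders.db")
customers_db = os.path.join(tmp, "customers.db")

# Populate each table in its own database file.
with sqlite3.connect(orders_db) as c:
    c.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust INTEGER)")
    c.execute("INSERT INTO orders (cust) VALUES (1), (2)")
with sqlite3.connect(customers_db) as c:
    c.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    c.execute("INSERT INTO customers (name) VALUES ('a'), ('b')")

# At read time, ATTACH lets one connection join across the files.
conn = sqlite3.connect(orders_db)
conn.execute("ATTACH DATABASE ? AS cust", (customers_db,))
rows = conn.execute(
    "SELECT o.id, c.name FROM orders o "
    "JOIN cust.customers c ON o.cust = c.id"
).fetchall()
print(rows)  # [(1, 'a'), (2, 'b')]
```

Each per-table file can also be populated, vacuumed, or rebuilt
independently, which keeps the one-time load restartable.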
On 2 Feb 2012, at 11:31pm, Udi Karni wrote:
> Given how clever and compelling Sqlite is - I am testing how it scales to
> tables in the 100GB / 200 million row range. This is for a strictly "read
> only" application - but first the tables must be populated in a one time
> process. As is often
Given how clever and compelling Sqlite is - I am testing how it scales to
tables in the 100GB / 200 million row range. This is for a strictly "read
only" application - but first the tables must be populated in a one time
process. As is often the case with Big Data - the data is a little dirty -
so