On Tue, 14 Dec 2004 12:03:01 -0700 (MST), Ara.T.Howard
<[EMAIL PROTECTED]> wrote:
> On Tue, 14 Dec 2004, Christopher Petrilli wrote:
> 
> > Has anyone had any experience storing a million or more rows in a
> > SQLite3 database?  I've got a database that I've been building, which
> > gets roughly 250 inserts/second and has about 3M rows in it.  At that
> > point, the CPU load is huge.
> >
> > Note that I've got syncing turned off, because I'm willing to accept
> > the risks.
> >
> > Thoughts?
> >
> > Chris
> >
> > --
> > | Christopher Petrilli
> > | [EMAIL PROTECTED]
> 
> on linux perhaps?
> 
>    cp ./db /dev/shm && a.out /dev/shm/db && mv /dev/shm/db ./db
> 
> this will be fast.

Right, but not really workable when total DB size is in gigabytes. :-)

> are you sure it's not YOUR 'building' code that's killing the cpu?  can you
> gprof it?

Yes, my code is using under 20% of the CPU.  The rest is spent in
sqlite3 code and kernel time.  To eliminate any possibility of my code
being the issue, I built a rig that prebuilds 10,000 rows and inserts
them in sequence repeatedly, assigning new primary keys to them as it
goes along.  So the system basically just runs in a loop making
sqlite3 calls.
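For reference, a minimal sketch of that kind of rig against the sqlite3
C API might look like the following.  The table name, schema, payload
string, and the one-transaction-per-pass batching are all made up for
illustration; the real rig's schema isn't shown here.

    /* rig.c -- insert prebuilt rows in a loop; build: cc rig.c -lsqlite3 */
    #include <sqlite3.h>

    int main(void)
    {
        sqlite3 *db;
        sqlite3_stmt *stmt;
        sqlite3_int64 pk = 0;
        int batch, i;

        if (sqlite3_open("test.db", &db) != SQLITE_OK) return 1;

        /* syncing turned off, as in the test described above */
        sqlite3_exec(db, "PRAGMA synchronous = OFF", 0, 0, 0);
        sqlite3_exec(db, "CREATE TABLE IF NOT EXISTS t "
                         "(id INTEGER PRIMARY KEY, payload TEXT)", 0, 0, 0);

        sqlite3_prepare_v2(db, "INSERT INTO t VALUES (?, ?)", -1, &stmt, 0);

        for (batch = 0; batch < 300; batch++) {      /* 300 x 10,000 = 3M rows */
            sqlite3_exec(db, "BEGIN", 0, 0, 0);
            for (i = 0; i < 10000; i++) {            /* the prebuilt 10,000 rows */
                sqlite3_bind_int64(stmt, 1, ++pk);   /* fresh primary key each pass */
                sqlite3_bind_text(stmt, 2, "prebuilt row", -1, SQLITE_STATIC);
                sqlite3_step(stmt);
                sqlite3_reset(stmt);
            }
            sqlite3_exec(db, "COMMIT", 0, 0, 0);
        }

        sqlite3_finalize(stmt);
        sqlite3_close(db);
        return 0;
    }

With the statement prepared once, the hot path is just bind/step/reset,
so any CPU growth as the table fills points at SQLite itself rather
than the application side.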

Chris
-- 
| Christopher Petrilli
| [EMAIL PROTECTED]
