Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-25 Thread Kees Nuyt
On Mon, 25 Aug 2008 12:26:18 -0700, you wrote:

>Interesting, I just tried that in my test application and Dennis's and I
>get access violations during the vacuum command execution when trying to
>resize the pages from 1k to 4k with my database or Dennis's test
>database.

I just used the command line tool, exactly as shown.
Platform: MS Windows Vista Ultimate SP1.

My previous test was on a small database and SQLite 3.6.0.
The run below is with SQLite 3.6.1 and a bigger database:

\research>copy \data\opt\fos\repo\fossil tmp

\research>dir tmp\fossil

2008-08-25  15:37 4.583.424 fossil

\research>%sqlite% "tmp/fossil"
SQLite version 3.6.1
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> PRAGMA page_size;
1024
sqlite> select count(*) from blob;
3374
sqlite> PRAGMA page_size=8192;
sqlite> VACUUM;
sqlite> PRAGMA page_size;
8192
sqlite> select count(*) from blob;
3374
sqlite> .q

\research>%sqlite% "tmp/fossil"
SQLite version 3.6.1
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> PRAGMA page_size;
8192
sqlite> select count(*) from blob;
3374
sqlite> .q

\research>dir tmp\fossil

2008-08-25  23:43 4.775.936 fossil

So, it works.
Perhaps there is an essential difference between your test
program and the command line tool?
-- 
  (  Kees Nuyt
  )
c[_]
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-25 Thread Dennis Cote
Brown, Daniel wrote:
> Interesting, I just tried that in my test application and Dennis's and I
> get access violations during the vacuum command execution when trying to
> resize the pages from 1k to 4k with my database or Dennis's test
> database.
> 

Daniel,

I have found that sqlite works correctly if your main database is a 
file, but crashes when you try to vacuum with a :memory: database as 
your main database.

The vacuum command does not work on attached databases, so you must open 
the file to be vacuumed as your main database.
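A minimal sketch of that approach (not from this thread's test program;
'data.db' is a placeholder and error handling is trimmed), combined with the
page_size change discussed elsewhere in this thread:

#include <cstdio>
#include "sqlite3.h"

int main()
{
    sqlite3* db = NULL;

    // the file database itself is the MAIN database here, not :memory:
    if (sqlite3_open("data.db", &db) != SQLITE_OK)
    {
        printf("open failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    // set the new page size just before VACUUM so the rebuilt
    // database uses it
    sqlite3_exec(db, "PRAGMA page_size=4096;", NULL, NULL, NULL);
    sqlite3_exec(db, "VACUUM;", NULL, NULL, NULL);

    sqlite3_close(db);
    return 0;
}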

This crash is a bug that should probably be reported at 
http://www.sqlite.org/cvstrac/captcha?nxp=/cvstrac/tktnew

HTH
Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-25 Thread Brown, Daniel
Interesting, I just tried that in my test application and Dennis's and I
get access violations during the vacuum command execution when trying to
resize the pages from 1k to 4k with my database or Dennis's test
database.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Kees Nuyt
Sent: Saturday, August 23, 2008 8:51 AM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

On Thu, 21 Aug 2008 16:29:10 -0700, you wrote:

>How do I rebuild a database file for another page size
>or did the pragma do that already?

Use PRAGMA page_size={your_new_pagesize} immediately before
a vacuum. It will change the page size of the vacuumed
database. See:

http://www.sqlite.org/pragma.html#pragma_page_size

"As of version 3.5.8, if the page_size pragma is used
to specify a new page size just prior to running the
VACUUM command then VACUUM will change the page size
to the new value."

Demo:
sqlite_version():3.6.0
--
-- new database
PRAGMA page_size=8192;
BEGIN;
CREATE TABLE test (
x integer primary key, 
y text
);
INSERT INTO test (y) VALUES ('row1');
INSERT INTO test (y) VALUES ('row2');
COMMIT;
PRAGMA page_size;
8192
PRAGMA schema_version;
1
PRAGMA page_size=1024;
VACUUM;
PRAGMA schema_version;
2
PRAGMA page_size;
1024
-- 
  (  Kees Nuyt
  )
c[_]
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-23 Thread Kees Nuyt
On Thu, 21 Aug 2008 16:29:10 -0700, you wrote:

>How do I rebuild a database file for another page size
>or did the pragma do that already?

Use PRAGMA page_size={your_new_pagesize} immediately before
a vacuum. It will change the page size of the vacuumed
database. See:

http://www.sqlite.org/pragma.html#pragma_page_size

"As of version 3.5.8, if the page_size pragma is used
to specify a new page size just prior to running the
VACUUM command then VACUUM will change the page size
to the new value."

Demo:
sqlite_version():3.6.0
--
-- new database
PRAGMA page_size=8192;
BEGIN;
CREATE TABLE test (
x integer primary key, 
y text
);
INSERT INTO test (y) VALUES ('row1');
INSERT INTO test (y) VALUES ('row2');
COMMIT;
PRAGMA page_size;
8192
PRAGMA schema_version;
1
PRAGMA page_size=1024;
VACUUM;
PRAGMA schema_version;
2
PRAGMA page_size;
1024
-- 
  (  Kees Nuyt
  )
c[_]
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Jeffrey Becker
OK, I couldn't find the script, but here's something to get started.

On Fri, Aug 22, 2008 at 6:02 PM, Brown, Daniel <[EMAIL PROTECTED]> wrote:
> That script would be great :)
>
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Jeffrey Becker
> Sent: Friday, August 22, 2008 1:31 PM
> To: General Discussion of SQLite Database
> Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)
>
> At one point I wrote some vbscript to generate a table declaration and
> insert statements for a csv.  I might be able to dig it up if you dont
> mind vbscript.
>
> On Fri, Aug 22, 2008 at 1:58 PM, Brown, Daniel <[EMAIL PROTECTED]> wrote:
>> I just ran Dennis's test databases through the test application and
>> we're getting similar results:
>>1k Pages (17.4 MB) used 18102 KB High 20416 KB
>>4k Pages (12.2 MB) used 18102 KB, High 26416 KB (not sure why
>> the high is higher?)
>> My test database however with the same test application produces the
>> following:
>>1k Pages (7.46 MB) used 22735 KB, High 25138 KB.
>>
>> So it looks my issue could be data related if my test database going
>> through the same app is coming out so large, Dennis's database is
>> expanding to about 101.6% of its original size but mine is expanding
> to
>> 297.6% of its original size.  This begs the question is the 3rd party
>> tool (SQLite Analyzer) I'm using to import from an excel file causing
>> this expansion with bad data type choices?  And is there any other way
>> to import table structure and contents from xls (or csv) to sqlite?
>>
>> -Original Message-
>> From: [EMAIL PROTECTED]
>> [mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
>> Sent: Friday, August 22, 2008 7:10 AM
>> To: General Discussion of SQLite Database
>> Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)
>>
>> Brown, Daniel wrote:
>>> 2. And the other thing to try would be if anyone has a fairly meaty
>> test
>>> database they don't mind sharing that I could fling at my test
>>> application to try and rule out the data?
>>>
>>
>> Daniel,
>>
>> I can send you copies of the databases I am using for my testing, both
>> the version with the 1K page size (17.4 MB) and the one with the 4K
> page
>>
>> size (12.2 MB).
>>
>> Where would you like me to send them? The zipped versions are each
> about
>>
>> 1.3 MB in size.
>>
>> Dennis Cote
>> ___
>> sqlite-users mailing list
>> sqlite-users@sqlite.org
>> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>> ___
>> sqlite-users mailing list
>> sqlite-users@sqlite.org
>> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>>
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>
Class FieldInfo

    Public Name

    private m_TextCount
    private m_BlobCount
    private m_IntCount
    private m_FloatCount
    private m_DateCount

    Public Property Get DataType
        dim t, c

        t = "NULL"
        c = 0

        if (m_DateCount > c) then
            t = "Date"
            c = m_DateCount
        end if

        if (m_BlobCount > c) then
            t = "Binary"
            c = m_BlobCount
        end if

        if (m_FloatCount > c) then
            t = "Real"
            c = m_FloatCount
        end if

        if (m_IntCount > c) then
            t = "Integer"
            c = m_IntCount
        end if

        if (m_TextCount > c) then
            t = "Text"
            c = m_TextCount
        end if

        if c = 0 then
            t = "Text"
        end if

        DataType = t
    End Property

    Private Sub Class_Initialize
        m_TextCount = 0
        m_BlobCount = 0
        m_IntCount = 0
        m_FloatCount = 0
        m_DateCount = 0
    End Sub

End Class

Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Brown, Daniel
That script would be great :)

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Jeffrey Becker
Sent: Friday, August 22, 2008 1:31 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

At one point I wrote some vbscript to generate a table declaration and
insert statements for a csv.  I might be able to dig it up if you dont
mind vbscript.

On Fri, Aug 22, 2008 at 1:58 PM, Brown, Daniel <[EMAIL PROTECTED]> wrote:
> I just ran Dennis's test databases through the test application and
> we're getting similar results:
>1k Pages (17.4 MB) used 18102 KB High 20416 KB
>4k Pages (12.2 MB) used 18102 KB, High 26416 KB (not sure why
> the high is higher?)
> My test database however with the same test application produces the
> following:
>1k Pages (7.46 MB) used 22735 KB, High 25138 KB.
>
> So it looks my issue could be data related if my test database going
> through the same app is coming out so large, Dennis's database is
> expanding to about 101.6% of its original size but mine is expanding
to
> 297.6% of its original size.  This begs the question is the 3rd party
> tool (SQLite Analyzer) I'm using to import from an excel file causing
> this expansion with bad data type choices?  And is there any other way
> to import table structure and contents from xls (or csv) to sqlite?
>
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
> Sent: Friday, August 22, 2008 7:10 AM
> To: General Discussion of SQLite Database
> Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)
>
> Brown, Daniel wrote:
>> 2. And the other thing to try would be if anyone has a fairly meaty
> test
>> database they don't mind sharing that I could fling at my test
>> application to try and rule out the data?
>>
>
> Daniel,
>
> I can send you copies of the databases I am using for my testing, both
> the version with the 1K page size (17.4 MB) and the one with the 4K
page
>
> size (12.2 MB).
>
> Where would you like me to send them? The zipped versions are each
about
>
> 1.3 MB in size.
>
> Dennis Cote
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Jeffrey Becker
At one point I wrote some VBScript to generate a table declaration and
insert statements for a CSV.  I might be able to dig it up if you don't
mind VBScript.

On Fri, Aug 22, 2008 at 1:58 PM, Brown, Daniel <[EMAIL PROTECTED]> wrote:
> I just ran Dennis's test databases through the test application and
> we're getting similar results:
>1k Pages (17.4 MB) used 18102 KB High 20416 KB
>4k Pages (12.2 MB) used 18102 KB, High 26416 KB (not sure why
> the high is higher?)
> My test database however with the same test application produces the
> following:
>1k Pages (7.46 MB) used 22735 KB, High 25138 KB.
>
> So it looks my issue could be data related if my test database going
> through the same app is coming out so large, Dennis's database is
> expanding to about 101.6% of its original size but mine is expanding to
> 297.6% of its original size.  This begs the question is the 3rd party
> tool (SQLite Analyzer) I'm using to import from an excel file causing
> this expansion with bad data type choices?  And is there any other way
> to import table structure and contents from xls (or csv) to sqlite?
>
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
> Sent: Friday, August 22, 2008 7:10 AM
> To: General Discussion of SQLite Database
> Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)
>
> Brown, Daniel wrote:
>> 2. And the other thing to try would be if anyone has a fairly meaty
> test
>> database they don't mind sharing that I could fling at my test
>> application to try and rule out the data?
>>
>
> Daniel,
>
> I can send you copies of the databases I am using for my testing, both
> the version with the 1K page size (17.4 MB) and the one with the 4K page
>
> size (12.2 MB).
>
> Where would you like me to send them? The zipped versions are each about
>
> 1.3 MB in size.
>
> Dennis Cote
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Wilson, Ron P
You can try using the command line tool to import csv data:

sqlite> .mode csv
sqlite> .import yourdata.csv yourtablename

However, if you have quoted strings with embedded commas it won't work.
You can try a different delimiter with the .separator command.

RW

sqlite>select level from sqlGuruOMeter where name="Ron Wilson";
2

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Brown, Daniel
Sent: Friday, August 22, 2008 1:59 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

I just ran Dennis's test databases through the test application and
we're getting similar results:
1k Pages (17.4 MB) used 18102 KB High 20416 KB
4k Pages (12.2 MB) used 18102 KB, High 26416 KB (not sure why
the high is higher?)
My test database however with the same test application produces the
following:
1k Pages (7.46 MB) used 22735 KB, High 25138 KB.

So it looks my issue could be data related if my test database going
through the same app is coming out so large, Dennis's database is
expanding to about 101.6% of its original size but mine is expanding to
297.6% of its original size.  This begs the question is the 3rd party
tool (SQLite Analyzer) I'm using to import from an excel file causing
this expansion with bad data type choices?  And is there any other way
to import table structure and contents from xls (or csv) to sqlite?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
Sent: Friday, August 22, 2008 7:10 AM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

Brown, Daniel wrote:
> 2. And the other thing to try would be if anyone has a fairly meaty
test
> database they don't mind sharing that I could fling at my test
> application to try and rule out the data?
> 

Daniel,

I can send you copies of the databases I am using for my testing, both 
the version with the 1K page size (17.4 MB) and the one with the 4K page

size (12.2 MB).

Where would you like me to send them? The zipped versions are each about

1.3 MB in size.

Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Dennis Cote
Brown, Daniel wrote:
> I just ran Dennis's test databases through the test application and
> we're getting similar results:
>   1k Pages (17.4 MB) used 18102 KB High 20416 KB
>   4k Pages (12.2 MB) used 18102 KB, High 26416 KB (not sure why
> the high is higher?)
> My test database however with the same test application produces the
> following:
>   1k Pages (7.46 MB) used 22735 KB, High 25138 KB.
> 

It's good to see you are getting the same results as me using my 
databases. That rules out your build of sqlite and the build tools. It 
does look like your issue has to do with your data.

> So it looks my issue could be data related if my test database going
> through the same app is coming out so large, Dennis's database is
> expanding to about 101.6% of its original size but mine is expanding to
> 297.6% of its original size.  This begs the question is the 3rd party
> tool (SQLite Analyzer) I'm using to import from an excel file causing
> this expansion with bad data type choices?  And is there any other way
> to import table structure and contents from xls (or csv) to sqlite?
> 

You could write one yourself in Python using the csv reader and the 
pysqlite modules to read CSV files saved from excel and save the data 
into an sqlite database. I doubt that will change your data much though.

Can you publish the schema of your database, and some typical data? It 
may be a case of storing integers as text or something similar that is 
causing the unexpectedly large expansion.
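One rough way to check (a sketch only; the file, table, and column names are
placeholders): use SQLite's typeof() function to see how the values in a
suspect column are actually stored.

#include <cstdio>
#include "sqlite3.h"

int main()
{
    sqlite3* db = NULL;
    sqlite3_open("test.db", &db);        // placeholder file name

    sqlite3_stmt* stmt = NULL;
    sqlite3_prepare_v2(db,
        "SELECT typeof(some_column), count(*) FROM some_table "
        "GROUP BY typeof(some_column);",
        -1, &stmt, NULL);

    // prints one line per storage class (integer, real, text, blob, null)
    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("%s\t%d\n", sqlite3_column_text(stmt, 0),
               sqlite3_column_int(stmt, 1));

    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}

A column full of 'text' values that should be integers would go a long way
toward explaining a large on-disk and in-memory size.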

Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Brown, Daniel
I just ran Dennis's test databases through the test application and
we're getting similar results:
1k Pages (17.4 MB) used 18102 KB High 20416 KB
4k Pages (12.2 MB) used 18102 KB, High 26416 KB (not sure why
the high is higher?)
My test database however with the same test application produces the
following:
1k Pages (7.46 MB) used 22735 KB, High 25138 KB.

So it looks like my issue could be data related, since my test database
going through the same app comes out so large: Dennis's database expands
to about 101.6% of its original size, but mine expands to 297.6% of its
original size.  This begs the question: is the third-party tool (SQLite
Analyzer) I'm using to import from an Excel file causing this expansion
with bad data type choices?  And is there any other way to import table
structure and contents from .xls (or CSV) into SQLite?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
Sent: Friday, August 22, 2008 7:10 AM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

Brown, Daniel wrote:
> 2. And the other thing to try would be if anyone has a fairly meaty
test
> database they don't mind sharing that I could fling at my test
> application to try and rule out the data?
> 

Daniel,

I can send you copies of the databases I am using for my testing, both 
the version with the 1K page size (17.4 MB) and the one with the 4K page

size (12.2 MB).

Where would you like me to send them? The zipped versions are each about

1.3 MB in size.

Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Dennis Cote
Brown, Daniel wrote:
> Ok so after reading your feedback I tried:
> 1. "PRAGMA cache_size =10" no change in memory usage.
> 2. "PRAGMA page_size = 4096" no change in memory usage.
> 
> I'm doing both those queries (in C++) after the 'sqlite3_open(
> ":memory:", _pDataBase );' in my test but before the database file is
> attached or anything is copied or created.  The rebuilt database file is
> 7.46 MB and the memory usage I'm seeing is now: 22.20 MB with a high
> water of 24.55 MB as reported by the sqlite_memory_* functions.  
> 

I believe each database has its own cache. You need to set the cache 
size for the file database after it is attached.

   attach 'file.db' as file_db;
   pragma file_db.cache_size = 100;

In my tests I am setting the cache size for both the memory database 
(right after the open call) and the file database.
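A minimal sketch of the same thing through the C API (the attached file name
is a placeholder; return codes are not checked):

#include "sqlite3.h"

int main()
{
    sqlite3* db = NULL;
    sqlite3_open(":memory:", &db);

    // cache for the main (:memory:) database, right after opening it
    sqlite3_exec(db, "PRAGMA cache_size=100;", NULL, NULL, NULL);

    // attach the file database, then set its cache separately,
    // qualifying the pragma with the attached database's name
    sqlite3_exec(db, "ATTACH DATABASE 'file.db' AS file_db;", NULL, NULL, NULL);
    sqlite3_exec(db, "PRAGMA file_db.cache_size=100;", NULL, NULL, NULL);

    sqlite3_close(db);
    return 0;
}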

> I'm not using the amalgamation version of the pre-processed source; I'm
> using the individual source files of 3.6.1 on Windows XP with Visual
> Studio 2005.  I'm afraid I can't give you a copy of the test database as
> it's a drop from a live product, could it be using the third party
> SQLite Analyzer application to import from excel be the issue?  

I doubt it, but anything is possible.

> Are
> there any other tools from importing from a .xls to a SQLite database
> (converting each sheet to a table)?
> 

I don't know.

> I just tried a "vacuum" after I detach the database from file and that
> didn't reduce the memory usage either but it did double the high water
> mark which after reading the documentation sounds about right for making
> a temporary copy.
> 
> How do I rebuild a database file for another page size or did the pragma
> do that already?
> 

I used your test program to do that for my database. :-)

I simply changed the filename of the output database from :memory: to my 
new filename in the open call, and then executed a "pragma 
page_size=4096" immediately after the open. The rest of your code copied 
all the tables in the test database out to the new database file with 
the new page size. This database doesn't use any named (i.e., 
non-automatic) indexes or triggers, so there was nothing else to be copied.
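Condensed to a sketch (file and table names are placeholders; the real loop
over sqlite_master is in the test code quoted elsewhere in this thread):

#include "sqlite3.h"

int main()
{
    sqlite3* db = NULL;

    // 1. open a new FILE instead of ":memory:"
    sqlite3_open("rebuilt_4k.db", &db);

    // 2. set the page size before anything is written to the new file
    sqlite3_exec(db, "PRAGMA page_size=4096;", NULL, NULL, NULL);

    // 3. attach the original database and copy each table across,
    //    as in the test program (one table shown here)
    sqlite3_exec(db, "ATTACH DATABASE 'original.db' AS src;", NULL, NULL, NULL);
    sqlite3_exec(db, "CREATE TABLE some_table AS SELECT * FROM src.some_table;",
                 NULL, NULL, NULL);
    sqlite3_exec(db, "DETACH DATABASE src;", NULL, NULL, NULL);

    sqlite3_close(db);
    return 0;
}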

Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-22 Thread Dennis Cote
Brown, Daniel wrote:
> 2. And the other thing to try would be if anyone has a fairly meaty test
> database they don't mind sharing that I could fling at my test
> application to try and rule out the data?
> 

Daniel,

I can send you copies of the databases I am using for my testing, both 
the version with the 1K page size (17.4 MB) and the one with the 4K page 
size (12.2 MB).

Where would you like me to send them? The zipped versions are each about 
1.3 MB in size.

Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Brown, Daniel
Thanks I'll try the fossil database tomorrow morning. 

I just tried creating a new test application using Visual Studios built
in C++ wizards (not our proprietary system) using Dennis's test
application source code with a copy of the 3.6.1 amalgamation and my
test database.  I'm still seeing a 7645 KB database file turn into 22735
KB with a high of 25138 KB at runtime.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Thursday, August 21, 2008 5:24 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 21, 2008, at 7:33 PM, Brown, Daniel wrote:

>  if anyone has a fairly meaty test
> database they don't mind sharing that I could fling at my test
> application...

A project repository with "fossil" is a (meaty) SQLite database.  You  
could (for example) go clone one of the various fossil projects out  
there and use that as your test database.  First download a
pre-compiled fossil binary:

 http://www.fossil-scm.org/download.html

Then clone a repository:

 http://www.sqlite.org/experimental (~5MB)
 http://www.sqlite.org/docsrc (~2MB)
 http://www.fossil-scm.org/index.html  (~6MB)

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread D. Richard Hipp

On Aug 21, 2008, at 7:33 PM, Brown, Daniel wrote:

>  if anyone has a fairly meaty test
> database they don't mind sharing that I could fling at my test
> application...

A project repository with "fossil" is a (meaty) SQLite database.  You  
could (for example) go clone one of the various fossil projects out  
there and use that as your test database.  First download a
pre-compiled fossil binary:

 http://www.fossil-scm.org/download.html

Then clone a repository:

 http://www.sqlite.org/experimental (~5MB)
 http://www.sqlite.org/docsrc (~2MB)
 http://www.fossil-scm.org/index.html  (~6MB)

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Brown, Daniel
1. Is anyone else on the list using Visual Studio 2005?  It would be
handy to see if they got similar results with the test application or
not; that would rule out the build environment to an extent.  
2. And the other thing to try would be if anyone has a fairly meaty test
database they don't mind sharing that I could fling at my test
application to try and rule out the data?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
Sent: Thursday, August 21, 2008 4:23 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

Nicolas Williams wrote:
> 
> I thought the DB was 9MB; forgive me for wasting your time then.  If
> it's 17.4MB then the memory usage seems a lot more reasonable.

Daniel's (the OP's) database is 9 MB. I don't have his database file, but 
I do have his test code. I used a database of my own that is a similar 
size along with his test code to do my tests.

In my tests sqlite behaves as expected. Daniel is seeing much higher 
memory usage reported from sqlite itself using the same version of 
sqlite, the same test code, and the same OS.

I see a memory usage of about 18 MB for a database copied from a file 
that is 17.4 MB (1K pages) or 12.2 MB (4K pages). I get an expansion 
factor of 1.03 or 1.48.

Daniel is seeing memory usage of 22.2 MB for a database copied from a 
file that is 9 MB. Daniel gets an expansion factor of 2.47. This seems
high.

Since the major difference seems to be the database file we are copying,
I would like to repeat his test with his database file if possible. If 
not possible (perhaps the data is proprietary or personal), then it 
might make sense to see what factors affect this memory expansion ratio.

I was surprised by the magnitude of the change in the size of my 
database file by simply changing the page size. I also tried to change 
the page size used for the memory database, but that had no effect 
(which is not what I expected; perhaps the page size pragma is ignored 
for memory databases). Changing the cache size reduced the highwater 
memory requirement, but didn't change the memory required to hold the 
database after the copy was completed.

Dennis Cote


___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Brown, Daniel
Ok so after reading your feedback I tried:
1. "PRAGMA cache_size =10" no change in memory usage.
2. "PRAGMA page_size = 4096" no change in memory usage.

I'm doing both those queries (in C++) after the 'sqlite3_open(
":memory:", _pDataBase );' in my test but before the database file is
attached or anything is copied or created.  The rebuilt database file is
7.46 MB and the memory usage I'm seeing is now: 22.20 MB with a high
water of 24.55 MB as reported by the sqlite_memory_* functions.  

I'm not using the amalgamation version of the pre-processed source; I'm
using the individual source files of 3.6.1 on Windows XP with Visual
Studio 2005.  I'm afraid I can't give you a copy of the test database as
it's a drop from a live product, could it be using the third party
SQLite Analyzer application to import from excel be the issue?  Are
there any other tools from importing from a .xls to a SQLite database
(converting each sheet to a table)?

I just tried a "vacuum" after I detach the database from file and that
didn't reduce the memory usage either but it did double the high water
mark which after reading the documentation sounds about right for making
a temporary copy.

How do I rebuild a database file for another page size or did the pragma
do that already?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Nicolas Williams
Sent: Thursday, August 21, 2008 3:13 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

On Thu, Aug 21, 2008 at 04:02:32PM -0600, Dennis Cote wrote:
> I built a copy of my test database using a 4096 byte page size and it 
> reduced the database file size from 17.5 MB to 12.2 MB. When I repeat 
> the tests using this database file I get the same 18102 KB of memory 
> used, but a slightly higher highwater mark of 26418 KB used. Again
this 
> was all done with the default page cache size.
> 
> Note, I also tried to vacuum the original file to see if there were a 
> lot of unused pages in the original 17.5 MB file. After the vacuum it 
> was reduced to only 17.4 MB, so there were very few free pages in the 
> database. This database just fits much better on the larger 4K pages.

I thought the DB was 9MB; forgive me for wasting your time then.  If
it's 17.4MB then the memory usage seems a lot more reasonable.
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Dennis Cote
Nicolas Williams wrote:
> 
> I thought the DB was 9MB; forgive me for wasting your time then.  If
> it's 17.4MB then the memory usage seems a lot more reasonable.

Daniel's (the OP's) database is 9 MB. I don't have his database file, but 
I do have his test code. I used a database of my own that is a similar 
size along with his test code to do my tests.

In my tests sqlite behaves as expected. Daniel is seeing much higher 
memory usage reported from sqlite itself using the same version of 
sqlite, the same test code, and the same OS.

I see a memory usage of about 18 MB for a database copied from a file 
that is 17.4 MB (1K pages) or 12.2 MB (4K pages). I get an expansion 
factor of 1.03 or 1.48.

Daniel is seeing memory usage of 22.2 MB for a database copied from a 
file that is 9 MB. Daniel gets an expansion factor of 2.47. This seems high.

Since the major difference seems to be the database file we are copying, 
I would like to repeat his test with his database file if possible. If 
not possible (perhaps the data is proprietary or personal), then it 
might make sense to see what factors affect this memory expansion ratio.

I was surprised by the magnitude of the change in the size of my 
database file by simply changing the page size. I also tried to change 
the page size used for the memory database, but that had no effect 
(which is not what I expected; perhaps the page size pragma is ignored 
for memory databases). Changing the cache size reduced the highwater 
memory requirement, but didn't change the memory required to hold the 
database after the copy was completed.

Dennis Cote


___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Nicolas Williams
On Thu, Aug 21, 2008 at 04:02:32PM -0600, Dennis Cote wrote:
> I built a copy of my test database using a 4096 byte page size and it 
> reduced the database file size from 17.5 MB to 12.2 MB. When I repeat 
> the tests using this database file I get the same 18102 KB of memory 
> used, but a slightly higher highwater mark of 26418 KB used. Again this 
> was all done with the default page cache size.
> 
> Note, I also tried to vacuum the original file to see if there were a 
> lot of unused pages in the original 17.5 MB file. After the vacuum it 
> was reduced to only 17.4 MB, so there were very few free pages in the 
> database. This database just fits much better on the larger 4K pages.

I thought the DB was 9MB; forgive me for wasting your time then.  If
it's 17.4MB then the memory usage seems a lot more reasonable.
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Dennis Cote
Nicolas Williams wrote:
> 
> I wonder too, what does the page cache do when doing full table scans?
> If the cache has an LRU/LFU page eviction algorithm then full table
> scans should not be a big deal.  Ideally it should not allow pages read
> during a full table scan to push out other pages, but if the cache is
> cold then a full table scan just might fill the cache.
> 
> In this case we have full table scans in the process of copying on-disk
> tables to a memory DB.  And it looks like the cache is cold in this
> case.
> 
> The default cache size is 2000 pages (there's a way to persist a cache
> size).  The default page size is 1024 bytes, but it's hard to tell
> exactly what it is in the OP's case.
> 
> So it looks like the cache size should be ~20MB.  And the DB size is
> ~9MB.  The cache is plenty large enough to hold a copy of the on-disk
> DB.
> 
> So we have: 9MB will be consumed in the page cache, and 9MB will be
> consumed by the memory DB (I assume memory DB pages aren't cached).  Add
> in the overhead per-page, which seems to be .5KB, and you have
> 
> 18MB + 9MB / 2 = 22.5MB
> 
> That seems close to what the OP claimed.
> 

But my testing using the same code, but a different database file, uses 
only slightly more memory than required to hold the database file. This 
code uses the default cache size of 2000 pages.

I built a copy of my test database using a 4096 byte page size and it 
reduced the database file size from 17.5 MB to 12.2 MB. When I repeat 
the tests using this database file I get the same 18102 KB of memory 
used, but a slightly higher highwater mark of 26418 KB used. Again this 
was all done with the default page cache size.

Note, I also tried to vacuum the original file to see if there were a 
lot of unused pages in the original 17.5 MB file. After the vacuum it 
was reduced to only 17.4 MB, so there were very few free pages in the 
database. This database just fits much better on the larger 4K pages.

Changing the cache size from the default 2000 to a much smaller 100 
pages reduced the highwater mark to 18544 KB, which is only slightly 
higher than the 18102 KB of memory used after the table is built. The 
actual memory used is exactly the same (as expected since it is storing 
the same tables).

Dennis Cote



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Dennis Cote
Jeffrey Becker wrote:
> Just out of curiosity what happens if you call
> "PRAGMA page_size=4096"
> before running the import?
> 

As I expected, it has no effect. The page size pragma only affects the 
:memory: database he is copying into. The page size of the database file 
was set when it was created.

Daniel, can you run a "pragma page_size;" query on your database and 
let us know the results?

Dennis Cote
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Nicolas Williams
On Thu, Aug 21, 2008 at 10:32:23AM -0400, Jeffrey Becker wrote:
> Just out of curiosity what happens if you call
> "PRAGMA page_size=4096"
> before running the import?

I wonder too, what does the page cache do when doing full table scans?
If the cache has an LRU/LFU page eviction algorithm then full table
scans should not be a big deal.  Ideally it should not allow pages read
during a full table scan to push out other pages, but if the cache is
cold then a full table scan just might fill the cache.

In this case we have full table scans in the process of copying on-disk
tables to a memory DB.  And it looks like the cache is cold in this
case.

The default cache size is 2000 pages (there's a way to persist a cache
size).  The default page size is 1024 bytes, but it's hard to tell
exactly what it is in the OP's case.

So it looks like the cache size should be ~20MB.  And the DB size is
~9MB.  The cache is plenty large enough to hold a copy of the on-disk
DB.

So we have: 9MB will be consumed in the page cache, and 9MB will be
consumed by the memory DB (I assume memory DB pages aren't cached).  Add
in the overhead per-page, which seems to be .5KB, and you have

18MB + 9MB / 2 = 22.5MB

That seems close to what the OP claimed.

Perhaps to keep the memory footprint of SQLite3 low the OP should set
the cache size way down during the copy-the-DB-into-memory part of the
program using the cache_size pragma.

I could be way off-track, but try it and see.

Nico
-- 
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Dennis Cote
Brown, Daniel wrote:
> 
> I just tried with rebuilt data from SQLite Analyzer in SQLite 3.6.1, I'm
> still seeing a memory usage that is roughly three times the size of the
> source database file, looking at your changes to my test there doesn't
> seem to be any fixes that would resolve that.
> 

No, I don't think any of my changes would have changed the behaviour of 
your program (except for getting the correct memory values displayed).

> I can see the memory being released when I close the SQLite database in
> the teardown stage of my test, so I'm fairly sure the memory is being
> used by SQLite and the built in memory profiling would seem to support
> that.  I haven't had to make any changes locally to get the PC version
> of 3.6.1 compiling so I don't think that is the issue, could it be some
> sort of configuration or library issue?  I'm building in Visual Studio
> 2005 SP1.
> 

Can you provide a copy of the database file you are using? If so they 
usually compress quite well using a zip utility.

Dennis Cote

___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-21 Thread Mihai Limbasan

Brown, Daniel wrote:

Perhaps, when is the next release due?  I'd be interested to see the
differences, if an upgrade reduces memory overhead that significantly it
would be most excellent :)

3.6.2 is (tentatively) due this Monday :)
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
Thanks Dennis,

I just tried with rebuilt data from SQLite Analyzer in SQLite 3.6.1; I'm
still seeing memory usage that is roughly three times the size of the
source database file. Looking at your changes to my test, there doesn't
seem to be any fix that would resolve that.

I can see the memory being released when I close the SQLite database in
the teardown stage of my test, so I'm fairly sure the memory is being
used by SQLite, and the built-in memory profiling would seem to support
that.  I haven't had to make any changes locally to get the PC version
of 3.6.1 compiling, so I don't think that is the issue; could it be some
sort of configuration or library issue?  I'm building in Visual Studio
2005 SP1.

Daniel


-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Dennis Cote
Sent: Wednesday, August 20, 2008 3:45 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

Brown, Daniel wrote:
> I just upgraded to the latest version (3.6.1) of the pre-processed C
> source code from the website, running the test again gives me similar
> results of 22.2 MB used and 24.55 MB high water from the same 9 MB
file.
> Is there any way it could be the file that is causing the extra memory
> usage?  I'll try rebuilding it next.
> 

I just tried your test code with sqlite 3.6.1 amalgamation source on Win 
XP and I get the following output from a 17.5 MB (18,362,368 bytes) 
database file.

 Entries.db   Used: 18049 KB   High: 20357 KB

I had to make a few changes to your code to get it to run. I have copied 
the modified code below. This was built with Dev-Cpp using GCC 3.4.2.

There must be some other issue with your program that is causing the 
inflated memory usage you are seeing.

Dennis Cote




#include <cstdio>
#include <cstdlib>
#include "sqlite3.h"

using namespace std;

int main(int argc, char *argv[])
{
    sqlite3* pDataBase = NULL;
    const char* ptail = NULL;
    sqlite3_stmt* pstatement = NULL;
    int result = -1;
    int cmdSize = 0;
    const int cmdBufferSize = 1024;
    char cmdBuffer[cmdBufferSize];
    const char* pdatabaseName = "file_db";
    const char* pfilename = argv[1];

    sqlite3_open( ":memory:", &pDataBase );

    // create the attach command
    cmdSize = sprintf( cmdBuffer, "ATTACH DATABASE '%s' AS %s",
        pfilename, pdatabaseName );

    // attach the on-disk database with ATTACH filename.db AS filename
    result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );

    // You can enumerate all tables in your on-disk file by doing a
    // "SELECT tbl_name FROM filename.sqlite_master WHERE type = 'table'".
    cmdSize = sprintf( cmdBuffer,
        "SELECT tbl_name FROM %s.sqlite_master WHERE type = 'table'",
        pdatabaseName );

    // prepare the statement
    result = sqlite3_prepare_v2( pDataBase, cmdBuffer, cmdSize,
        &pstatement, &ptail );

    while( sqlite3_step( pstatement ) == SQLITE_ROW )
    {
        // Then do a CREATE TABLE tableName AS SELECT * FROM filename.tableName
        // on each table in the file, thus creating an in-memory copy of the DB
        // and having done a select on each table (i.e. you'll see how much
        // cache in memory will be used, etc.)

        // get the table name
        const unsigned char* pname = sqlite3_column_text( pstatement, 0 );

        // construct the command
        cmdSize = sprintf( cmdBuffer, "CREATE TABLE %s AS SELECT * FROM %s.%s",
            pname, pdatabaseName, pname );

        result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );
    }

    sqlite3_finalize( pstatement );

    // detach the attached database to leave just the in-memory database
    cmdSize = sprintf( cmdBuffer, "DETACH DATABASE %s", pdatabaseName );

    result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );

    sqlite_int64 memHigh = sqlite3_memory_highwater(0);
    sqlite_int64 memUsed = sqlite3_memory_used();

    printf( "%s   Used: %d KB   High: %d KB\n", pfilename,
        (int)(memUsed/1024), (int)(memHigh/1024) );

    sqlite3_close( pDataBase );

    system("PAUSE");
    return EXIT_SUCCESS;
}
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Dennis Cote
Brown, Daniel wrote:
> I just upgraded to the latest version (3.6.1) of the pre-processed C
> source code from the website, running the test again gives me similar
> results of 22.2 MB used and 24.55 MB high water from the same 9 MB file.
> Is there any way it could be the file that is causing the extra memory
> usage?  I'll try rebuilding it next.
> 

I just tried your test code with sqlite 3.6.1 amalgamation source on Win 
XP and I get the following output from a 17.5 MB (18,362,368 bytes) 
database file.

 Entries.db   Used: 18049 KB   High: 20357 KB

I had to make a few changes to your code to get it to run. I have copied 
the modified code below. This was built with Dev-Cpp using GCC 3.4.2.

There must be some other issue with your program that is causing the 
inflated memory usage you are seeing.

Dennis Cote




#include <cstdio>
#include <cstdlib>
#include "sqlite3.h"

using namespace std;

int main(int argc, char *argv[])
{
    sqlite3* pDataBase = NULL;
    const char* ptail = NULL;
    sqlite3_stmt* pstatement = NULL;
    int result = -1;
    int cmdSize = 0;
    const int cmdBufferSize = 1024;
    char cmdBuffer[cmdBufferSize];
    const char* pdatabaseName = "file_db";
    const char* pfilename = argv[1];

    sqlite3_open( ":memory:", &pDataBase );

    // create the attach command
    cmdSize = sprintf( cmdBuffer, "ATTACH DATABASE '%s' AS %s",
        pfilename, pdatabaseName );

    // attach the on-disk database with ATTACH filename.db AS filename
    result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );

    // You can enumerate all tables in your on-disk file by doing a
    // "SELECT tbl_name FROM filename.sqlite_master WHERE type = 'table'".
    cmdSize = sprintf( cmdBuffer,
        "SELECT tbl_name FROM %s.sqlite_master WHERE type = 'table'",
        pdatabaseName );

    // prepare the statement
    result = sqlite3_prepare_v2( pDataBase, cmdBuffer, cmdSize,
        &pstatement, &ptail );

    while( sqlite3_step( pstatement ) == SQLITE_ROW )
    {
        // Then do a CREATE TABLE tableName AS SELECT * FROM filename.tableName
        // on each table in the file, thus creating an in-memory copy of the DB
        // and having done a select on each table (i.e. you'll see how much
        // cache in memory will be used, etc.)

        // get the table name
        const unsigned char* pname = sqlite3_column_text( pstatement, 0 );

        // construct the command
        cmdSize = sprintf( cmdBuffer, "CREATE TABLE %s AS SELECT * FROM %s.%s",
            pname, pdatabaseName, pname );

        result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );
    }

    sqlite3_finalize( pstatement );

    // detach the attached database to leave just the in-memory database
    cmdSize = sprintf( cmdBuffer, "DETACH DATABASE %s", pdatabaseName );

    result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );

    sqlite_int64 memHigh = sqlite3_memory_highwater(0);
    sqlite_int64 memUsed = sqlite3_memory_used();

    printf( "%s   Used: %d KB   High: %d KB\n", pfilename,
        (int)(memUsed/1024), (int)(memHigh/1024) );

    sqlite3_close( pDataBase );

    system("PAUSE");
    return EXIT_SUCCESS;
}
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
Perhaps.  When is the next release due?  I'd be interested to see the
differences; if an upgrade reduces memory overhead that significantly, it
would be most excellent :)

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Wednesday, August 20, 2008 2:43 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 20, 2008, at 5:30 PM, Brown, Daniel wrote:

> I just upgraded to the latest version (3.6.1) of the pre-processed C
> source code from the website, running the test again gives me similar
> results of 22.2 MB used and 24.55 MB high water from the same 9 MB  
> file.
> Is there any way it could be the file that is causing the extra memory
> usage?  I'll try rebuilding it next.
>
>

My tests were based on CVS HEAD, which contains a newly rewritten page  
cache, and hence entirely new code for managing in-memory databases.   
Perhaps some kind of bug has been fixed since 3.6.1 that causes it to  
use less memory.



D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread D. Richard Hipp

On Aug 20, 2008, at 5:30 PM, Brown, Daniel wrote:

> I just upgraded to the latest version (3.6.1) of the pre-processed C
> source code from the website, running the test again gives me similar
> results of 22.2 MB used and 24.55 MB high water from the same 9 MB  
> file.
> Is there any way it could be the file that is causing the extra memory
> usage?  I'll try rebuilding it next.
>
>

My tests were based on CVS HEAD, which contains a newly rewritten page  
cache, and hence entirely new code for managing in-memory databases.   
Perhaps some kind of bug has been fixed since 3.6.1 that causes it to  
use less memory.



D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
I just upgraded to the latest version (3.6.1) of the pre-processed C
source code from the website; running the test again gives me similar
results of 22.2 MB used and 24.55 MB high water from the same 9 MB file.
Is there any way it could be the file that is causing the extra memory
usage?  I'll try rebuilding it next.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Brown, Daniel
Sent: Wednesday, August 20, 2008 1:50 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)

That is interesting, all that I've done to the source code locally is
added some C++ casts to get rid of compiler warnings, the C++ source I
posted earlier is the complete test I'm using, it is running on Windows
XP and its SQLite 3.5.1 from the pre-processed source code downloaded
from the site.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Wednesday, August 20, 2008 12:14 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 20, 2008, at 2:56 PM, Brown, Daniel wrote:

> Thank you, I imported our data from the source Excel file (.xls) using
> the third party SQLite Analyzer application
> (http://www.kraslabs.com/sqlite_analyzer.php) if that makes any
> difference?
>
> The size of the SQLite database on disc is 9,396,224 bytes so I was
> surprised when the memory usage ended up about three times that  
> amount,
> I assumed some sort of packing and unpacking could be going on but  
> from
> your comments I gather the memory usage is meant to be approximately  
> the
> size of the file on disc?
>

There is some extra overhead in memory.  But not 3x.  At least, not  
unless you are doing a big transaction or vacuuming the database or  
something like that.

I'm running experiments now.  My memory usage is about (1.15*disk +  
66184).  In other words, I'm seeing the in-memory database use about  
15% more space than the on-disk database.  I'm not sure what you are  
doing to get 3x memory usage.

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
That is interesting; all that I've done to the source code locally is
add some C++ casts to get rid of compiler warnings.  The C++ source I
posted earlier is the complete test I'm using; it is running on Windows
XP with SQLite 3.5.1 from the pre-processed source code downloaded
from the site.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Wednesday, August 20, 2008 12:14 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 20, 2008, at 2:56 PM, Brown, Daniel wrote:

> Thank you, I imported our data from the source Excel file (.xls) using
> the third party SQLite Analyzer application
> (http://www.kraslabs.com/sqlite_analyzer.php) if that makes any
> difference?
>
> The size of the SQLite database on disc is 9,396,224 bytes so I was
> surprised when the memory usage ended up about three times that  
> amount,
> I assumed some sort of packing and unpacking could be going on but  
> from
> your comments I gather the memory usage is meant to be approximately  
> the
> size of the file on disc?
>

There is some extra overhead in memory.  But not 3x.  At least, not  
unless you are doing a big transaction or vacuuming the database or  
something like that.

I'm running experiments now.  My memory usage is about (1.15*disk +  
66184).  In other words, I'm seeing the in-memory database use about  
15% more space than the on-disk database.  I'm not sure what you are  
doing to get 3x memory usage.

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread D. Richard Hipp

On Aug 20, 2008, at 2:56 PM, Brown, Daniel wrote:

> Thank you, I imported our data from the source Excel file (.xls) using
> the third party SQLite Analyzer application
> (http://www.kraslabs.com/sqlite_analyzer.php) if that makes any
> difference?
>
> The size of the SQLite database on disc is 9,396,224 bytes so I was
> surprised when the memory usage ended up about three times that  
> amount,
> I assumed some sort of packing and unpacking could be going on but  
> from
> your comments I gather the memory usage is meant to be approximately  
> the
> size of the file on disc?
>

There is some extra overhead in memory.  But not 3x.  At least, not  
unless you are doing a big transaction or vacuuming the database or  
something like that.

I'm running experiments now.  My memory usage is about (1.15*disk +  
66184).  In other words, I'm seeing the in-memory database use about  
15% more space than the on-disk database.  I'm not sure what you are  
doing to get 3x memory usage.

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
Thank you, I imported our data from the source Excel file (.xls) using
the third party SQLite Analyzer application
(http://www.kraslabs.com/sqlite_analyzer.php) if that makes any
difference?  

The size of the SQLite database on disc is 9,396,224 bytes, so I was
surprised when the memory usage ended up about three times that amount.
I assumed some sort of packing and unpacking could be going on, but from
your comments I gather the memory usage is meant to be approximately the
size of the file on disc?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Wednesday, August 20, 2008 11:44 AM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 20, 2008, at 2:22 PM, Brown, Daniel wrote:

> sqlite3_memory_highwater() ~ 25673060
> sqlite3_memory_used() ~ 23222709
>

OK.  I'll have a look


D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread D. Richard Hipp

On Aug 20, 2008, at 2:22 PM, Brown, Daniel wrote:

> sqlite3_memory_highwater() ~ 25673060
> sqlite3_memory_used() ~ 23222709
>

OK.  I'll have a look


D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Jeffrey Becker
Moreover, the memory usage reported by Process Explorer is subject to
the vagaries of the Windows memory allocator.  The memory usage you see
reported in such tools is generally quite a bit higher than the memory
your application actually requested.  First, Windows manages memory in
fixed-size chunks: if you ask for x bytes, Windows allocates n
contiguous chunks such that n*chunk_size >= x, and then reports roughly
n*chunk_size as your memory usage (if you use the right API).  Second,
most monitoring tools simply report the total size of an application's
"working set", which includes lots of pages of memory that are
potentially shared across many applications.
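
(For illustration, a minimal sketch assuming a Windows build linked
against psapi.lib and SQLite 3.6.x; the report_memory helper name is
made up for the example.  It prints SQLite's own accounting next to the
working set the OS reports, so you can see how far apart the two
numbers are.)

#include <windows.h>
#include <psapi.h>
#include <stdio.h>
#include "sqlite3.h"

// Print SQLite's own accounting next to the OS working set.
static void report_memory(void)
{
    PROCESS_MEMORY_COUNTERS pmc;
    if( GetProcessMemoryInfo(GetCurrentProcess(), &pmc, (DWORD)sizeof(pmc)) )
    {
        // The working set counts shared pages and allocator slack,
        // so it normally overstates what SQLite itself is holding.
        printf("working set:              %u KB\n",
               (unsigned)(pmc.WorkingSetSize / 1024));
    }
    printf("sqlite3_memory_used:      %d KB\n",
           (int)(sqlite3_memory_used() / 1024));
    printf("sqlite3_memory_highwater: %d KB\n",
           (int)(sqlite3_memory_highwater(0) / 1024));
}

int main(void)
{
    sqlite3* db = NULL;
    sqlite3_open(":memory:", &db);
    // ... load or attach data here, then compare the two numbers ...
    report_memory();
    sqlite3_close(db);
    return 0;
}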

On Wed, Aug 20, 2008 at 2:00 PM, D. Richard Hipp <[EMAIL PROTECTED]> wrote:
>
> On Aug 20, 2008, at 1:53 PM, Brown, Daniel wrote:
>
>> Looking in Process Explorer on XP after the disc database was
>> detached showed a memory size change of 28 MB of RAM in the test
>> application; I assumed this was the size of the database in memory.
>
>
> That would be the peak memory usage by the application.  It is not at
> all clear to me that SQLite was using all 28 MB.  What does
> sqlite3_memory_highwater() tell you?
>
>
> D. Richard Hipp
> [EMAIL PROTECTED]
>
>
>
> ___
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
sqlite3_memory_highwater() ~ 25673060
sqlite3_memory_used() ~ 23222709

I'm doing the following in C++ (I test all return codes but removed the tests 
to save bandwidth):

const char* ptail = NULL;
sqlite3_stmt* pstatement = NULL;
sqlite3* pDataBase = NULL;   // database handle (implied by the calls below)
int result = -1;
int cmdSize = 0;
const int cmdBufferSize = 1024;
char cmdBuffer[cmdBufferSize];

sqlite3_open( ":memory:", &pDataBase );

// create the attach command
cmdSize = sprintf( cmdBuffer, "ATTACH DATABASE '%s' AS %s", pfilename, pdatabaseName );

// attach the on-disk database with ATTACH filename.db AS filename
result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );

// You can enumerate all tables in your on-disk file in the mentioned scenario
// by doing a "SELECT tbl_name FROM filename.sqlite_master WHERE type = 'table'".
cmdSize = sprintf_s( cmdBuffer, "SELECT tbl_name FROM %s.sqlite_master WHERE type = 'table'", pdatabaseName );

// prepare the statement
result = sqlite3_prepare_v2( pDataBase, cmdBuffer, cmdSize, &pstatement, &ptail );

while( sqlite3_step( pstatement ) == SQLITE_ROW )
{
    // Then do a CREATE TABLE tableName AS SELECT * FROM filename.tableName
    // on each table in the file, thus creating an in-memory copy of the DB
    // and having done a select on each table (i.e. you'll see how much
    // cache in memory will be used, etc.)

    // get the table name
    const unsigned char* pname = sqlite3_column_text( pstatement, 0 );

    // construct the command
    cmdSize = sprintf( cmdBuffer, "CREATE TABLE %s AS SELECT * FROM %s.%s", pname, pdatabaseName, pname );

    result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );
}

sqlite3_finalize( pstatement );

// detach the attached database to leave just the in-memory database
cmdSize = sprintf( cmdBuffer, "DETACH DATABASE %s", pdatabaseName );

result = sqlite3_exec( pDataBase, cmdBuffer, NULL, NULL, NULL );

sqlite_int64 memHigh = sqlite3_memory_highwater(0);
sqlite_int64 memUsed = sqlite3_memory_used();

// sqlite_int64 does not match %d, so cast the KB figures for the report
printf( "%s %d KB High %d KB", pfilename, (int)(memUsed/1024), (int)(memHigh/1024) );

sqlite3_close( pDataBase );

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Wednesday, August 20, 2008 11:00 AM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 20, 2008, at 1:53 PM, Brown, Daniel wrote:

> Looking in Process Explorer on XP after the disc database was detached
> showed a memory size change of 28 MB of RAM in the test application; I
> assumed this was the size of the database in memory.


That would be the peak memory usage by the application.  It is not at  
all clear to me that SQLite was using all 28 MB.  What does  
sqlite3_memory_highwater() tell you?


D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread D. Richard Hipp

On Aug 20, 2008, at 1:53 PM, Brown, Daniel wrote:

> Looking in Process Explorer on XP after the disc database was detached
> showed a memory size change of 28 MB of RAM in the test application; I
> assumed this was the size of the database in memory.


That would be the peak memory usage by the application.  It is not at  
all clear to me that SQLite was using all 28 MB.  What does  
sqlite3_memory_highwater() tell you?


D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
Looking in Process Explorer on XP after the disc database was detached
showed a memory size change of 28 MB of RAM in the test application; I
assumed this was the size of the database in memory.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Wednesday, August 20, 2008 10:44 AM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 19, 2008, at 6:24 PM, Brown, Daniel wrote:

> SQLite is very memory
> intensive compared to our current solution (although SQLite is faster
> and more feature rich), e.g. 9MB for our current solution versus 28 MB
> for SQLite with the same source data.


Where did you get the 28MB figure?  The sqlite3_analyzer output you  
posted tells me that the total database size is a little over 9MB, not  
28MB.

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread D. Richard Hipp

On Aug 19, 2008, at 6:24 PM, Brown, Daniel wrote:

> SQLite is very memory
> intensive compared to our current solution (although SQLite is faster
> and more feature rich), e.g. 9MB for our current solution versus 28 MB
> for SQLite with the same source data.


Where did you get the 28MB figure?  The sqlite3_analyzer output you  
posted tells me that the total database size is a little over 9MB, not  
28MB.

D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-20 Thread Brown, Daniel
[sqlite3_analyzer output, truncated in the archive; the surviving tail
of the report:]

Unused bytes on primary pages............. 1391        19.4%
Unused bytes on overflow pages............ 0
Unused bytes on all pages................. 1391        19.4%



-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of D. Richard Hipp
Sent: Tuesday, August 19, 2008 3:43 PM
To: General Discussion of SQLite Database
Subject: Re: [sqlite] Reducing SQLite Memory footprint(!)


On Aug 19, 2008, at 6:24 PM, Brown, Daniel wrote:

> Good afternoon list,
>
> Are there any known techniques for reducing SQLite's memory footprint?

Construct your database into a file (say "test.db").  Then run the  
sqlite3_analyzer utility (available for download on the SQLite  
website) over that file and post the results.  The sqlite3_analyzer  
utility will give us additional information that might suggest ways of  
reducing the size of the database file.

See also http://www.hwaci.com/sw/sqlite/prosupport.html#compress


D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


Re: [sqlite] Reducing SQLite Memory footprint(!)

2008-08-19 Thread D. Richard Hipp

On Aug 19, 2008, at 6:24 PM, Brown, Daniel wrote:

> Good afternoon list,
>
> Are there any known techniques for reducing SQLite's memory footprint?

Construct your database into a file (say "test.db").  Then run the  
sqlite3_analyzer utility (available for download on the SQLite  
website) over that file and post the results.  The sqlite3_analyzer  
utility will give us additional information that might suggest ways of  
reducing the size of the database file.

See also http://www.hwaci.com/sw/sqlite/prosupport.html#compress
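
(For example, assuming the utility is on your PATH and your database is
named test.db, you would typically run "sqlite3_analyzer test.db >
analysis.txt" from a command prompt and post the contents of
analysis.txt; the exact invocation may vary by platform.)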


D. Richard Hipp
[EMAIL PROTECTED]



___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users


[sqlite] Reducing SQLite Memory footprint(!)

2008-08-19 Thread Brown, Daniel
Good afternoon list,

Are there any known techniques for reducing SQLite's memory footprint?
I am currently evaluating SQLite as a possible replacement for our
current proprietary database solution.  However, SQLite is very memory
intensive compared to our current solution (although SQLite is faster
and more feature rich), e.g. 9 MB for our current solution versus 28 MB
for SQLite with the same source data.  Due to our platform's
limitations we need to be able to run our database solution completely
in RAM; in our worst-case environment we don't have any writable
storage available, which prevents the use of SQLite's paging facility
except for read-only tables.  Our current solution achieves its small
memory footprint by bit packing: we specify the minimum and maximum
values for each column, and the system then uses the smallest number of
bits possible to represent integer, bit and float values; strings are
the exception, which we store as UTF-8.
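
(A minimal sketch of that min/max-driven bit packing for integer
columns, for illustration only; the helper names are made up and this
is not our actual engine.)

#include <stdint.h>
#include <stddef.h>

// Number of bits needed to store any value in [minVal, maxVal].
static unsigned bits_for_range(int64_t minVal, int64_t maxVal)
{
    uint64_t range = (uint64_t)(maxVal - minVal);
    unsigned bits = 0;
    while( range ){ ++bits; range >>= 1; }
    return bits ? bits : 1;   // at least one bit, even for a constant column
}

// Append the low 'bits' bits of (value - minVal) to a zero-initialised
// packed buffer, advancing the running bit position.
static void pack_value(uint8_t* buf, size_t* bitPos,
                       int64_t value, int64_t minVal, unsigned bits)
{
    uint64_t v = (uint64_t)(value - minVal);
    for( unsigned i = 0; i < bits; ++i, ++(*bitPos) )
    {
        if( (v >> i) & 1u )
            buf[*bitPos / 8] |= (uint8_t)(1u << (*bitPos % 8));
    }
}

For example, a column declared with a minimum of 0 and a maximum of 100
needs only bits_for_range(0, 100) == 7 bits per value instead of 32.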

So are there any existing techniques I can leverage to reduce the
memory footprint of SQLite?  Ideally I'd like to be able to stay inside
our original memory footprint of 9 MB, which I think could be
achievable with a combination of paging our read-only tables from disc
and keeping our writable tables 100% in memory, with some sort of
compression applied.
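
(Again a sketch only, assuming SQLite 3.5 or later and that the
read-only data ships as a prebuilt file; the file name, cache size and
table are placeholders for the idea of paging read-only tables from
disc while keeping writable tables in memory.)

#include "sqlite3.h"

// Open the read-only tables from a file (paged on demand with a small
// cache) and keep the writable tables in an attached in-memory database.
int open_hybrid(sqlite3** pdb)
{
    int rc = sqlite3_open_v2("static.db", pdb, SQLITE_OPEN_READONLY, NULL);
    if( rc != SQLITE_OK ) return rc;

    // Limit the page cache for the read-only file (value is in pages).
    sqlite3_exec(*pdb, "PRAGMA cache_size=100;", NULL, NULL, NULL);

    // Writable tables live in an attached in-memory database.
    rc = sqlite3_exec(*pdb, "ATTACH DATABASE ':memory:' AS mem;",
                      NULL, NULL, NULL);
    if( rc != SQLITE_OK ) return rc;

    return sqlite3_exec(*pdb,
        "CREATE TABLE mem.scratch(id INTEGER PRIMARY KEY, value INTEGER);",
        NULL, NULL, NULL);
}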

Cheers,

Daniel Brown | Software Engineer
"The best laid schemes o' mice an' men, gang aft agley"


___
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users