SQLite bug report
 
Summary:
--------------

Error message:

       Error: near line 2: database or disk is full

The error occurs even with plenty of free disk space and with an 'unlimited' 
maximum database size. It does not happen on all systems, and it does not 
happen on small databases.

 
Details:
----------


The error occurs on 
- windows server 2003 SP2 64-bit, windows server 2008 64-bit
- windows vista, windows 7 RC1, windows 7 - all 64-bit

The error does _not_ occur on 
- windows server 2003 SP2 32-bit, windows xp SP3 32-bit

SQLite3 versions affected: all versions released in 2009, and possibly all 
earlier ones, too.


The bug was 'distilled' into the reproduction described below. 



Reproduction
------------------

We (1) create a table in the database, (2) create an input file, and (3) import 
the input file into the table multiple times.

Using the files attached below, the error occurred during the third/fourth 
import, at a database size of around 22-28 GB.



Additional information and observations
----------------------------------------------------

1. This is not a new bug - it's been around for the last year, probably two - 
but it was difficult to spot and isolate.

2. The problem is 'touchy' - with a different random seed in the generated 
data, the problem may occur at a 45 GB database size instead of 20-30 GB. When 
I tried to use, as input data, the same line repeated millions of times, the 
problem did not occur at all - I aborted those tests at 160-260 GB database 
size.

3. The problem does _not_ depend _purely_ on the data being imported. The 
reproduction script loads the same data set multiple times and fails on the 
third/fourth pass - so it is the database size that triggers it.

4. When the input file is smaller (1 GB instead of 8 GB), the problem still 
occurs in the 20-30 GB database size range - so there is nothing magical about 
the number of imports.

5. The test database here is created with pragma page_size = 32768. The same 
error message occurs for other page sizes and at various cache sizes. 
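
For reference, a minimal sketch of the kind of pragma variations tried in 
prepdb.sql (the alternate values here are illustrative, not an exhaustive 
list):

       pragma page_size=4096;           -- also tried e.g. 1024 and 32768
       pragma default_cache_size=2000;  -- and larger caches, e.g. 10000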

6. In some other tests (not using the scripts here, with different data) the 
import succeeded (about 20 GB database size), but an attempt to create an 
index on the imported data produced the same error diagnostic - after about 
30 minutes of running.
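
For example, a statement along these lines (the index definition is 
illustrative - those tests used different data and columns):

       CREATE INDEX idx_large_sdatetime ON large (SDATETIME);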

7. This is not a problem in the sqlite3 shell alone - when using the .NET 
wrapper for SQLite, or any of the admin tools, the problem occurs in the same 
size range, even though these tools do not report the message the same way 
[and after using such tools I sometimes get 'database disk image is 
malformed'].
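
A quick way to confirm such corruption afterwards is SQLite's built-in check 
(a standard pragma, independent of any of these tools); it reports 'ok' only 
for an intact database:

       pragma integrity_check;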

 
Scripts to reproduce the bug
--------------------------------------

 
++++++++++++++++++++++++++++++++++++++++++++++
====== awk program to produce the input data for the table - save as: t1b.w
BEGIN {
    srand(13)  # fixed seed - assures all tests have the same data
    for (i = 0; i < COUNT; i++) {
        str = ""+i "," strftime("%Y-%m-%d %H:%M:%S") \
              "," strftime("%Y-%m-%d %H:%M:%S") \
              "," rand()*10 "," rand()*100 "," rand()*1000 "," rand()*10000 \
              "," rand()*1000000 "," rand()*10 "," rand()*100 "," rand()*1000 \
              "," rand()*10000 "," rand()*100000 "," rand()*1000000 \
              "," rand() "," rand()*100 "," rand()*1000 "," rand()
        print str
    } # END FOR
} # END BEGIN

++++++++++++++++++++++++++++++++++++++++++++++
====== sql script to prepare database - save as: prepdb.sql
pragma page_size=32768;
pragma default_cache_size=10000;

CREATE TABLE [large] (
ID integer NOT NULL,
SDATETIME datetime NOT NULL,
PREVSDATETIME datetime,
var1 real,
var2 real,
var3 real,
var4 real,
var5 real,
var6 real,
var7 real,
var8 real,
var9 real,
var10 real,
var11 real,
var12 real,
var13 real,
var14 real,
var15 real
);

++++++++++++++++++++++++++++++++++++++++++++++
====== sqlite3 script to import the data file - save as: inpb.sql
.separator ,
.import inpb.txt large

++++++++++++++++++++++++++++++++++++++++++++++
====== dos cmd script to run the test, using the files above
echo ---------- prepare database
del tstb.db
rem prepare database with table schema
c:\apps\sqlite3 tstb.db < prepdb.sql

 
time /T
echo ---------- prepare input file with sufficient number of csv records [par COUNT], 50m records give about 8gb data
c:\apps\gawk -f t1b.w -v COUNT=50000000 > inpb.txt
 
echo --------- import the data set into the table up to 10 times
time /T
for %%i in (1,2,3,4,5,6,7,8,9,10) do c:\apps\sqlite3 tstb.db <inpb.sql
time /T
