Hello SQLite users,

I've been running into disk I/O errors when performing operations such as 
vacuuming or inserting into temp tables in a database. The databases 
giving me trouble are quite large: between 29 and 55 GB. However, as 
large as that is, I don't think running out of disk space is the issue, 
since I have about 3 TB of free space on the disk. So my question is: is 
there a maximum size that a database can be? If so, what is the limiting 
factor? The databases in question don't seem to be corrupt; I can open 
them on the command line and in Python programs (using pysqlite) and can 
read triggers from them just fine. It's only when I try to vacuum or 
create temp tables that I run into trouble.
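The failing operations boil down to something like the following (a minimal sketch using Python's sqlite3 module; the file name, table name, and data are placeholders, not my actual schema):

```python
import sqlite3

# Open the database; "large.db" is a placeholder path, not the
# actual 29-55 GB file.
conn = sqlite3.connect("large.db")

# These are the two kinds of operation that raise "disk I/O error"
# on the large databases: populating a temp table, and VACUUM.
conn.execute("CREATE TEMP TABLE scratch (id INTEGER, value TEXT)")
conn.execute("INSERT INTO scratch VALUES (1, 'example')")
conn.commit()  # VACUUM cannot run inside an open transaction

conn.execute("VACUUM")

# Sanity check: the temp table is still readable afterwards.
rows = conn.execute("SELECT value FROM scratch").fetchall()
conn.close()
```

On the small test databases I've tried, this runs cleanly; it's only on the 29-55 GB files that the VACUUM and temp-table steps fail.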

If you need to know, I am running SQLite version 3.5.9 on CentOS 5.3.

Thanks,
Collin Capano
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users