On Mar 3, 2010, at 12:57 PM, Collin Capano wrote:

> Hello SQLite users,
>
> I've been running into some disk I/O errors when doing things such as
> vacuuming and/or inserting into temp tables in a database. The
> databases giving me trouble are quite large: between 29 and 55 GB.
> However, large as that is, I don't think running out of disk space is
> the issue, as I have about 3 TB of free space on the disk. So, my
> question is: is there a maximum size that a database can be? If so,
> what is the limiting factor? The databases in question don't seem to
> be corrupt; I can open them on the command line and in Python
> programs (using pysqlite) and can read triggers from them just fine.
> It's just when I try to vacuum and create temp tables that I run
> into trouble.

Perhaps you are running out of space on the /tmp partition. VACUUM
rebuilds the entire database in a temporary file, so it needs free
space wherever SQLite writes its temp files, not just on the volume
holding the database. See pragma temp_store_directory:

   http://www.sqlite.org/pragma.html#pragma_temp_store_directory
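A minimal sketch in Python (via the stdlib sqlite3 module, which
pysqlite became): issue the pragma on the same connection before the
VACUUM, pointing it at a directory on the roomy volume. Here a scratch
directory stands in for that directory; substitute your own path. Note
that temp_store_directory is deprecated in newer SQLite releases, and
the SQLITE_TMPDIR environment variable is an alternative on Unix.

```python
import os
import sqlite3
import tempfile

# Stand-in for a directory on the volume with plenty of free space.
tmpdir = tempfile.mkdtemp()
db_path = os.path.join(tmpdir, "example.db")

# Build a small database to vacuum.
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE t(x)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
conn.commit()

# Redirect SQLite's temporary files (deprecated pragma, still honored
# in standard builds), then vacuum on the same connection.
conn.execute(f"PRAGMA temp_store_directory = '{tmpdir}'")
conn.execute("VACUUM")
conn.close()
```

For a very large database, the directory named in the pragma must have
roughly as much free space as the database itself, since VACUUM copies
the whole database into a temp file before swapping it in.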

Dan.

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users