Why don't you send us some code?  It sounds like you might have an
issue managing your connections.

On Sat, Jan 31, 2009 at 3:09 PM, Boris Arloff <boris.arl...@yahoo.com> wrote:
> Hi,
>
> I am having a small problem with an sqlite3 v3.5.6 database being accessed 
> from within python 2.5 (import sqlite3 as sqlite).
>
> The database has been working correctly with the exception of the following 
> issue:
> There are 17 tables, the first being a dataset index table with 
> three columns (index, name, datetimestamp).  Each of the other tables has various 
> columns, one of which stores the index reference into this first 
> table, so we can access different datasets.
>
> If we create several datasets each with an incremental index (1,2,3, ...n) in 
> the first table, we can then delete the last dataset n and recreate a new one 
> with same name and index number; no problem.
>
> The problem results when we delete a dataset less than n (the last created).  
> When any dataset 1 through n-1 is deleted and we attempt to re-enter its 
> information, the following happens:
>
> 1. Delete all records from all tables where the dataset id is, let's say, 3 
> (with n > 3).
> 2. Then insert all data related to dataset id 3.
> 3. The dataset index table entry is successful, with index=3, the name set to 
> the original name, and a new datetimestamp.
> The second table to be updated gets its first record inserted correctly, 
> corresponding to this dataset.
> The third and subsequent tables, however, fail to accept the insert, and sqlite 
> raises an sqlite3.OperationalError: "database is locked".
>
> This operation using the last set of data entered does not report this same 
> error and completes correctly.  For example if we had only three datasets in 
> the case above, then it would have completed successfully.  As a special case 
> if we only have one single dataset (i.e. n=1), then we can repeat the 
> operation successfully until we drop dead.  We can insert all the original 
> data in every table, delete the data, then recreate the same data, and 
> repeat.  As soon as a second dataset is created, we can repeat the process 
> with records that pertain to dataset 2 only; any attempt to do the same to 
> dataset 1 causes the "database is locked" error.
>
> Can anyone help with this?  It seems to me that sqlite corrupts the 
> index when it is deleted, so perhaps we cannot reuse the same index number.
>
> Thanks,
> Boris
>
>
>
> _______________________________________________
> sqlite-users mailing list
> sqlite-users@sqlite.org
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>