Could you provide a little more clarification on what you're doing?

Are you trying to use the CLI *while* the Python script is doing inserts?

When you try to do an update or delete with the CLI does it hang and not 
complete, or does it happily continue on and let you keep going?

If you do ".changes on" before running the query, does the reported change 
count increase?
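(If the CLI isn't handy, you can make the same check from Python with the stdlib sqlite3 module. A minimal sketch, using an in-memory database as a stand-in for your real file; table and column names here are made up for illustration:)

```python
import sqlite3

# In-memory database standing in for the real 289GB file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO t (val) VALUES ('a')")
conn.execute("INSERT INTO t (val) VALUES ('b')")
conn.commit()

cur = conn.execute("UPDATE t SET val = 'z' WHERE id = 1")
print(cur.rowcount)        # rows changed by this one statement -> 1
print(conn.total_changes)  # rows inserted/updated/deleted on this connection -> 3
conn.commit()
conn.close()
```

If rowcount comes back 0, the WHERE clause matched nothing, which would also explain "no change, no error".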

If there's no sensitive data in the schema could you share the schema and/or 
copy the screen text from the CLI with an example?
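(For what it's worth, here's a small sketch of the hang/lock scenario I'm asking about: if the Python script is holding an open write transaction, a second connection's delete gets SQLITE_BUSY instead of silently succeeding. This is just an illustration with the stdlib sqlite3 module, not a diagnosis of your setup:)

```python
import sqlite3, tempfile, os

# Temporary file database; two connections simulate the Python
# script and the CLI.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

# isolation_level=None puts the module in autocommit mode so we
# can manage the transaction explicitly.
writer = sqlite3.connect(path, isolation_level=None)
writer.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
writer.execute("INSERT INTO t (val) VALUES ('a')")

# The "Python script" starts a write transaction and doesn't commit yet.
writer.execute("BEGIN IMMEDIATE")
writer.execute("INSERT INTO t (val) VALUES ('b')")

# The "CLI" now tries to delete; with a short busy timeout it fails
# with "database is locked" rather than quietly doing nothing.
cli = sqlite3.connect(path, timeout=0.1)
try:
    cli.execute("DELETE FROM t")
except sqlite3.OperationalError as e:
    print(e)  # database is locked

writer.execute("ROLLBACK")
writer.close()
cli.close()
```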

-----Original Message-----
From: sqlite-users [] On 
Behalf Of Fiona
Sent: Thursday, October 12, 2017 5:41 AM
Subject: [sqlite] Sqlite3.6 Command-line delete/update not working with large 
db file(>280GB)

Here are the specifics of my problem:
Ubuntu, SQLite 3.6.20

I have only two tables, each with a primary key and an index. I use Python
code to insert into and update these two tables; one table has a column
holding large blobs. Now I have a db file about 289GB in size. When I
update/delete from the command line, the data is not changed/deleted at all,
and no error is ever returned, while insert still works.

I looked through the SQLite limits documentation, and it says there is
practically no limit on the size of a db file, given that you have enough
disk space. So please help me: where can I look to solve this?

Thanks a lot!

sqlite-users mailing list