Hello,

I'm currently using SQLite3 in my multi-threaded software.

I have tried several ways of dealing with my issue; however, I've come to the
conclusion that there must be some trick I haven't been told about.

Here's the situation: there's a large amount of input coming from the software
to the database, all client-side. This input must be reflected in the
database. However, many transactions are lost. Why? Well, it seems that
either the database or its journal (transaction) file was locked.

Now, with my multi-threaded approach, I used to open and close the database to
get a new handle for each transaction... but since everything happens so fast,
the database ends up locked anyway.



How on earth can I solve this problem?

I thought about using a scheduler: a small subsystem that would queue the
incoming data and write it to the database slowly. But that isn't a cheap
idea (resource- and processing-wise), and it's not elegant... at all.
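What I have in mind, roughly: a dedicated writer thread draining a queue, so only one handle ever writes. A sketch (not my real code; names are made up):

```python
import queue
import sqlite3
import threading

def writer_loop(db_path, q):
    # Single writer thread: every INSERT funnels through one connection,
    # so the producer threads never contend for the database lock themselves.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS events (value TEXT)")
    conn.commit()
    while True:
        item = q.get()
        if item is None:      # sentinel: shut down cleanly
            break
        with conn:            # one small transaction per queued item
            conn.execute("INSERT INTO events (value) VALUES (?)", (item,))
    conn.close()

# Producer threads only enqueue; they never touch SQLite directly:
#   q.put("some value")
```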


What should I do? I've nearly given up... I've tried a lot of things.

I've been using transactions from the beginning. I also tried opening and
closing transactions every X seconds (5 or 10) to prevent the database from
locking, but the transaction file seems to get locked anyway when many
queries arrive within a very short period of time.
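The periodic-commit variant looks roughly like this (again just a sketch; in the real code the rows arrive from other threads rather than a list):

```python
import sqlite3
import time

def batch_writer(db_path, rows, commit_interval=5.0):
    # Keep one transaction open and COMMIT every `commit_interval` seconds,
    # instead of committing once per row. isolation_level=None puts the
    # connection in autocommit mode so we can manage BEGIN/COMMIT ourselves.
    conn = sqlite3.connect(db_path, isolation_level=None)
    conn.execute("CREATE TABLE IF NOT EXISTS events (value TEXT)")
    conn.execute("BEGIN")
    last_commit = time.monotonic()
    for value in rows:
        conn.execute("INSERT INTO events (value) VALUES (?)", (value,))
        if time.monotonic() - last_commit >= commit_interval:
            conn.commit()          # release the write lock briefly
            conn.execute("BEGIN")  # start the next batch
            last_commit = time.monotonic()
    conn.commit()
    conn.close()
```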

Should I go with the queue after all? (It still feels totally impractical.)
Or are there better methods out there for dealing with all this?


Thanks in advance to everyone who is willing and able to help me out.




Gusso
