Re: [sqlite] Can write to database using API but get an error on writing with sqlite3.exe
On Thu, 27 Sep 2018 at 00:59, Luiz Américo wrote:
>
> On Thu, 27 Sep 2018 at 00:30, Keith Medcalf wrote:
>
>> Have you checked to make sure the ID that you are using (interactively)
>> is permitted read/write access to the directory containing the db files and
>> to the files themselves? I mean *actually* checked that you have
>> permission, since I doubt that the CGI process is running with the same ID
>> you are using to log on.
>
> Good call. I will check.

That was the problem. I executed the external app with admin privileges and it worked.

Many thanks

Luiz

___
sqlite-users mailing list
sqlite-users@mailinglists.sqlite.org
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
Re: [sqlite] Can write to database using API but get an error on writing with sqlite3.exe
On Thu, 27 Sep 2018 at 00:30, Keith Medcalf wrote:
>
> Have you checked to make sure the ID that you are using (interactively) is
> permitted read/write access to the directory containing the db files and to
> the files themselves? I mean *actually* checked that you have permission,
> since I doubt that the CGI process is running with the same ID you are
> using to log on.
>

Good call. I will check, but I can write to another database file located in the same directory with external tools. The difference is that this second db is a lot smaller, is not accessed frequently, and doesn't have WAL/SHM files (I'm not sure what its locking mode is).

Luiz
[sqlite] Can write to database using API but get an error on writing with sqlite3.exe
I have a 1 GB database configured to use WAL mode. It's accessed by short-lived CGI processes, under Windows 2008 Server. Each process usually opens the database, executes a select or update/insert, closes the connection, and finishes. It uses the C API (sqlite3_open, sqlite3_prepare, sqlite3_step) of the DLL downloaded from sqlite.org. All operations are working fine (reading and writing to the database). These processes are started at roughly 1 to 5 per second, and there are ~5 MB WAL and SHM files that are updated frequently.

When I open the same database with sqlite3.exe (or an external tool like SQLite Expert) I can run read commands (SELECT), but when I try to write to the database (UPDATE / ALTER TABLE) it gives an "attempt to write a readonly database" error. The tools are executed on the same machine as the data file.

If I try to open a (smaller, less accessed) database in the same folder as the main one, all writing operations are allowed in the external tools.

Any hints on what's going on?

Luiz
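The symptom is easy to reproduce outside the original setup. Here is a minimal sketch (Python's stdlib sqlite3 module, not the poster's C/CGI environment): when a connection lacks write access to the database, SELECTs still work but any write fails with exactly this error. Below the missing-permission condition is simulated with a read-only URI open rather than Windows ACLs.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

# Create a small database in WAL mode, as in the original report.
con = sqlite3.connect(path)
con.execute("PRAGMA journal_mode=WAL")
con.execute("CREATE TABLE t(id INTEGER PRIMARY KEY, v TEXT)")
con.execute("INSERT INTO t(v) VALUES ('x')")
con.commit()
con.close()

# A read-only connection can SELECT, but any write fails.
ro = sqlite3.connect("file:" + path + "?mode=ro", uri=True)
value = ro.execute("SELECT v FROM t").fetchone()[0]  # reads work fine
err = ""
try:
    ro.execute("UPDATE t SET v = 'y'")
except sqlite3.OperationalError as e:
    err = str(e)  # "attempt to write a readonly database"
ro.close()
print(value, "/", err)
```

The same error surfaces when the OS denies write access to the db file, its directory, or the -wal/-shm companion files, which is why the interactive tools failed while the CGI account succeeded.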
Re: [sqlite] Mailing list shutting down...
How about using https://www.discourse.org/ ? Open source projects can use it for free.

Luiz

On Wed, 13 Jun 2018 at 14:37, John Long wrote:
> On Wed, 2018-06-13 at 21:42 +1000, Gary R. Schmidt wrote:
>>
>> This is an increasing problem, and has been discussed on the Mailman
>> mailing list recently; you should join them and see what mitigation
>> strategies are available.
>
> Well I'm sure he would like to, but subscriptions have probably been
> suspended because of the attacks ;)
[sqlite] [AGAIN] SQLite on network share
On 13/11/2015 15:29, "A. Mannini" wrote:
>
> Hi,
>
> I read the SQLite FAQ and understood that use of SQLite on a network share
> CAN corrupt the database file.
> For me, it isn't clear if there is a way to safely use SQLite on a
> network share in contexts with few clients (max 5, for example) and low
> read/write concurrency.

Disclaimer: I'm well aware of the warnings about network shares, and that it is a bad design. That said, I have two systems running with a sqlite db shared on a Windows-only network, running for almost 5 years. No issues related to the network until now. The apps are mostly reads and the db file size is around 20 MB. I took the precaution of not updating the sqlite DLL, to avoid different versions accessing the same db.

Luiz

> Thanks
> Alessandro
[sqlite] Handling the whole select query from an index
On 19/09/2015 11:13, "Rob Willett" wrote:
>
> Ryan,
>
> Thanks. We've got 100 requests a second which may be enough. We'll keep
> looking though for any time.
>

Have you considered implementing the service using a compiled language, configured to run through FastCGI? I'm sure it can beat your current performance, and such a simple service can be done easily even if you don't master the language. There are lots of resources on the net on how to do it with C, C++ or even Go.

On my side, I build web services using FreePascal with sqlite as the db. They run in Apache through CGI, with FastCGI as an option. A service like yours can be implemented in a few minutes.

Luiz

> Rob.
>
> On 18 Sep 2015, at 18:26, R.Smith wrote:
>
>> >>> Rob: "We want to do postal code ==> GPS..."
>> >> Me: "You can use google apis..."
>> > Rob: "Our business is GPS and GIS traffic data"
>>
>> Oops yes, that's a whole nother kettle of fish then. To return to some of
>> the older parts of the conversation, I think the SQL route is best
>> (whichever engine is chosen) - I am sure you will need to expand in
>> future. I hope you get the PERL latency sorted out.
>>
>> Best of luck!
>> Ryan
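Whatever the language, the per-request shape of such a lookup service is small. The sketch below is illustrative only (Python here for brevity, though the advice above is about compiled languages): open the db, run one parameterized lookup, close. The table and names (postcodes, handle_request) are hypothetical, not from the thread.

```python
import os
import sqlite3
import tempfile

def handle_request(db_path, postcode):
    # One request: open, query, close -- the CGI lifecycle.
    con = sqlite3.connect(db_path)
    try:
        # Returns (lat, lon), or None if the postcode is unknown.
        return con.execute(
            "SELECT lat, lon FROM postcodes WHERE code = ?", (postcode,)
        ).fetchone()
    finally:
        con.close()

# Tiny demo database standing in for the real postcode data.
path = os.path.join(tempfile.mkdtemp(), "geo.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE postcodes(code TEXT PRIMARY KEY, lat REAL, lon REAL)")
con.execute("INSERT INTO postcodes VALUES ('AB1 2CD', 57.1, -2.1)")
con.commit()
con.close()
print(handle_request(path, "AB1 2CD"))
```

With FastCGI the process (and optionally the connection) stays alive between requests, which removes the per-request startup cost that dominates a plain CGI deployment.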
[sqlite] First test of json and index expressions, not so good
On 04/09/2015 17:09, "Domingo Alvarez Duarte" wrote:
>
> Hello !
>
> This is my first test with json and index expressions !
>
> How to make usage of the expression index on queries ?
>
> Why is json_extract so slow ?

Just for information: is json_extract from https://github.com/groner/sqlite-json ?

Luiz

> Cheers !
>
> Output
>
> Time to insert	5000	0.032227	records by second =	155149
> count=	4999
> json=	the_value_1
> 0	0	0	SCAN TABLE json_tbl
> Time to select raw	5000	0.00244	records by second =	2.04918e+06
> Time to select json_extrat no index	5000	8.12196	records by second =	615.615
> Time to create index	5000	0.00605	records by second =	826446
> 0	0	0	SCAN TABLE json_tbl
> Time to select json_extrat indexed	5000	7.38144	records by second =	677.375
>
> Program
>
> local max_count = 5000;
> local sql = "CREATE TABLE IF NOT EXISTS json_tbl(id integer primary key, json text collate nocase);";
>
> local db = SQLite3(":memory:");
> db.exec_dml(sql);
>
> local stmt = db.prepare("insert into json_tbl(json) values(?);");
>
> local start = os.clock();
>
> db.exec_dml("begin;");
> for(local i=1; i < max_count; ++i)
> {
>     stmt.bind(1, format([==[{"id" : %d, "value" : "the_value_%d"}]==], i, i));
>     stmt.step();
>     stmt.reset();
> }
> stmt.finalize();
> db.exec_dml("commit;");
>
> local time_spent = os.clock() - start;
> print("Time to insert", max_count, time_spent, "records by second = ", max_count/time_spent);
>
> print("count=", db.exec_get_one("select count(*) from json_tbl"));
> print("json=", db.exec_get_one("select json_extract(json, '$.value') AS val from json_tbl where val = 'the_value_1';"));
>
> sql = "select json_extract(json, '$.value') AS val from json_tbl where val = ?;";
>
> local showPlan = function()
> {
>     stmt = db.prepare("explain query plan " + sql);
>     while(stmt.next_row())
>     {
>         local line = "";
>         for(local i=0, len = stmt.col_count(); i < len; ++i)
>         {
>             line += "\t" + stmt.col(i);
>         }
>         print(line);
>     }
>     stmt.finalize();
> }
>
> showPlan();
>
> start = os.clock();
> stmt = db.prepare("select * from json_tbl");
> while(stmt.next_row())
> {
> }
> stmt.finalize();
>
> time_spent = os.clock() - start;
> print("Time to select raw", max_count, time_spent, "records by second = ", max_count/time_spent);
>
> start = os.clock();
> stmt = db.prepare(sql);
> for(local i=1; i < max_count; ++i)
> {
>     stmt.bind(1, format("the_value_%d", i));
>     stmt.step();
>     //print(stmt.col(0));
>     stmt.reset();
> }
> stmt.finalize();
>
> time_spent = os.clock() - start;
> print("Time to select json_extrat no index", max_count, time_spent, "records by second = ", max_count/time_spent);
>
> start = os.clock();
> db.exec_dml("create index json_tbl_idx on json_tbl(json_extract(json, '$.value'));");
> time_spent = os.clock() - start;
> print("Time to create index", max_count, time_spent, "records by second = ", max_count/time_spent);
>
> showPlan();
>
> start = os.clock();
> stmt = db.prepare(sql);
> for(local i=1; i < max_count; ++i)
> {
>     stmt.bind(1, format("the_value_%d", i));
>     stmt.step();
>     //print(stmt.col(0));
>     stmt.reset();
> }
> stmt.finalize();
>
> time_spent = os.clock() - start;
> print("Time to select json_extrat indexed", max_count, time_spent, "records by second = ", max_count/time_spent);
>
> db.close();
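For comparison, here is a sketch of the same experiment using Python's stdlib sqlite3 module, assuming a SQLite build with the JSON functions available (the default in recent versions, whereas the quoted test may have used a third-party json_extract). The key point the quoted query plans illustrate: an expression index is only usable when the WHERE clause repeats the exact indexed expression.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE json_tbl(id INTEGER PRIMARY KEY, json TEXT)")
con.executemany(
    "INSERT INTO json_tbl(json) VALUES (?)",
    [('{"id": %d, "value": "the_value_%d"}' % (i, i),) for i in range(1, 5000)],
)

# Index the extracted value, mirroring the quoted create-index statement.
con.execute(
    "CREATE INDEX json_tbl_idx ON json_tbl(json_extract(json, '$.value'))"
)

# The WHERE clause uses the identical expression, so the planner can match
# the expression index instead of scanning and re-parsing every JSON blob.
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT json_extract(json, '$.value') FROM json_tbl "
    "WHERE json_extract(json, '$.value') = ?",
    ("the_value_1",),
).fetchall()
print(plan)  # the detail column should show a SEARCH using json_tbl_idx
```

When the plan reports a SEARCH on json_tbl_idx rather than a SCAN, json_extract is no longer called per row at query time, which is where the quoted benchmark lost most of its time.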