On 1/13/16, tonyp at acm.org wrote:
> Compiling under Linux with SQLITE_OMIT_WAL I get this error:
>
> ./sqlite3.c: In function 'sqlite3PagerJrnlFile':
> ./sqlite3.c:50209:16: error: 'Pager' has no member named 'pWal'
>   return pPager->pWal ? sqlite3WalFile(pPager->pWal) : pPager->jfd;
>
Bummer
On 13 Jan 2016, at 7:36pm, Warren Young wrote:
> On Jan 13, 2016, at 5:29 AM, Simon Slavin wrote:
>
>> My only concern with what you wrote is that you mention 100 tables. In
>> order to find the table you specify SQLite has to go through a list of
>> (hashed, I think) table names, and going
On Wed, Jan 13, 2016 at 4:06 PM, Olivier Mascia wrote:
> Some kind of MVCC is very interesting to us for the purpose of running
> read transactions which would see a stable view of the data, not seeing at
> all the writes committed after the start of the read transactions. If I'm
> not misinterpr
If you are ever going to use ANALYZE on your database, and the database is
going to be opened frequently (like once per request), consider dropping the
sqlite_stat3 and sqlite_stat4 tables.
SQLite reads the content of those tables on each open. The number of tables
greatly contributes to the amount of data stored in ther
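A minimal sketch of that cleanup using Python's stdlib sqlite3 module; the table and column names here are illustrative, not from the thread:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for your on-disk database
conn.execute("CREATE TABLE t(a INTEGER PRIMARY KEY, b INTEGER)")
conn.executemany("INSERT INTO t(b) VALUES (?)", [(i % 10,) for i in range(100)])
conn.execute("CREATE INDEX t_b ON t(b)")
conn.execute("ANALYZE")  # populates sqlite_stat1 (and stat3/stat4 if enabled)

stat_tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE name LIKE 'sqlite_stat%'")]
# sqlite_stat3/sqlite_stat4 exist only when SQLite was compiled with
# SQLITE_ENABLE_STAT3 / SQLITE_ENABLE_STAT4; drop them if present.
for name in ("sqlite_stat3", "sqlite_stat4"):
    if name in stat_tables:
        conn.execute("DROP TABLE %s" % name)
```

The plain sqlite_stat1 table is small and cheap to read on open; it is the histogram data in stat3/stat4 that grows with the number of indexed tables.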
Thanks Simon & Dominique,
> On 13 Jan 2016 at 13:29, Simon Slavin wrote:
>
>> Does sqlite have to reparse the schema text often to execute the queries? Or
>> is the schema somehow translated internally to a stored, digested
>> ('compiled') format, to ease its working?
>
> The first time
On Jan 13, 2016, at 1:45 PM, Simon Slavin wrote:
>
> On 13 Jan 2016, at 7:36pm, Warren Young wrote:
>
>> Wouldn't that be log2(100) = 6.6 or ln(100) = 4.6 maximum node visits?
>>
>> Most hash table implementations have logarithmic lookup time, not linear.
>
> You're quite right.
No, not entirely
On Wed, Jan 13, 2016 at 12:43 PM, Olivier Mascia wrote:
> Is there any known structural performance issue working with a schema made
> of [many] tables, [...] foreign keys constraints, [...] indexes [...]
> primary keys and foreign keys. [...] tables would have [many] columns and
> [...] rows
>
The "pre-release snapshot" currently on the
https://www.sqlite.org/download.html page is the release candidate for
version 3.10.1. See https://www.sqlite.org/draft/index.html and
https://www.sqlite.org/draft/releaselog/3_10_1.html for further
information about this patch release.
There are minima
Hello,
Is there any known structural performance issue working with a schema made of
about 100 tables, about 80 foreign keys constraints, and some indexes in
addition to those implicit of the primary keys and foreign keys. In my book it
does not qualify as a complex schema, some tables would ha
On Jan 13, 2016, at 5:29 AM, Simon Slavin wrote:
>
> My only concern with what you wrote is that you mention 100 tables. In order
> to find the table you specify SQLite has to go through a list of (hashed, I
> think) table names, and going through an average of 50 of them per command
> can be
On 13 Jan 2016, at 11:43am, Olivier Mascia wrote:
> Is there any known structural performance issue working with a schema made of
> about 100 tables, about 80 foreign keys constraints, and some indexes in
> addition to those implicit of the primary keys and foreign keys. In my book
> it does
If it's OK to use the sqlite3 cmd line shell then try this:
-- create test input table X
create table X (what text);
insert into X values ('abc,def');
-- write X to a file
.output mydata.csv
select * from X;
.output stdout
-- read it back in to parse it
create table Y (a text, b text);
.mode csv
.import mydata.csv Y
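The same round-trip can be sketched in Python with the stdlib csv and sqlite3 modules; an in-memory buffer stands in for mydata.csv, and the table/column names mirror the shell example:

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE X (what TEXT)")
conn.execute("INSERT INTO X VALUES ('abc,def')")

# "Write X to a file" -- here a StringIO buffer plays the role of mydata.csv
buf = io.StringIO()
for (what,) in conn.execute("SELECT what FROM X"):
    buf.write(what + "\n")

# "Read it back in to parse it": let the csv module split on the commas
buf.seek(0)
conn.execute("CREATE TABLE Y (a TEXT, b TEXT)")
for row in csv.reader(buf):
    conn.execute("INSERT INTO Y VALUES (?, ?)", row)

rows = conn.execute("SELECT a, b FROM Y").fetchall()
# rows -> [('abc', 'def')]
```

The trick in both versions is the same: the delimited field is exported as-is, then re-imported in CSV mode so the delimiter does the column splitting.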
On Tue, 12 Jan 2016 21:58:01 +0100
Christian Schmitz wrote:
>
> > On 20.12.2015 at 19:12, Big Stone wrote:
> >
> > Hi All,
> >
> > To prepare for the 2016 greetings moment, here is my personal wish
> > list
>
> Unless I missed something, I may suggest
>
> * moveprev
> * movefirst
> * movelast
Hi,
On Wed, Jan 13, 2016 at 10:05 AM, Jim Morris wrote:
> Might be doable with a recursive CTE
>
> On 1/13/2016 1:22 AM, Bart Smissaert wrote:
>>
>> It probably can be done with just SQLite's built-in text functions such as
>> instr and substr,
>> although with 20 to 30 items it may get a bit messy and complex.
At 08:28 13/01/2016, you wrote:
>On Wed, Jan 13, 2016 at 2:39 AM, Simon Slavin
>wrote:
>
> > On 12 Jan 2016, at 11:56pm, Scott Hess wrote:
> >
> > > If I am writing a client that can read SQLite databases, then I
> probably
> > > don't want your database to be injecting a bunch of arbitrary PRA
You will thank yourself by using a scripting language such as Ruby, PHP, or
Python.
Is there a regex library for SQLite that could be employed?
On Tue, Jan 12, 2016 at 11:42 PM, audio muze wrote:
> I have a table of roughly 500k records with a number of fields
> containing delimited text that nee
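On the regex question: SQLite parses the REGEXP operator but ships no implementation of it; the application must register a regexp() function itself (for `X REGEXP Y`, SQLite calls regexp(Y, X), i.e. pattern first). A minimal sketch with Python's stdlib sqlite3 and re modules, using made-up sample data:

```python
import re
import sqlite3

def regexp(pattern, value):
    # Supplies the implementation SQLite's REGEXP operator lacks by default.
    return value is not None and re.search(pattern, value) is not None

conn = sqlite3.connect(":memory:")
conn.create_function("REGEXP", 2, regexp)
conn.execute("CREATE TABLE t(v TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("abc,def",), ("xyz",), (None,)])
rows = conn.execute("SELECT v FROM t WHERE v REGEXP ','").fetchall()
# rows -> [('abc,def',)]
```

Other language bindings expose the same hook (a user-defined SQL function named "regexp"), so this pattern carries over beyond Python.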
It probably can be done with just SQLite's built-in text functions such as
instr and substr,
although with 20 to 30 items it may get a bit messy and complex.
RBS
On Wed, Jan 13, 2016 at 5:42 AM, audio muze wrote:
> I have a table of roughly 500k records with a number of fields
> containing deli
On 13 Jan 2016, at 7:28am, Dominique Devienne wrote:
> You have in my opinion taken this out of context, and are assuming the
> important part is the application, and not the data (i.e. database file).
I apologise. I didn't read back down the thread before replying. Sorry.
Simon.
On 13 Jan 2016, at 5:42am, audio muze wrote:
> The number of delimited entries embedded in a
> field can vary from none to as many as 20/30. Is there an add-in I can
> compile with SQLite that provides the ability to parse a string?
What do you mean by "parse" ? Just to separate a string into i
On Wed, Jan 13, 2016 at 2:39 AM, Simon Slavin wrote:
> On 12 Jan 2016, at 11:56pm, Scott Hess wrote:
>
> > If I am writing a client that can read SQLite databases, then I probably
> > don't want your database to be injecting a bunch of arbitrary PRAGMA
> calls
> > into my client.
>
> It is, afte
On Wed, Jan 13, 2016 at 12:42 AM, Jean-Christophe Deschamps <
jcd at antichoc.net> wrote:
> At 08:28 13/01/2016, you wrote:
>
>> On Wed, Jan 13, 2016 at 2:39 AM, Simon Slavin
>> wrote:
>> > On 12 Jan 2016, at 11:56pm, Scott Hess wrote:
>> > > If I am writing a client that can read SQLite databas
I have a table of roughly 500k records with a number of fields
containing delimited text that needs to be parsed and written to
separate tables as master lists. In order to do this I need to
parse the field contents, however, I don't see any functions within
SQLite to enable that. The number of
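One hedged sketch of that workflow in application code, using Python's stdlib sqlite3 module; the table, column, and delimiter below are assumptions for illustration, not from the original post:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tracks(id INTEGER PRIMARY KEY, genres TEXT);
CREATE TABLE genre_master(name TEXT PRIMARY KEY);
INSERT INTO tracks(genres) VALUES ('rock,pop'), ('pop,jazz'), (NULL);
""")

# Pull each delimited field out of SQLite, split it in the host language,
# and write the distinct items back as a master list.
for (genres,) in conn.execute(
        "SELECT genres FROM tracks WHERE genres IS NOT NULL"):
    for name in genres.split(","):
        conn.execute("INSERT OR IGNORE INTO genre_master(name) VALUES (?)",
                     (name.strip(),))

master = [r[0] for r in
          conn.execute("SELECT name FROM genre_master ORDER BY name")]
# master -> ['jazz', 'pop', 'rock']
```

For 500k rows this loop is still fast if wrapped in a single transaction; the INSERT OR IGNORE against the PRIMARY KEY is what deduplicates the master list.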
Might be doable with a recursive CTE
On 1/13/2016 1:22 AM, Bart Smissaert wrote:
> It probably can be done with just SQLite's built-in text functions such as
> instr and substr,
> although with 20 to 30 items it may get a bit messy and complex.
>
> RBS
>
> On Wed, Jan 13, 2016 at 5:42 AM, audio mu
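A sketch of the recursive-CTE route, run here through Python's sqlite3 module for convenience (requires SQLite 3.8.3+ for WITH RECURSIVE; the table, column, and delimiter are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE X(id INTEGER PRIMARY KEY, tags TEXT)")
conn.execute("INSERT INTO X(tags) VALUES ('rock,pop,jazz')")

# Each recursive step peels the leading item off `rest` with instr/substr,
# until the remainder is empty.
rows = conn.execute("""
    WITH RECURSIVE split(id, item, rest) AS (
      SELECT id, '', tags || ',' FROM X
      UNION ALL
      SELECT id,
             substr(rest, 1, instr(rest, ',') - 1),
             substr(rest, instr(rest, ',') + 1)
      FROM split WHERE rest <> ''
    )
    SELECT id, item FROM split WHERE item <> ''
""").fetchall()
# items produced: rock, pop, jazz (each paired with the source row id)
```

This keeps the whole split inside SQL, at the cost of the instr/substr bookkeeping the earlier messages called "messy"; the trailing `|| ','` sentinel is what lets the last item terminate the recursion cleanly.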
On 12 Jan 2016, at 11:56pm, Scott Hess wrote:
> If I am writing a client that can read SQLite databases, then I probably
> don't want your database to be injecting a bunch of arbitrary PRAGMA calls
> into my client.
It is, after all, the equivalent of an autoexecute macro. And we all know how
Hi:
Not sure what the cost is to first export the fields to a flat file. Try
https://github.com/elau1004/TFR4SQLite/wiki (it may or may not fit your needs).
It was intended to query flat files with crazy delimiters, using the SQLite
query engine. If you can SELECT from a text file then you can insert th