Hi, all:
SUMMARY:
A blob of only about 131 KB seems to cause a segfault when read back through the data/sqlite addon.
DETAILS:
I have been tormented by occasional segfaults when using sqlite from J. The good
news is that any case that crashes is fully reproducible: once I find one, I can
trigger it every time.
Here's how I reproduce the crash:
I create the following table:
CREATE TABLE data (name string, shortname string, stage integer, enc blob);
Then I insert one row into it with this code (copied from the example that came
with data/sqlite):
require 'data/sqlite'
db=: 'psqlite'conew~ './whatever.sqlite'
fin=. 1!:21 < '/tmp/longstring'
s=. 1!:1 < fin
1!:22 fin
('name';'name';100;s) apply__db 'insert into data values (?,?,?,?);'
The content of /tmp/longstring is 131038 bytes long. Its entire text content can
be found here: http://pastebin.com/raw.php?i=tMBbfcRQ . It's nothing special;
just readable text characters.
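In case the pastebin link ever goes stale, a stand-in file of the same size can
be generated directly in J. This is only a sketch of my own; it produces 131038
printable characters, not the exact bytes from the pastebin:

NB. hypothetical stand-in for /tmp/longstring: 131038 printable
NB. ASCII characters (space through tilde), cycled to length
chars=. a. {~ 32 + i. 95
(131038 $ chars) 1!:2 < '/tmp/longstring'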
Later, when I attempt to read that one row from sqlite, I get a segfault:
strquery__db 'select enc from data;'
I've experimented for a while to pin down the actual cause. I'm certain it's not
simply the length, because I've had no trouble with strings of many megabytes.
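In case it helps to narrow down whether the problem lies in the stored row or in
reading the blob back into J, a lighter query against the same column through the
same interface would look like this (assuming this simpler select does not itself
hit the crash; the length should come back as 131038 if the insert stored the
whole string):

NB. sanity check: inspect the stored column without fetching the blob itself
strquery__db 'select length(enc), typeof(enc) from data;'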
Specs:
I am running 64-bit Ubuntu 11.10 (Oneiric) - it also fails on Ubuntu Maverick
J701 - it also fails on J602
the default sqlite3 library that came with Oneiric - it also fails with sqlite 3.7.10
Can anyone help? I haven't seen this sort of problem while looking through the
jsoftware archives.
--- Sarino Suon