Hello Martin,

If it was me I'd "investigate" the problem by doing the "right" thing in
the first place, by which time I'd know enough to knock up the "wrong"
solution for the doubters before presenting the "proper" solution as a
fait accompli.


That's already been done. More or less, I am now reaping the
misinterpretation of my own words and have to implement some fairly
nonsensical stuff... It seems like you're right inside my mind ;-)

If I remember correctly, there's no random access to BLOBs so all you'd
be doing is storing a chunk of data and reading the whole lot back. I
don't think that's a realistic test - the time it takes SQLite to find
the pages/data will be a tiny fraction of the time it will take to read
that data off the disk. You can't compare performance against reading
"records" out of the flat file because "they" won't let you do that. All
in all, it doesn't sound very scientific. ;)
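Just to make the point above concrete, here is a minimal sketch of what that test would amount to: the array goes in as one BLOB and comes back out whole, with any per-element slicing happening in application code after the full read. The table and column names are purely illustrative, not from this thread, and Python's sqlite3 module stands in for whatever binding is actually in use.

```python
import sqlite3
import struct

# Illustrative sketch: store a binary array as a single BLOB and read
# the whole thing back in one go. Table/column names are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE arrays (id INTEGER PRIMARY KEY, data BLOB)")

values = [1.5, 2.5, 3.5, 4.5]
# Pack doubles back-to-back, much like a record in a flat binary file.
blob = struct.pack(f"{len(values)}d", *values)
con.execute("INSERT INTO arrays (id, data) VALUES (?, ?)", (1, blob))

# Reading: SQLite hands back the entire BLOB; picking out one element
# happens in application code after the full read.
row = con.execute("SELECT data FROM arrays WHERE id = 1").fetchone()
restored = list(struct.unpack(f"{len(values)}d", row[0]))
print(restored)  # [1.5, 2.5, 3.5, 4.5]
```

So the timing comparison against the flat file ends up measuring mostly the same bulk read on both sides, which is the point being made above.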


That's absolutely correct, which is why I am so relaxed about having to
prove that elephants rarely fly. I'll be using SQLite as a file system,
as already pointed out, so the only overhead compared to reading the
flat binary file is the tiny bit of time needed to locate the record.
Unless I'm missing something, there'll be no penalty there. However,
accessing the ij-th element of each array stored in a relational
database and collecting all of these into a vector will be much faster
with a nice schema than by digging into the binary files. This is my
major use case.
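A "nice schema" for that use case could look something like the following sketch: one row per element, keyed by array id and indices, so the ij-th element of every array can be gathered with a single indexed query instead of seeking into many binary files. All names here (`elements`, `array_id`, `idx_ij`) are my own illustrative assumptions, not anything from this thread.

```python
import sqlite3

# Hypothetical element-per-row schema for gathering the (i, j)-th
# element of every stored array in one query. Names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE elements (
    array_id INTEGER, i INTEGER, j INTEGER, value REAL,
    PRIMARY KEY (array_id, i, j))""")

# Store two tiny 2x2 arrays element by element.
arrays = {1: [[1.0, 2.0], [3.0, 4.0]],
          2: [[5.0, 6.0], [7.0, 8.0]]}
for aid, arr in arrays.items():
    for i, row in enumerate(arr):
        for j, v in enumerate(row):
            con.execute("INSERT INTO elements VALUES (?, ?, ?, ?)",
                        (aid, i, j, v))

# An index on (i, j) lets the cross-array gather avoid a full scan.
con.execute("CREATE INDEX idx_ij ON elements (i, j)")

# Collect the (0, 1)-th element of every array into a vector.
vector = [v for (v,) in con.execute(
    "SELECT value FROM elements WHERE i = 0 AND j = 1 ORDER BY array_id")]
print(vector)  # [2.0, 6.0]
```

The trade-off, of course, is storage and insert overhead per element versus one seek per file at read time; whether that wins depends on how many arrays the gather touches.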

Thanks a lot for the help!!!!

dimitris
