Thanks for pointing out the obvious :)
Seriously though, there are times when probably all of us have made "just
a simple database" that was not properly normalized and later turns out
to be used far more than intended. Normalizing the database at a later
stage requires a lot of reprogramming and rewriting a lot of SQL. I could
see a use for this kind of functionality, but the best approach will
always be to normalize.
Then again, I was just curious to see whether anyone had tried or thought
about something like this before. I'm not even sure I would want this
type of functionality implemented in SQLite.
Best regards
Daniel
John Stanton wrote:
Your solution here is to normalize your database. Third normal form
will do it for you.
Daniel Önnerby wrote:
Just out of curiosity.
If I have, for instance, 1000 rows in a table with a lot of blobs, and
many of them contain the same data, is there any way to write a plugin
for SQLite that would simply store a reference to another blob when the
contents are identical? I guess this could save a lot of space without
any fancy compression algorithm, and if the blob column is already
indexed there would be no extra cost to locate the identical blobs :)
Just a thought :)
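
As an illustration, here is a minimal sketch (in Python, with invented table and column names such as blob_store, items and blob_id) of how that kind of deduplication can be done at the application level today, which is essentially the normalization John suggests: store each distinct blob once, keyed by a hash, and reference it by id from the main table.

import hashlib
import sqlite3

db = sqlite3.connect("example.db")
db.executescript("""
    -- One row per distinct blob value; the UNIQUE hash enforces a single copy.
    CREATE TABLE IF NOT EXISTS blob_store (
        id   INTEGER PRIMARY KEY,
        hash TEXT UNIQUE NOT NULL,
        data BLOB NOT NULL
    );
    -- The main table stores only a reference to the shared blob.
    CREATE TABLE IF NOT EXISTS items (
        id      INTEGER PRIMARY KEY,
        name    TEXT,
        blob_id INTEGER REFERENCES blob_store(id)
    );
""")

def insert_item(name, payload):
    # Reuse an existing blob row if an identical payload is already stored.
    digest = hashlib.sha256(payload).hexdigest()
    db.execute("INSERT OR IGNORE INTO blob_store (hash, data) VALUES (?, ?)",
               (digest, payload))
    (blob_id,) = db.execute("SELECT id FROM blob_store WHERE hash = ?",
                            (digest,)).fetchone()
    db.execute("INSERT INTO items (name, blob_id) VALUES (?, ?)", (name, blob_id))
    db.commit()

insert_item("a", b"same payload")
insert_item("b", b"same payload")   # second row references the same stored blob

Comparing hashes instead of the blobs themselves keeps the lookup cheap even for large values, and with SHA-256 an accidental collision is not a practical concern.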
John Stanton wrote:
What are you using for compression?
Have you checked that you get a useful degree of compression on that
numeric data? You might find that it is not particularly amenable
to compression.
Hickey, Larry wrote:
I have a blob structure which is primarily doubles. Does anyone have
experience with compressing data to make the blobs smaller? Tests I have
run so far indicate that compression is too slow on blobs of a few
megabytes to be practical. I currently get at least 20 to 40 inserts per
second, but if a single compression takes over a second, it's clearly not
worth the trouble. Does anybody have experience with a compression scheme
for blobs that consist mostly of arrays of doubles?
Some schemes (Ibsen) offer lightning-fast decompression, so if the
database were primarily used for reading this would be a good choice, but
the compression required to build it is very expensive.
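
For what it is worth, a rough way to check both points raised above, how well a buffer of doubles compresses and how long it takes, is a quick benchmark with zlib at different levels. A Python sketch follows; the array contents here are synthetic and repetitive, so real measurement data (noisy mantissas in particular) may compress far worse.

import struct
import time
import zlib

# Build roughly 4 MB of doubles; purely illustrative data.
values = [float(i % 1000) / 7.0 for i in range(500000)]
blob = struct.pack("<%dd" % len(values), *values)

for level in (1, 6, 9):    # 1 = fastest, 9 = best ratio
    start = time.perf_counter()
    packed = zlib.compress(blob, level)
    elapsed = time.perf_counter() - start
    print("level %d: %5.1f%% of original size in %.3f s"
          % (level, 100.0 * len(packed) / len(blob), elapsed))

# If the ratio and timing are acceptable, the compressed bytes can be bound
# as a BLOB parameter on INSERT and restored with zlib.decompress() after SELECT.

If even the fastest level is too slow for the insert rate you need, compressing in a background thread before the INSERT, or only compressing blobs above a size threshold, might be worth trying.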
-----------------------------------------------------------------------------
To unsubscribe, send email to [EMAIL PROTECTED]
-----------------------------------------------------------------------------