At 6:04 PM -0300 7/5/06, Gussimulator wrote:
Now, since there's a lot of repetitive data, I thought that compressing the database would be a good idea, since, as we all know, one of the first principles of data compression is getting rid of repetitive data. So I was wondering: is this possible with SQLite, or would it be quite a pain to implement a compression scheme myself? I have worked with many compression libraries before, so that wouldn't be an issue; the issue would be integrating any of those libraries with SQLite...

First things first, what do you mean by "repetitive"?

Do you mean that there are many copies of the same data?

Perhaps a better approach is to normalize the database and just store single copies of things.

If you have tables with duplicate rows, then add a 'quantity' column and reduce to one copy of the actual data.
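As a rough sketch of that idea (the table and column names here are made up for illustration, assuming a hypothetical 'events' table whose rows repeat exactly):

  CREATE TABLE events_dedup (
    event_type TEXT,
    event_data TEXT,
    quantity   INTEGER NOT NULL DEFAULT 1
  );

  -- collapse identical rows into one row plus a count
  INSERT INTO events_dedup (event_type, event_data, quantity)
    SELECT event_type, event_data, COUNT(*)
    FROM events
    GROUP BY event_type, event_data;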

If some columns are unique and others are repeated, perhaps try splitting the table into several smaller, related tables.
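For example, something along these lines (again, the names are hypothetical, assuming an 'old_log' table whose 'message' column repeats long strings): store each distinct message once and have the log rows reference it by id.

  -- each distinct message stored once
  CREATE TABLE messages (
    id   INTEGER PRIMARY KEY,
    text TEXT UNIQUE
  );

  -- log rows reference a message by id instead of repeating the text
  CREATE TABLE log (
    id         INTEGER PRIMARY KEY,
    logged_at  TEXT,
    message_id INTEGER REFERENCES messages(id)
  );

  INSERT OR IGNORE INTO messages (text) SELECT message FROM old_log;
  INSERT INTO log (logged_at, message_id)
    SELECT o.logged_at, m.id
    FROM old_log o JOIN messages m ON m.text = o.message;

Once the repeated text is factored out like this, the database usually shrinks considerably without any compression layer at all.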

This, really, is what you should be doing first, and may very well be the only step you need.

If you can't do that, then please explain in what way the data is repetitive.

-- Darren Duncan
