On 29 Oct 2015, at 8:59pm, Jason H <jhihn at gmx.com> wrote:

> Currently, I have a SQLite database of around 10gig that takes 25 minutes to 
> run a single query against no other activity (it's never queried in 
> read/write, just read). I've created indexes the best I can.

You can do better.  I have a 43 Gigabyte database and I get responses from 
queries including JOINs in milliseconds.  It has four tables and the widest 
table has five columns.

> For this specific database, we join through about 12 tables on an average 
> query (519 tables total, 319 code tables), most of which have over 2 dozen 
> columns, some over 256 columns, max 384. The longest row is 16k in total.

There's your problem.  You need to redesign your schema.  As a general rule, if 
a database has so many tables -- or a table has so many columns -- that you 
can't keep them all in your head at the same time, it's badly-designed.  256 is 
silly.
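To make the point concrete, here is a minimal sketch of the kind of redesign being suggested, using Python's sqlite3 module. The table and column names ("patient", "patient_attr", "blood_type") are hypothetical, not from the original poster's schema: the idea is to replace one very wide, sparse table with a narrow core table plus an attribute table, so a query reads short rows and only the columns it needs.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Narrow core table: only the columns nearly every query uses.
cur.execute("""
    CREATE TABLE patient (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        dob  TEXT
    )""")

# The hundreds of rarely-used columns become one row per
# (patient, attribute) pair instead.
cur.execute("""
    CREATE TABLE patient_attr (
        patient_id INTEGER NOT NULL REFERENCES patient(id),
        attr       TEXT NOT NULL,
        value      TEXT,
        PRIMARY KEY (patient_id, attr)
    )""")
cur.execute("CREATE INDEX attr_by_name ON patient_attr(attr, patient_id)")

cur.execute("INSERT INTO patient VALUES (1, 'A. Smith', '1970-01-01')")
cur.execute("INSERT INTO patient_attr VALUES (1, 'blood_type', 'O+')")

# A query now joins two narrow tables instead of scanning 16k-wide rows.
row = cur.execute("""
    SELECT p.name, a.value
    FROM patient p
    JOIN patient_attr a ON a.patient_id = p.id
    WHERE a.attr = 'blood_type'
""").fetchone()
print(row)  # ('A. Smith', 'O+')
```

Whether an attribute table or simply several narrower, purpose-specific tables is the right split depends on the actual access patterns; the win either way is that each row read from disk is small and each index covers something a query actually filters on.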

You don't need a tool.  You need a better schema.

Simon.
