Frank,

>   Now that's rich.  I don't think I've ever seen a database perform
>   worse after it was normalized.  In fact, I can't even think of a
>   situation where it could!

Oh, there are some.  For example, Primer's issues with his dating 
database: it turned out that a fully normalized design resulted in very bad 
select performance because of the number of joins involved.  Of course, the 
method that did perform well was *not* a simple denormalization, either.
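The thread doesn't show Primer's actual schema, so purely as illustration (invented table and column names, sqlite3 used so the sketch is self-contained), here's the shape of the problem: in a fully normalized design, every attribute you filter on adds another join to the "find matches" query, while the flat table answers the same question with a single scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Fully normalized: each attribute lives in its own table, linked
# through junction tables.  (Hypothetical schema, not Primer's.)
cur.executescript("""
CREATE TABLE person          (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE city            (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE person_city     (person_id INTEGER, city_id INTEGER);
CREATE TABLE interest        (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE person_interest (person_id INTEGER, interest_id INTEGER);

INSERT INTO person VALUES (1, 'alice'), (2, 'bob');
INSERT INTO city   VALUES (1, 'SF');
INSERT INTO person_city VALUES (1, 1), (2, 1);
INSERT INTO interest VALUES (1, 'hiking');
INSERT INTO person_interest VALUES (1, 1), (2, 1);
""")

# One join per attribute filtered on; a profile with a dozen
# attributes means the planner juggles a dozen joins per SELECT.
normalized = """
SELECT p.name
FROM person p
JOIN person_city     pc ON pc.person_id = p.id
JOIN city            c  ON c.id  = pc.city_id
JOIN person_interest pi ON pi.person_id = p.id
JOIN interest        i  ON i.id  = pi.interest_id
WHERE c.name = 'SF' AND i.name = 'hiking'
"""

# Denormalized: same facts in one wide table, answered by one scan.
cur.executescript("""
CREATE TABLE person_flat (name TEXT, city TEXT, interest TEXT);
INSERT INTO person_flat VALUES ('alice', 'SF', 'hiking'),
                               ('bob',   'SF', 'hiking');
""")
flat = "SELECT name FROM person_flat WHERE city='SF' AND interest='hiking'"

print(sorted(r[0] for r in cur.execute(normalized)))
print(sorted(r[0] for r in cur.execute(flat)))
```

Both queries return the same rows; the difference is that the first one's cost grows with the number of joined attributes.  Which is exactly why neither "always normalize" nor "just flatten it" is the answer by itself.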

The issue with denormalization is, I think, that a lot of developers cut their 
teeth on the likes of MS Access, Sybase 2, or Informix 1.0, where a 
poorly performing join often didn't complete at all.  As a result, they got 
into the habit of "preemptive tuning"; that is, doing things "for performance 
reasons" while the system was still in the design phase, before they even knew 
what the performance issues *were*.

Not that this was a good practice even then, but the average software project 
allocates grossly inadequate time for testing, so you can see how it became a 
bad habit.  And most younger DBAs learn their skills on the job from the 
older DBAs, so the misinformation gets passed down.

-- 
Josh Berkus
Aglio Database Solutions
San Francisco
