Please pardon the cross-posting.

A small group of us on the Performance list were discussing the first steps 
toward constructing a comprehensive PostgreSQL installation benchmarking 
tool, initially to compare different operating systems and file systems, and 
later to serve as a foundation for a tuning wizard.

To do this, we need one or more real (not randomly generated*) medium-to-large 
databases which are, or can be, BSD-licensed (data AND schema).   Each database 
must have:

1) At least one "main" table with 12+ columns and 100,000+ rows (each).
2) At least 10-12 additional tables of assorted sizes, at least half of which 
should have Foreign Key relationships to the main table(s) or each other.
3) At least one large text or varchar field among the various tables.

In addition, the following items would be helpful, but are not required:
4) Views, triggers, and functions built on the database
5) A query log of database activity to give us sample queries to work with.
6) Some complex data types, such as geometric, network, and/or custom types.
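
For concreteness, here is a rough sketch of the shape we're after; every table 
and column name below is invented purely for illustration, it is not a real 
candidate dataset:

```sql
-- Invented example only: a schema of roughly the shape described above.
CREATE TABLE customers (
    customer_id serial PRIMARY KEY,
    name        varchar(100) NOT NULL,
    notes       text                      -- large text field (criterion 3)
);

CREATE TABLE orders (                     -- a "main" table (criterion 1):
    order_id    serial PRIMARY KEY,       -- 12+ columns, 100,000+ rows
    customer_id integer NOT NULL
                REFERENCES customers,     -- Foreign Key link (criterion 2)
    order_date  date NOT NULL,
    ship_date   date,
    status      char(1),
    subtotal    numeric(10,2),
    tax         numeric(10,2),
    shipping    numeric(10,2),
    total       numeric(10,2),
    ship_addr   varchar(200),
    ship_geo    point,                    -- geometric type (item 6)
    comments    text
);
```

Plus roughly ten more tables in that vein, ideally with some views, triggers, 
and functions layered on top (item 4).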

Thanks for any leads you can give me!

(* To forestall knee-jerk responses:  Randomly generated data does not look or 
perform the same as real data in my professional opinion, and I'm the one 
writing the test scripts.)

-Josh Berkus
 Aglio Database Solutions
 San Francisco
