--- Michael G Schwern <[EMAIL PROTECTED]> wrote:
> I'm about to do a sort of brute-force approach:
>
> * create and populate a database called "projectname_$user_$pid"
> * find an open port
> * write a config file with the port and database to use
> * fire up the server with that config
> * run the tests
> * shutdown the server
> * drop the database
>
> Seems a bit wasteful, so I'm wondering how other people handle it.
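For what it's worth, the "find an open port" step can be done by binding
to port 0 and letting the kernel pick. Here's a rough sketch; the db
naming and the IO::Socket::INET wiring are just my guesses at how you'd
glue it together, not anything from the original post:

  use strict;
  use warnings;
  use IO::Socket::INET;

  # Name the scratch database after the project, user and pid,
  # per the naming scheme quoted above.
  my $db_name = join '_', 'projectname', ( getpwuid($<) )[0], $$;

  # Bind to port 0 so the kernel picks a free port, remember which
  # one we got, then release it for the server to use.  (There's a
  # small race between closing the socket and starting the server,
  # but it's usually good enough for a test harness.)
  my $sock = IO::Socket::INET->new(
      LocalAddr => '127.0.0.1',
      LocalPort => 0,
      Listen    => 1,
  ) or die "Can't find a free port: $!";
  my $port = $sock->sockport;
  close $sock;

  print "create db '$db_name', start server on port $port\n";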
Just the db stuff:
When we start a new task, we branch and create a new test db named
something like $project_$branch_$user. We rarely drop it until we're
done with the branch or until we need to recreate it due to corruption.
At the beginning of each test program, we do something like this (this
isn't quite accurate):
disable foreign keys
truncate all non-static tables (faster than delete)
re-enable foreign keys
if static-tables.changed
    rebuild them
squeal like a pig
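Concretely, on MySQL with DBI that boils down to something like the
following. This is only a rough sketch; the DSN, credentials and table
names are made up, and the static-table rebuild is tracked separately:

  use strict;
  use warnings;
  use DBI;

  # Hypothetical connection details, just for illustration.
  my $dbh = DBI->connect(
      'dbi:mysql:database=myproject_mybranch_ovid',
      'test_user', 'test_pass',
      { RaiseError => 1 },
  );

  # Tables the tests are allowed to dirty.
  my @non_static = qw(orders order_items customers);

  # Turn off FK checks so truncation order doesn't matter,
  # wipe the dynamic tables, then turn the checks back on.
  $dbh->do('SET FOREIGN_KEY_CHECKS = 0');
  $dbh->do("TRUNCATE TABLE $_") for @non_static;
  $dbh->do('SET FOREIGN_KEY_CHECKS = 1');

  # Rebuilding the static tables (countries, etc.) when their
  # source data changes happens elsewhere and is omitted here.

TRUNCATE beats DELETE here because it doesn't touch individual rows;
the exact syntax for disabling FK checks varies by database.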
Depending on what you're doing, this has some benefits:
1. *Much* faster than constantly dropping/recreating the db.
2. When tests fail, the db is left with the data, which can help you
recreate the failure.
3. Much better than rolling back changes, because you're not altering
your code's behavior to suit the tests.
It can be tricky to set up the first time, though, and if you've done
something stupid, like having a static table depend on a non-static one
(FK constraint), then you deserve for it to fail :)
Static tables: things like ISO country codes.
Dynamic (non-static) tables: things like orders.
Cheers,
Ovid
--
Buy the book - http://www.oreilly.com/catalog/perlhks/
Perl and CGI - http://users.easystreet.com/ovid/cgi_course/
Personal blog - http://publius-ovidius.livejournal.com/
Tech blog - http://use.perl.org/~Ovid/journal/