Michael G Schwern wrote:

> Another route is to go the "lite" software route.  Instead of testing with
> Apache and PostgreSQL, test with HTTP::Server::Simple and SQLite.  Easier to
> install and configure.  The downside is you're not testing against your real
> production environment, so something should still test against a staging
> server.

You need to test against a staging server anyway, one to which developers have no access beyond that of the ordinary plebs.
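
FWIW, that kind of "lite" setup is only a few lines of code. Rough sketch below; HTTP::Server::Simple::CGI, DBI and DBD::SQLite are the real module names, but the widget table, the response and the port number are made up for the example:

    #!/usr/bin/perl
    # A "lite" test environment: an in-process web server plus an in-memory
    # SQLite database instead of Apache and PostgreSQL.  The widget table and
    # the response it serves are invented purely for illustration.
    use strict;
    use warnings;
    use DBI;

    {
        package TestServer;
        use base 'HTTP::Server::Simple::CGI';

        # One throwaway database per test run; :memory: vanishes on exit.
        my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                               { RaiseError => 1 });
        $dbh->do('CREATE TABLE widget (id INTEGER PRIMARY KEY, name TEXT)');
        $dbh->do(q{INSERT INTO widget (name) VALUES ('example')});

        sub handle_request {
            my ($self, $cgi) = @_;
            my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM widget');
            print "HTTP/1.0 200 OK\r\n";
            print $cgi->header('text/plain'), "widgets: $count\n";
        }
    }

    # run() blocks; start it in one window and point prove, LWP or
    # Test::WWW::Mechanize at http://localhost:8080/ from another.
    TestServer->new(8080)->run;

Point your test scripts at localhost:8080 and they never need to know that Apache and Postgres aren't there.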

> If your software isn't platform agnostic, consider something like VMWare
> images.  At this point I wave my hands like so and throw a ninja flash bomb
> *POOF!*

FWIW, Qemu is far easier to get working, although slower. That's because unlike VMware it doesn't have special device drivers for doing stuff.

But regardless, some kind of virtualisation is the solution to the OP's problem. Move developers' testing from a dedicated test server to a test server running as an application on their desktop. That test server can be set up from a standard build.

> The homogeneous environment is a seductive one.  It's simple and easy to
> maintain.  You don't have to worry about different developers getting
> different results because they're using different versions of the software.
> You know that the machine the code was tested on and the production server
> match because they're built the same way and there's only one to worry about.
> But it is an inflexible and all-or-nothing approach.
>
> Let's say you're using Perl 5.6.2.  This means EVERYONE is using 5.6.2.  Every
> developer, every system on a single version of Perl.  This means everyone is
> coding for the same bugs, quirks and undocumented features of that particular
> version of Perl.  As long as it all works on that one version nobody is
> thinking there's anything wrong.  So everyone continues to write code with
> subtle mistakes that are more and more specific to that version of Perl.

Solution: code for whatever the hell version of perl the developers have on their desktops. This will probably be something 5.8-ish because they're perl developers and like shiny new things. TEST against precisely the same build of perl as you've got in your live environment, which is also probably something 5.8-ish because you want funny foreign characters to work properly. You need to do that at some point anyway, no matter how careful you are.

It's not exactly hard to have several different builds of perl on one box and switch between them with a different shebang line, or by changing a link in the filesystem. The machine I use for testing my own stuff has 5.005_04, 5.6.2 and 5.8.8, as well as whatever the OS installed and uses for its own purposes.
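
For instance, something like this; the /usr/local/perl-5.8.8 path is just wherever that particular build happens to live on my box, nothing standard:

    #!/usr/local/perl-5.8.8/bin/perl
    # Pin a test script to one particular build via the shebang; repointing
    # the shebang (or a /usr/local/bin/perl symlink) runs the same tests
    # under another build.  The install path above is only an example.
    use strict;
    use Config;

    # A handy first line of test output: which perl actually ran this?
    print "running perl $] ($Config{archname}) from $^X\n";

    # Bail out early if this box has drifted away from what production runs.
    die "expected perl 5.008008, got $]\n" unless $] == 5.008008;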

> The answer is you don't.  You rapidly downgrade so people can get work done.
> Then maybe, if you're really dedicated, you come in after work, upgrade the
> test server and fix as much as you can, then downgrade again before anyone
> comes into work the next day.  More likely you just never upgrade.  And then
> you wake up one day to find yourself running Perl 5.5.4, MySQL 3.22 and Apache
> 1.3 all on a Redhat 7.2 box.  Deep at the bottom of a steep pit of upgrades.

Again not a problem. We have machines running stuff for old customers which run on old versions of the OS, old versions of Apache, old versions of perl, even old versions of *our* software. No need to change it because it's stable. It Just Works.

If we need to bugfix old stuff, we create a testing machine (actually a Xen instance these days) with the appropriate software (which we've archived in case it becomes unobtainium on the intertubes). If the customer wants new features then we'll decide on a case-by-case basis whether to add features using the old software, or port to a newer OS and infrastructure.

New customers get a newer OS, Apache, perl etc.  Not *the* newest of course.

--
David Cantrell
