On Tue, Jan 29, 2002 at 08:32:54AM +0000, Tony Bowden ([EMAIL PROTECTED]) wrote:
> The first big lesson was:
> 
>   - automate whatever review standards you can
> 
> When we first introduced this, most of the reviews were for very
> basic things: you forgot strict or warnings. You didn't untaint that
> variable. You're not following our coding standards there.

Ah, ok. So were you just grepping each file for "use strict", or what?

> So we built as much of that as possible into our build script to catch
> automatically. This hugely improved the feedback loop. A developer could
> get instant feedback on the sort of things that would get an immediate
> 'fail' from a reviewer.

Did people run that before their commits, or did they get mail after
they screwed up a commit?
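
The check itself needn't be fancy. Here's a rough sketch of the kind
of thing I imagine, in Perl (the pragma list and the exit-code policy
are my guesses, not necessarily what Tony's script does):

    #!/usr/bin/perl
    # Naive automated review check: scan the named files for
    # 'use strict' and 'use warnings' and complain if either is missing.
    use strict;
    use warnings;

    my $failed = 0;
    for my $file (@ARGV) {
        open my $fh, '<', $file or die "Can't open $file: $!";
        my $source = do { local $/; <$fh> };   # slurp the whole file
        close $fh;

        for my $pragma (qw(strict warnings)) {
            next if $source =~ /^\s*use\s+\Q$pragma\E\b/m;
            warn "$file: missing 'use $pragma'\n";
            $failed = 1;
        }
    }
    exit $failed;   # non-zero exit makes the build fail

Run that over the tree before commit, or from the build script, and
the feedback really is instant.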

> The 'customer' can also help with this in terms of acceptance testing
> etc. [At BlackStar we never really cracked the testing of web pages,
> leaving most of that for modules, but at Kasei we've developed a pretty
> good system for doing that, which I keep meaning to write up ("useful
> uses of 'local' in testing")].
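
I can guess at part of what 'local' buys you there: dynamically
scoped overrides, so a test can swap a database-hitting sub for a
canned one and have the real thing restored automatically when the
block exits. A sketch (My::Shop and product_name are made-up names,
and this is only my guess at the technique, not Kasei's actual
system):

    use strict;
    use warnings;
    use Test::More tests => 1;

    {
        # Temporarily replace the (hypothetical) database-backed sub;
        # 'local' restores the original glob when the block exits.
        local *My::Shop::product_name = sub { "Test Widget" };

        is( My::Shop::product_name(42), "Test Widget",
            "page code sees the stubbed database value" );
    }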

Your talking about this reminded me of a peeve I've had lately.
I need webpage testing that doesn't suck. In particular, I need
something like Inline::Webchat without its strange limitations. 
I also need more abstraction. I've got a lot of Apache handlers
written in Perl which talk to databases and make web-based editing
forms. I need to do things like "okay, find each link that looks
like http://host/edit/thingy/?foo=(number)&action=(something). Now
click through all those pages and run tests on each one of them."
Hand-coding all those tests could really suck, so I haven't even
tried.
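
The shape of what I want is roughly this: LWP::UserAgent plus
HTML::LinkExtor gets you the "find each link" part, and the per-page
checks below are just placeholders for whatever tests actually matter:

    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTML::LinkExtor;
    use Test::More qw(no_plan);

    my $base = 'http://host/edit/thingy/';   # made-up starting page
    my $ua   = LWP::UserAgent->new;

    my $index = $ua->get($base);
    die "Can't fetch $base: ", $index->status_line
        unless $index->is_success;

    # Collect every <a href> on the index page matching the pattern.
    my @links;
    my $extor = HTML::LinkExtor->new(
        sub {
            my ($tag, %attr) = @_;
            push @links, $attr{href}
                if $tag eq 'a'
                and defined $attr{href}
                and $attr{href} =~ m{/edit/thingy/\?foo=\d+&action=\w+};
        },
        $base,    # so relative links come back absolute
    );
    $extor->parse($index->content);
    $extor->eof;

    # "Click through" each link and run the same tests on every page.
    for my $link (@links) {
        my $page = $ua->get($link);
        ok( $page->is_success, "fetched $link" );
        like( $page->content, qr/<form/i, "$link has an edit form" );
    }

Wrap that in some recursion and configurable assertions and it starts
to look like the tool I'm wishing for.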

What I really want is something like the commercial web-testing
tools, such as Segue's SilkTest. Has anyone thought about
solving this problem?

srl
-- 
Shane Landrum  (srl AT boston DOT com)  Software Engineer, boston.com