David Hamill wrote:
> John wrote:
>   
>> I quite like the idea of writing the tests first.
>>     
> The problem with this is that you don't know what to test 
> for until you've written the code.
>   

My employer recently implemented a new process for requirements. The 
executive summary is that we include everyone up front to define 
functional requirements and process flows: the customer, development, 
QA, and our requirements department. The customer signs off saying "this 
is what I want to pay you to do," development signs off to say "we can 
develop this and it will take X hours of work," QA signs off to say 
"this is testable," etc.

Once that is done, we write technical designs, which are basically code 
stubs. We then write unit tests in parallel for each requirement that 
lends itself to automated testing (i.e. not GUI stuff). At this 
point we have rock-solid requirements, a code design stubbed out by 
senior developers, and test cases. Any developer can then write the code 
and prove that it will work.
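
To make that concrete, a requirement, its stub, and its test might look 
something like this (Java and JUnit here purely for illustration; the 
classes and the requirement are invented for the example, not taken 
from an actual project):

    // InterestCalculator.java -- the "technical design" stub a senior
    // developer checks in. Hypothetical requirement: compute simple
    // interest from a principal, annual rate, and term in years.
    public class InterestCalculator {
        public double simpleInterest(double principal, double rate, int years) {
            // Intentionally unimplemented; any developer can fill this in later.
            throw new UnsupportedOperationException("not yet implemented");
        }
    }

    // InterestCalculatorTest.java -- written against the stub before the
    // real implementation exists, so it fails until the code is done.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class InterestCalculatorTest {
        @Test
        public void computesSimpleInterestForOneYear() {
            InterestCalculator calc = new InterestCalculator();
            // 1000 at 5% for one year should yield 50 in interest.
            assertEquals(50.0, calc.simpleInterest(1000.0, 0.05, 1), 0.001);
        }
    }

The point is that the test encodes the signed-off requirement rather 
than the implementation, which is why it can be written before the code.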

> Maybe some much more organised people than me can specify 
> exactly what they want upfront...

When you are writing software for a customer, this is typically a 
requirement. I have worked on projects where the customer continually 
changes their requirements because they do not know what they want. 
Invariably, these are the projects that fail, where we go over our 
quoted hours and into the land of free labor.

> My personal solution is to write the tests and the code in 
> parallel, a few lines at a time. The way I exercise the code 
> is by running the tests.
>   

Despite the best of intentions, nobody can guarantee, even with solid 
requirements and design, that 100% of the test cases can be written 
ahead of time. I find myself writing more tests as I implement 
requirements: I may refactor code or find myself writing new code that 
was not obvious during the initial requirements and design phases.

-- 
John Gaughan
